Sample records for non-randomly distributed locations-exemplified

  1. Non-random distribution and co-localization of purine/pyrimidine-encoded information and transcriptional regulatory domains.

    PubMed

    Povinelli, C M

    1992-01-01

    In order to detect sequence-based information predictive for the location of eukaryotic transcriptional regulatory domains, the frequencies and distributions of the 36 possible purine/pyrimidine reverse complement hexamer pairs were determined for test sets of real and random sequences. The distribution of one of the hexamer pairs (RRYYRR/YYRRYY, referred to as M1) was further examined in a larger set of sequences (> 32 genes, 230 kb). Predominant clusters of M1 and the locations of eukaryotic transcriptional regulatory domains were found to be associated and non-randomly distributed along the DNA, consistent with a periodicity of approximately 1.2 kb. In the context of higher-order chromatin, this would align promoters, enhancers and the predominant clusters of M1 longitudinally along one face of a 30 nm fiber. Using only information about the distribution of the M1 motif, 50-70% of a sequence could be eliminated as being unlikely to contain transcriptional regulatory domains, with an 87% recovery of the regulatory domains present.
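
    The M1 motif is defined purely on the purine/pyrimidine alphabet, so a sequence can be recoded to R/Y symbols and scanned directly. A minimal Python sketch of such a scan is given below; the clustering and periodicity analysis of the paper go well beyond this, and the demo sequence is made up.

      import re

      PURINES = set("AG")

      def ry_encode(seq):
          """Recode a DNA sequence into the purine (R) / pyrimidine (Y) alphabet."""
          return "".join("R" if base in PURINES else "Y" for base in seq.upper())

      def m1_positions(seq):
          """Return 0-based start positions of M1 hexamers (RRYYRR or YYRRYY)."""
          ry = ry_encode(seq)
          # Lookahead keeps overlapping matches.
          return [m.start() for m in re.finditer(r"(?=(RRYYRR|YYRRYY))", ry)]

      if __name__ == "__main__":
          demo = "GGCCAAGGTTAAGACTGACCTTGG"   # made-up sequence
          print(m1_positions(demo))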

  2. Interpolating Non-Parametric Distributions of Hourly Rainfall Intensities Using Random Mixing

    NASA Astrophysics Data System (ADS)

    Mosthaf, Tobias; Bárdossy, András; Hörning, Sebastian

    2015-04-01

    The correct spatial interpolation of hourly rainfall intensity distributions is of great importance for stochastic rainfall models. Poorly interpolated distributions may lead to over- or underestimation of rainfall and consequently to erroneous estimates in downstream applications, such as hydrological or hydraulic models. By analyzing the spatial relation of empirical rainfall distribution functions, a persistent order of the quantile values over a wide range of non-exceedance probabilities is observed. As the order remains similar, the interpolation weights of quantile values for a given non-exceedance probability can be applied to the other probabilities. This assumption enables the use of kernel smoothed distribution functions for interpolation purposes. Comparing the order of hourly quantile values over different gauges with the order of their daily quantile values for equal probabilities results in high correlations. The hourly quantile values also show high correlations with elevation. The incorporation of these two covariates into the interpolation is therefore tested. As only positive interpolation weights for the quantile values ensure a monotonically increasing distribution function, the use of geostatistical methods like kriging is problematic. Employing kriging with external drift to incorporate secondary information is not applicable. Nonetheless, it would be fruitful to make use of covariates. To overcome this shortcoming, a new random mixing approach of spatial random fields is applied. Within the mixing process, hourly quantile values are considered as equality constraints and correlations with elevation values are included as relationship constraints. To profit from the dependence of daily quantile values, distribution functions of daily gauges are used to set up lower-equal and greater-equal constraints at their locations. In this way, the denser daily gauge network can be included in the interpolation of the hourly distribution functions.

  3. Modeling Linguistic Variables With Regression Models: Addressing Non-Gaussian Distributions, Non-independent Observations, and Non-linear Predictors With Random Effects and Generalized Additive Models for Location, Scale, and Shape

    PubMed Central

    Coupé, Christophe

    2018-01-01

    As statistical approaches are increasingly used in linguistics, attention must be paid to the choice of methods and algorithms used. This is especially true since they require assumptions to be satisfied to provide valid results, and because scientific articles still often fall short of reporting whether such assumptions are met. Progress is, however, being made in various directions, one of them being the introduction of techniques able to model data that cannot be properly analyzed with simpler linear regression models. We report recent advances in statistical modeling in linguistics. We first describe linear mixed-effects regression models (LMM), which address grouping of observations, and generalized linear mixed-effects models (GLMM), which offer a family of distributions for the dependent variable. Generalized additive models (GAM) are then introduced, which allow modeling non-linear parametric or non-parametric relationships between the dependent variable and the predictors. We then highlight the possibilities offered by generalized additive models for location, scale, and shape (GAMLSS). We explain how they make it possible to go beyond common distributions, such as Gaussian or Poisson, and offer the appropriate inferential framework to account for ‘difficult’ variables such as count data with strong overdispersion. We also demonstrate how they offer interesting perspectives on data when not only the mean of the dependent variable is modeled, but also its variance, skewness, and kurtosis. As an illustration, the case of phonemic inventory size is analyzed throughout the article. For over 1,500 languages, we consider as predictors the number of speakers, the distance from Africa, an estimation of the intensity of language contact, and linguistic relationships. We discuss the use of random effects to account for genealogical relationships, the choice of appropriate distributions to model count data, and non-linear relationships. Relying on GAMLSS
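
    GAMLSS itself is usually fitted with the R package gamlss; purely to illustrate the overdispersion problem raised above, the hedged Python sketch below compares a Poisson and a negative binomial GLM on simulated counts with statsmodels. All data and parameter values are assumptions, and this is not a GAMLSS fit.

      # Rough illustration (not GAMLSS): comparing Poisson and negative binomial
      # fits for overdispersed count data with statsmodels. Data are simulated.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)
      mu = np.exp(1.0 + 0.5 * x)
      # Negative binomial draws with mean mu: overdispersed relative to Poisson.
      y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

      X = sm.add_constant(x)
      poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
      negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

      print("Poisson deviance/df:", poisson_fit.deviance / poisson_fit.df_resid)
      print("NegBin  deviance/df:", negbin_fit.deviance / negbin_fit.df_resid)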

  4. Modeling Linguistic Variables With Regression Models: Addressing Non-Gaussian Distributions, Non-independent Observations, and Non-linear Predictors With Random Effects and Generalized Additive Models for Location, Scale, and Shape.

    PubMed

    Coupé, Christophe

    2018-01-01

    As statistical approaches are increasingly used in linguistics, attention must be paid to the choice of methods and algorithms used. This is especially true since they require assumptions to be satisfied to provide valid results, and because scientific articles still often fall short of reporting whether such assumptions are met. Progress is, however, being made in various directions, one of them being the introduction of techniques able to model data that cannot be properly analyzed with simpler linear regression models. We report recent advances in statistical modeling in linguistics. We first describe linear mixed-effects regression models (LMM), which address grouping of observations, and generalized linear mixed-effects models (GLMM), which offer a family of distributions for the dependent variable. Generalized additive models (GAM) are then introduced, which allow modeling non-linear parametric or non-parametric relationships between the dependent variable and the predictors. We then highlight the possibilities offered by generalized additive models for location, scale, and shape (GAMLSS). We explain how they make it possible to go beyond common distributions, such as Gaussian or Poisson, and offer the appropriate inferential framework to account for 'difficult' variables such as count data with strong overdispersion. We also demonstrate how they offer interesting perspectives on data when not only the mean of the dependent variable is modeled, but also its variance, skewness, and kurtosis. As an illustration, the case of phonemic inventory size is analyzed throughout the article. For over 1,500 languages, we consider as predictors the number of speakers, the distance from Africa, an estimation of the intensity of language contact, and linguistic relationships. We discuss the use of random effects to account for genealogical relationships, the choice of appropriate distributions to model count data, and non-linear relationships. Relying on GAMLSS, we

  5. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
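
    As a rough illustration of the approach described above (not the original transformation method), the Python sketch below draws discrete load histories from several non-Gaussian distributions and tabulates simple 3-point peak statistics; all distribution parameters are assumed.

      # Sketch: draw a discrete random load history from a chosen non-Gaussian
      # distribution and tabulate its peak statistics. Parameters are illustrative.
      import numpy as np

      rng = np.random.default_rng(1)

      def load_history(dist, size=10000):
          if dist == "lognormal":
              return rng.lognormal(mean=0.0, sigma=0.5, size=size)
          if dist == "weibull":
              return rng.weibull(a=1.5, size=size)
          if dist == "exponential":
              return rng.exponential(scale=1.0, size=size)
          if dist == "poisson":
              return rng.poisson(lam=4.0, size=size).astype(float)
          raise ValueError(dist)

      def peaks(x):
          """Local maxima of a discrete load history (simple 3-point definition)."""
          return x[1:-1][(x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])]

      for dist in ("lognormal", "weibull", "exponential", "poisson"):
          p = peaks(load_history(dist))
          print(f"{dist:12s} n_peaks={p.size:5d} mean_peak={p.mean():.2f}")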

  6. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which assumptions about ignition locations affect the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.
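
    The contrast between random and non-random ignitions can be mimicked with a few lines of Python: uniform sampling of ignition cells versus sampling weighted by an assumed ignition-density surface. The grid, the density surface and the number of fires below are invented placeholders, not values from the study.

      # Uniform versus density-weighted ignition locations on a toy grid.
      import numpy as np

      rng = np.random.default_rng(2)
      nx, ny, n_fires = 100, 100, 1000

      # Hypothetical ignition-density surface: higher weight near one corner.
      xx, yy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
      density = np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / (2 * 25.0 ** 2))
      density /= density.sum()

      # Random (uniform) ignitions versus non-random (weighted) ignitions.
      uniform_idx = rng.integers(0, nx * ny, size=n_fires)
      weighted_idx = rng.choice(nx * ny, size=n_fires, p=density.ravel())

      print("uniform  mean row:", np.unravel_index(uniform_idx, (nx, ny))[0].mean())
      print("weighted mean row:", np.unravel_index(weighted_idx, (nx, ny))[0].mean())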

  7. Modeling and statistical analysis of non-Gaussian random fields with heavy-tailed distributions.

    PubMed

    Nezhadhaghighi, Mohsen Ghasemi; Nakhlband, Abbas

    2017-04-01

    In this paper, we investigate and develop an alternative approach to the numerical analysis and characterization of random fluctuations with a heavy-tailed probability distribution function (PDF), such as turbulent heat flow and solar flare fluctuations. We identify the heavy-tailed random fluctuations based on the scaling properties of the tail exponent of the PDF, the power-law growth of the qth-order correlation function, and the self-similar properties of the contour lines in two-dimensional random fields. Moreover, this work leads to a substitution for the fractional Edwards-Wilkinson (EW) equation that works in the presence of μ-stable Lévy noise. Our proposed model explains the configuration dynamics of systems with heavy-tailed correlated random fluctuations. We also present an alternative solution to the fractional EW equation in the presence of μ-stable Lévy noise in the steady state, which is implemented numerically, using the μ-stable fractional Lévy motion. Based on the analysis of the self-similar properties of contour loops, we numerically show that the scaling properties of contour loop ensembles can qualitatively and quantitatively distinguish non-Gaussian random fields from Gaussian random fluctuations.
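
    As a hedged illustration of the heavy-tailed noise at the core of the abstract, the sketch below draws alpha-stable (Lévy) samples with scipy and checks that the empirical tail decays roughly like a power law with the chosen stability index; the parameter values are assumptions.

      # Generate alpha-stable Levy noise and estimate its tail exponent.
      import numpy as np
      from scipy.stats import levy_stable

      alpha, beta = 1.5, 0.0          # stability and skewness parameters (assumed)
      noise = levy_stable.rvs(alpha, beta, size=100000, random_state=0)

      # Empirical tail: P(|eta| > x) should decay roughly like x**(-alpha).
      x = np.logspace(0.3, 1.5, 10)
      tail = [(np.abs(noise) > xi).mean() for xi in x]
      slope = np.polyfit(np.log(x), np.log(tail), 1)[0]
      print("fitted tail exponent ~", -slope, "(stability index was", alpha, ")")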

  8. A polymer, random walk model for the size-distribution of large DNA fragments after high linear energy transfer radiation

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.

    2000-01-01

    DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from > 100 Mbp down to < 0.01 Mbp, is modeled using computer simulations and analytic equations. A random-walk, coarse-grained polymer model for chromatin is combined with a simple track structure model in Monte Carlo software called DNAbreak and is applied to data on alpha-particle irradiation of V-79 cells. The chromatin model neglects molecular details but systematically incorporates an increase in average spatial separation between two DNA loci as the number of base-pairs between the loci increases. Fragment-size distributions obtained using DNAbreak match data on large fragments about as well as distributions previously obtained with a less mechanistic approach. Dose-response relations, linear at small doses of high linear energy transfer (LET) radiation, are obtained. They are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping or close juxtaposition, along one chromosome, for different DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small. The DNAbreak results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.
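
    For comparison with the RLC formalism mentioned above, the random-breakage ("broken-stick") reference model is easy to simulate: breaks are placed uniformly at random along a chromosome and fragment sizes are collected. The sketch below uses invented parameter values and is not the DNAbreak code.

      # Random-breakage model: uniformly placed breaks, fragment-size statistics.
      import numpy as np

      rng = np.random.default_rng(3)
      genome_mbp = 100.0       # chromosome length in Mbp (assumed)
      mean_breaks = 20         # mean number of DSBs per chromosome (assumed)

      fragments = []
      for _ in range(2000):                        # simulated cells
          k = rng.poisson(mean_breaks)
          cuts = np.sort(rng.uniform(0.0, genome_mbp, size=k))
          edges = np.concatenate(([0.0], cuts, [genome_mbp]))
          fragments.extend(np.diff(edges))

      fragments = np.array(fragments)
      print("mean fragment size (Mbp):", fragments.mean())
      print("fraction of fragments < 1 Mbp:", (fragments < 1.0).mean())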

  9. Randomness versus specifics for word-frequency distributions

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyong; Minnhagen, Petter

    2016-02-01

    The text-length dependence of real word-frequency distributions can be connected to the general properties of a random book. It is pointed out that this finding has strong implications when deciding between two conceptually different views on word-frequency distributions, i.e. the specific 'Zipf's-view' and the non-specific 'Randomness-view', as is discussed. It is also noticed that the text-length transformation of a random book does have an exact scaling property precisely for the power-law index γ = 1, as opposed to the Zipf's exponent γ = 2, and the implication of this exact scaling property is discussed. However, a real text has γ > 1 and as a consequence γ increases when shortening a real text. The connections to the predictions from the RGF (Random Group Formation) and to the infinite-length limit of a meta-book are also discussed. The difference between 'curve-fitting' and 'predicting' word-frequency distributions is stressed. It is pointed out that the question of randomness versus specifics for the distribution of outcomes in the case of sufficiently complex systems has a much wider relevance than just the word-frequency example analyzed in the present work.
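
    A minimal sketch of estimating the power-law index γ of a word-frequency distribution P(k) ~ k^(-γ) is shown below; the "corpus" is synthetic (numpy's Zipf sampler), and the naive log-log fit stands in for the more careful RGF analysis of the paper.

      # Estimate gamma from frequency-of-frequency counts of synthetic word counts.
      from collections import Counter
      import numpy as np

      rng = np.random.default_rng(4)
      word_counts = rng.zipf(a=2.0, size=5000)        # synthetic per-word counts k

      freq_of_freq = Counter(word_counts)             # k -> number of words with count k
      k = np.array(sorted(freq_of_freq))
      nk = np.array([freq_of_freq[i] for i in k], float)

      # Fit only the low-k part, where counts are dense enough for a naive fit.
      mask = k <= 20
      gamma = -np.polyfit(np.log(k[mask]), np.log(nk[mask]), 1)[0]
      print("estimated gamma (input was 2.0):", gamma)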

  10. Non-random distribution of DNA double-strand breaks induced by particle irradiation

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Cooper, P. K.; Rydberg, B.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    Induction of DNA double-strand breaks (dsbs) in mammalian cells is dependent on the spatial distribution of energy deposition from the ionizing radiation. For high LET particle radiations the primary ionization sites occur in a correlated manner along the track of the particles, while for X-rays these sites are much more randomly distributed throughout the volume of the cell. It can therefore be expected that the distribution of dsbs linearly along the DNA molecule also varies with the type of radiation and the ionization density. Using pulsed-field gel and conventional gel techniques, we measured the size distribution of DNA molecules from irradiated human fibroblasts in the total range of 0.1 kbp-10 Mbp for X-rays and high LET particles (N ions, 97 keV/microns and Fe ions, 150 keV/microns). On a mega base pair scale we applied conventional pulsed-field gel electrophoresis techniques such as measurement of the fraction of DNA released from the well (FAR) and measurement of breakage within a specific NotI restriction fragment (hybridization assay). The induction rate for widely spaced breaks was found to decrease with LET. However, when the entire distribution of radiation-induced fragments was analysed, we detected an excess of fragments with sizes below about 200 kbp for the particles compared with X-irradiation. X-rays are thus more effective than high LET radiations in producing large DNA fragments but less effective in the production of smaller fragments. We determined the total induction rate of dsbs for the three radiations based on a quantitative analysis of all the measured radiation-induced fragments and found that the high LET particles were more efficient than X-rays at inducing dsbs, indicating an increasing total efficiency with LET. Conventional assays that are based only on the measurement of large fragments are therefore misleading when determining total dsb induction rates of high LET particles. The possible biological significance of this non-randomness

  11. Weighted Scaling in Non-growth Random Networks

    NASA Astrophysics Data System (ADS)

    Chen, Guang; Yang, Xu-Hua; Xu, Xin-Li

    2012-09-01

    We propose a weighted model to explain the self-organizing formation of the scale-free phenomenon in non-growth random networks. In this model, we use multiple-edges to represent the connections between vertices and define the weight of a multiple-edge as the total weight of all single-edges within it and the strength of a vertex as the sum of weights of the multiple-edges attached to it. The network evolves according to a vertex strength preferential selection mechanism. During the evolution process, the network keeps its total number of vertices and its total number of single-edges constant. We show analytically and numerically that a network will form steady scale-free distributions with our model. The results show that a weighted non-growth random network can evolve into a scale-free state. Interestingly, the network also develops an exponential edge-weight distribution; that is, a scale-free distribution and an exponential distribution coexist.
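
    One plausible reading of the evolution rule described above, kept deliberately simple, is sketched below: the numbers of vertices and single-edges stay fixed while edges are repeatedly re-placed between vertices chosen with probability proportional to their current strength. This is an illustrative assumption, not the authors' exact model.

      # Non-growth, strength-preferential rewiring with fixed vertex/edge totals.
      import numpy as np

      rng = np.random.default_rng(5)
      n_vertices, n_edges, n_steps = 200, 2000, 50000

      # edges[i] = (u, v); strength[v] = number of edge endpoints at v (unit weights).
      edges = [tuple(rng.integers(0, n_vertices, size=2)) for _ in range(n_edges)]
      strength = np.zeros(n_vertices)
      for u, v in edges:
          strength[u] += 1
          strength[v] += 1

      for _ in range(n_steps):
          i = rng.integers(0, n_edges)
          u, v = edges[i]
          strength[u] -= 1
          strength[v] -= 1
          p = (strength + 1) / (strength + 1).sum()   # +1 keeps empty vertices reachable
          a, b = rng.choice(n_vertices, size=2, p=p)
          edges[i] = (a, b)
          strength[a] += 1
          strength[b] += 1

      print("max strength:", strength.max(), " mean strength:", strength.mean())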

  12. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Horz, Friedrich; Westphal, Andrew J.; Gainsforth, Zack; Borg, Janet; Djouadi, Zahia; Bridges, John; Franchi, Ian; Brownlee, Donald E.; Cheng, Andrew F.; Clark, Benton C.

    2007-01-01

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  13. Practice Location Characteristics of Non-Traditional Dental Practices.

    PubMed

    Solomon, Eric S; Jones, Daniel L

    2016-04-01

    Current and future dental school graduates are increasingly likely to choose a non-traditional dental practice (a group practice managed by a dental service organization or a corporate practice with employed dentists) for their initial practice experience. In addition, the growth of non-traditional practices, which are located primarily in major urban areas, could accelerate the movement of dentists to those areas and contribute to geographic disparities in the distribution of dental services. To help the profession understand the implications of these developments, the aim of this study was to compare the location characteristics of non-traditional practices and traditional dental practices. After identifying non-traditional practices across the United States, the authors located those practices and traditional dental practices geographically by zip code. Non-traditional dental practices were found to represent about 3.1% of all dental practices, but they had a greater impact on the marketplace with almost twice the average number of staff and annual revenue. Virtually all non-traditional dental practices were located in zip codes that also had a traditional dental practice. Zip codes with non-traditional practices had significant differences from zip codes with only a traditional dental practice: the populations in areas with non-traditional practices had higher income levels and higher education and were slightly younger and proportionally more Hispanic; those practices also had a much higher likelihood of being located in a major metropolitan area. Dental educators and leaders need to understand the impact of these trends in the practice environment in order to both prepare graduates for practice and make decisions about planning for the workforce of the future.

  14. Deterministic walks with inverse-square power-law scaling are an emergent property of predators that use chemotaxis to locate randomly distributed prey

    NASA Astrophysics Data System (ADS)

    Reynolds, A. M.

    2008-07-01

    The results of numerical simulations indicate that deterministic walks with inverse-square power-law scaling are a robust emergent property of predators that use chemotaxis to locate randomly and sparsely distributed stationary prey items. It is suggested that chemotactic destructive foraging accounts for the apparent Lévy flight movement patterns of Oxyrrhis marina microzooplankton in still water containing prey items. This challenges the view that these organisms are executing an innate optimal Lévy flight searching strategy. Crucial for the emergence of inverse-square power-law scaling is the tendency of chemotaxis to occasionally cause predators to miss the nearest prey item, an occurrence which would not arise if prey were located through the employment of a reliable cognitive map or if prey location were visually cued and perfect.

  15. Competitive Facility Location with Fuzzy Random Demands

    NASA Astrophysics Data System (ADS)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2010-10-01

    This paper proposes a new location problem for competitive facilities, e.g. shops, with uncertainty and vagueness in the demands for the facilities in a plane. By representing the demands for facilities as fuzzy random variables, the location problem can be formulated as a fuzzy random programming problem. To solve the fuzzy random programming problem, the α-level sets of fuzzy numbers are first used to transform it into a stochastic programming problem; secondly, by using their expectations and variances, it can be reformulated as a deterministic programming problem. After showing that one of its optimal solutions can be found by solving 0-1 programming problems, a solution method is proposed by improving the tabu search algorithm with strategic oscillation. The efficiency of the proposed method is shown by applying it to numerical examples of facility location problems.

  16. Efficiency of the human observer for detecting a Gaussian signal at a known location in non-Gaussian distributed lumpy backgrounds.

    PubMed

    Park, Subok; Gallas, Bradon D; Badano, Aldo; Petrick, Nicholas A; Myers, Kyle J

    2007-04-01

    A previous study [J. Opt. Soc. Am. A22, 3 (2005)] has shown that human efficiency for detecting a Gaussian signal at a known location in non-Gaussian distributed lumpy backgrounds is approximately 4%. This human efficiency is much less than the reported 40% efficiency that has been documented for Gaussian-distributed lumpy backgrounds [J. Opt. Soc. Am. A16, 694 (1999) and J. Opt. Soc. Am. A18, 473 (2001)]. We conducted a psychophysical study with a number of changes, specifically in display-device calibration and data scaling, from the design of the aforementioned study. Human efficiency relative to the ideal observer was found again to be approximately 5%. Our variance analysis indicates that neither scaling nor display made a statistically significant difference in human performance for the task. We conclude that the non-Gaussian distributed lumpy background is a major factor in our low human-efficiency results.
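
    For readers unfamiliar with lumpy backgrounds, the sketch below generates a generic one: a Poisson number of Gaussian lumps dropped at uniformly random image locations. The parameter values are assumptions, and the paper's non-Gaussian lumpy background differs in detail (for example in how lump amplitudes are drawn).

      # Generic lumpy background: Poisson number of Gaussian lumps at random sites.
      import numpy as np

      rng = np.random.default_rng(6)
      size, mean_lumps, lump_sigma, amplitude = 64, 50, 3.0, 1.0

      yy, xx = np.mgrid[0:size, 0:size]
      background = np.zeros((size, size))
      for _ in range(rng.poisson(mean_lumps)):
          cx, cy = rng.uniform(0, size, size=2)
          background += amplitude * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2)
                                           / (2 * lump_sigma ** 2))

      print("background mean/std:", background.mean(), background.std())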

  17. Simulating Pre-Asymptotic, Non-Fickian Transport Although Doing Simple Random Walks - Supported By Empirical Pore-Scale Velocity Distributions and Memory Effects

    NASA Astrophysics Data System (ADS)

    Most, S.; Jia, N.; Bijeljic, B.; Nowak, W.

    2016-12-01

    Pre-asymptotic characteristics are almost ubiquitous when analyzing solute transport processes in porous media. These pre-asymptotic aspects are caused by spatial coherence in the velocity field and by its heterogeneity. For the Lagrangian perspective of particle displacements, the causes of pre-asymptotic, non-Fickian transport are skewed velocity distribution, statistical dependencies between subsequent increments of particle positions (memory) and dependence between the x, y and z-components of particle increments. Valid simulation frameworks should account for these factors. We propose a particle tracking random walk (PTRW) simulation technique that can use empirical pore-space velocity distributions as input, enforces memory between subsequent random walk steps, and considers cross dependence. Thus, it is able to simulate pre-asymptotic non-Fickian transport phenomena. Our PTRW framework contains an advection/dispersion term plus a diffusion term. The advection/dispersion term produces time-series of particle increments from the velocity CDFs. These time series are equipped with memory by enforcing that the CDF values of subsequent velocities change only slightly. The latter is achieved through a random walk on the axis of CDF values between 0 and 1. The virtual diffusion coefficient for that random walk is our only fitting parameter. Cross-dependence can be enforced by constraining the random walk to certain combinations of CDF values between the three velocity components in x, y and z. We will show that this modelling framework is capable of simulating non-Fickian transport by comparison with a pore-scale transport simulation and we analyze the approach to asymptotic behavior.
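
    The memory mechanism described above can be sketched compactly: each particle carries a CDF value in [0, 1] that itself performs a small random walk, and velocities are read off an empirical quantile function. In the sketch below the "empirical" velocity distribution is a placeholder lognormal sample and the step parameters are assumptions.

      # Particle tracking random walk with memory enforced on the CDF axis.
      import numpy as np

      rng = np.random.default_rng(7)

      # Placeholder "empirical" pore-scale velocity sample (skewed, lognormal).
      velocity_sample = np.sort(rng.lognormal(mean=-1.0, sigma=1.0, size=10000))

      def velocity_from_cdf(u):
          """Map CDF values in [0, 1] to velocities via the empirical quantiles."""
          return np.quantile(velocity_sample, u)

      n_particles, n_steps, dt = 500, 200, 1.0
      sigma_u = 0.05            # virtual diffusion on the CDF axis (fitting parameter)

      u = rng.uniform(0, 1, n_particles)             # initial CDF values
      x = np.zeros(n_particles)
      for _ in range(n_steps):
          x += velocity_from_cdf(u) * dt             # advective displacement
          u += sigma_u * rng.normal(size=n_particles)
          u = np.abs(u) % 2                          # reflect at 0 ...
          u = np.where(u > 1, 2 - u, u)              # ... and at 1

      print("mean and variance of particle positions:", x.mean(), x.var())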

  18. SSRscanner: a program for reporting distribution and exact location of simple sequence repeats.

    PubMed

    Anwar, Tamanna; Khan, Asad U

    2006-02-20

    Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker assisted selection of crop plants and a range of molecular ecology and diversity studies. These repeated DNA sequences are found in both prokaryotes and eukaryotes. They are distributed almost at random throughout the genome, ranging from mononucleotide to trinucleotide repeats. They are also found as longer tracts (> 6 repeating units). Most of the computer programs that find SSRs do not report their exact positions. The computer program SSRscanner was written to report the distribution, frequency and exact location of each SSR in a genome. SSRscanner is user friendly. It can search repeats of any length and produce outputs with their exact position on the chromosome and their frequency of occurrence in the sequence. This program has been written in PERL and is freely available for non-commercial users by request from the authors. Please contact the authors by E-mail: huzzi99@hotmail.com.
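
    A hedged re-implementation sketch of the basic scan (not the authors' Perl code) is given below: it finds mono- to trinucleotide repeats with more than six repeating units and reports their exact 0-based positions and unit counts.

      # Find simple sequence repeats and report exact positions.
      import re

      def find_ssrs(seq, min_units=7, max_motif=3):
          seq = seq.upper()
          hits = []
          for motif_len in range(1, max_motif + 1):
              pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (motif_len, min_units - 1))
              for m in pattern.finditer(seq):
                  hits.append((m.start(), m.group(1), len(m.group(0)) // motif_len))
          return sorted(hits)   # (position, motif, number of repeating units)

      if __name__ == "__main__":
          print(find_ssrs("GGATATATATATATATCCAAAAAAAAGG"))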

  19. SSRscanner: a program for reporting distribution and exact location of simple sequence repeats

    PubMed Central

    Anwar, Tamanna; Khan, Asad U

    2006-01-01

    Simple sequence repeats (SSRs) have become important molecular markers for a broad range of applications, such as genome mapping and characterization, phenotype mapping, marker assisted selection of crop plants and a range of molecular ecology and diversity studies. These repeated DNA sequences are found in both prokaryotes and eukaryotes. They are distributed almost at random throughout the genome, ranging from mononucleotide to trinucleotide repeats. They are also found as longer tracts (> 6 repeating units). Most of the computer programs that find SSRs do not report their exact positions. The computer program SSRscanner was written to report the distribution, frequency and exact location of each SSR in a genome. SSRscanner is user friendly. It can search repeats of any length and produce outputs with their exact position on the chromosome and their frequency of occurrence in the sequence. Availability: This program has been written in PERL and is freely available for non-commercial users by request from the authors. Please contact the authors by E-mail: huzzi99@hotmail.com PMID:17597863

  20. Source Localization in Wireless Sensor Networks with Randomly Distributed Elements under Multipath Propagation Conditions

    DTIC Science & Technology

    2009-03-01

    Engineer's thesis by Georgios Tsivgoulis, March 2009: Source Localization in Wireless Sensor Networks with Randomly Distributed Elements under Multipath Propagation Conditions. 111 pages. Subject terms: Wireless Sensor Network, Direction of Arrival, DOA, Random … The excerpt also mentions use of the non-line-of-sight information.

  1. Probability distributions for Markov chain based quantum walks

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we show by example that the limiting distribution of the quantum walk is not necessarily identical to the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  2. Competitive Facility Location with Random Demands

    NASA Astrophysics Data System (ADS)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2009-10-01

    This paper proposes a new location problem for competitive facilities, e.g. shops and stores, with uncertain demands in the plane. By representing the demands for facilities as random variables, the location problem is formulated as a stochastic programming problem, and for finding its solution, three deterministic programming problems are considered: an expectation maximizing problem, a probability maximizing problem, and a satisfying-level maximizing problem. After showing that one of their optimal solutions can be found by solving 0-1 programming problems, a solution method is proposed by improving the tabu search algorithm with strategic oscillation. The efficiency of the solution method is shown by applying it to numerical examples of facility location problems.

  3. Smart intimation and location of faults in distribution system

    NASA Astrophysics Data System (ADS)

    Hari Krishna, K.; Srinivasa Rao, B.

    2018-04-01

    Locating faults in the distribution system is one of the most complicated problems we face today. Identifying the location and severity of a fault within a short time is required to provide a continuous power supply, but fault identification and transferring that information to the operator are the biggest challenges in the distribution network. This paper proposes a fault location method for the distribution system based on an Arduino Nano and a GSM module with a flame sensor. The main idea is to locate the fault in the distribution transformer by sensing the arc coming out of the fuse element. The biggest challenge in the distribution network is to identify the location and the severity of faults under different conditions. Well-operated transmission and distribution systems play a key role in an uninterrupted power supply. Whenever a fault occurs in the distribution system, the time taken to locate and eliminate it has to be reduced. The proposed design was implemented with a flame sensor and a GSM module. Under fault conditions, the system automatically sends an alert message to the operator in the distribution system about the abnormal conditions near the transformer, the site code and its exact location for possible power restoration.

  4. The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity.

    PubMed

    Rights, Jason D; Sterba, Sonya K

    2016-11-01

    Multilevel data structures are common in the social sciences. Often, such nested data are analysed with multilevel models (MLMs) in which heterogeneity between clusters is modelled by continuously distributed random intercepts and/or slopes. Alternatively, the non-parametric multilevel regression mixture model (NPMM) can accommodate the same nested data structures through discrete latent class variation. The purpose of this article is to delineate analytic relationships between NPMM and MLM parameters that are useful for understanding the indirect interpretation of the NPMM as a non-parametric approximation of the MLM, with relaxed distributional assumptions. We define how seven standard and non-standard MLM specifications can be indirectly approximated by particular NPMM specifications. We provide formulas showing how the NPMM can serve as an approximation of the MLM in terms of intraclass correlation, random coefficient means and (co)variances, heteroscedasticity of residuals at level 1, and heteroscedasticity of residuals at level 2. Further, we discuss how these relationships can be useful in practice. The specific relationships are illustrated with simulated graphical demonstrations, and direct and indirect interpretations of NPMM classes are contrasted. We provide an R function to aid in implementing and visualizing an indirect interpretation of NPMM classes. An empirical example is presented and future directions are discussed. © 2016 The British Psychological Society.
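
    For intuition, one relation of the kind derived in the paper can be sketched as follows (this is a generic expression for a discrete random intercept, not necessarily the authors' exact formula): if the approximating NPMM assigns intercepts \mu_k to classes with proportions \pi_k and the level-1 residual variance is \sigma^2, the implied intraclass correlation is

      \mathrm{ICC} \;\approx\; \frac{\sum_k \pi_k \mu_k^{2} - \bigl(\sum_k \pi_k \mu_k\bigr)^{2}}{\sum_k \pi_k \mu_k^{2} - \bigl(\sum_k \pi_k \mu_k\bigr)^{2} + \sigma^{2}},

    i.e. the between-class variance of the class means divided by the total variance.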

  5. Stable and efficient retrospective 4D-MRI using non-uniformly distributed quasi-random numbers

    NASA Astrophysics Data System (ADS)

    Breuer, Kathrin; Meyer, Cord B.; Breuer, Felix A.; Richter, Anne; Exner, Florian; Weng, Andreas M.; Ströhle, Serge; Polat, Bülent; Jakob, Peter M.; Sauer, Otto A.; Flentje, Michael; Weick, Stefan

    2018-04-01

    The purpose of this work is the development of a robust and reliable three-dimensional (3D) Cartesian imaging technique for fast and flexible retrospective 4D abdominal MRI during free breathing. To this end, a non-uniform quasi-random (NU-QR) reordering of the phase encoding (k_y–k_z) lines was incorporated into 3D Cartesian acquisition. The proposed sampling scheme allocates more phase encoding points near the k-space origin while reducing the sampling density in the outer part of the k-space. Respiratory self-gating in combination with SPIRiT-reconstruction is used for the reconstruction of abdominal data sets in different respiratory phases (4D-MRI). Six volunteers and three patients were examined at 1.5 T during free breathing. Additionally, data sets with conventional two-dimensional (2D) linear and 2D quasi-random phase encoding order were acquired for the volunteers for comparison. A quantitative evaluation of image quality versus scan times (from 70 s to 626 s) for the given sampling schemes was obtained by calculating the normalized mutual information (NMI) for all volunteers. Motion estimation was accomplished by calculating the maximum derivative of a signal intensity profile of a transition (e.g. tumor or diaphragm). The 2D non-uniform quasi-random distribution of phase encoding lines in Cartesian 3D MRI yields more efficient undersampling patterns for parallel imaging compared to conventional uniform quasi-random and linear sampling. Median NMI values of NU-QR sampling are the highest for all scan times. Therefore, within the same scan time 4D imaging could be performed with improved image quality. The proposed method allows for the reconstruction of motion artifact reduced 4D data sets with isotropic spatial resolution of 2.1 × 2.1 × 2.1 mm³ in a short scan time, e.g. 10 respiratory phases in only 3 min. Cranio-caudal tumor displacements between 23 and 46 mm could be observed. NU-QR sampling enables for stable 4D
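
    The sampling idea can be illustrated with a quasi-random generator and a density warp: the hedged sketch below draws 2D Halton points and pushes them toward the centre of the k_y–k_z plane with a Gaussian quantile transform. Matrix sizes and the warp width are assumptions, not the sequence parameters of the paper.

      # Non-uniform quasi-random phase-encoding pattern from warped Halton points.
      import numpy as np
      from scipy.stats import norm, qmc

      n_lines, n_ky, n_kz = 2000, 128, 64

      halton = qmc.Halton(d=2, seed=0)
      u = halton.random(n_lines)                     # quasi-random, uniform in [0, 1)^2

      # Warp toward the centre: Gaussian quantile function centred at 0.5
      # (the width 0.15 is an assumed tuning parameter).
      warped = np.clip(norm.ppf(u, loc=0.5, scale=0.15), 0.0, 1.0)

      ky = np.minimum((warped[:, 0] * n_ky).astype(int), n_ky - 1)
      kz = np.minimum((warped[:, 1] * n_kz).astype(int), n_kz - 1)

      central = (np.abs(ky - n_ky // 2) < n_ky // 4) & (np.abs(kz - n_kz // 2) < n_kz // 4)
      print("fraction of phase-encoding lines in the central quarter:", central.mean())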

  6. Privacy-Preserving Location-Based Query Using Location Indexes and Parallel Searching in Distributed Networks

    PubMed Central

    Liu, Lei; Zhao, Jing

    2014-01-01

    An efficient location-based query algorithm for protecting the privacy of the user in distributed networks is given. This algorithm utilizes the location indexes of the users and multiple parallel threads to quickly search for and select all the candidate anonymous sets with more users and more uniformly distributed location information, accelerating the execution of the temporal-spatial anonymous operations, and it allows the users to configure their custom-made privacy-preserving location query requests. The simulated experiment results show that the proposed algorithm can simultaneously offer location query services for more users, improve the performance of the anonymous server, and satisfy the anonymous location requests of the users. PMID:24790579

  7. Privacy-preserving location-based query using location indexes and parallel searching in distributed networks.

    PubMed

    Zhong, Cheng; Liu, Lei; Zhao, Jing

    2014-01-01

    An efficient location-based query algorithm for protecting the privacy of the user in distributed networks is given. This algorithm utilizes the location indexes of the users and multiple parallel threads to quickly search for and select all the candidate anonymous sets with more users and more uniformly distributed location information, accelerating the execution of the temporal-spatial anonymous operations, and it allows the users to configure their custom-made privacy-preserving location query requests. The simulated experiment results show that the proposed algorithm can simultaneously offer location query services for more users, improve the performance of the anonymous server, and satisfy the anonymous location requests of the users.

  8. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  9. Cluster analysis for determining distribution center location

    NASA Astrophysics Data System (ADS)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determining the location of distribution facilities is highly important for surviving the high level of competition in today’s business world. Companies can operate multiple distribution centers to mitigate supply chain risk. Thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is included in the category of the top 5 fast-food restaurant chains based on retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. Cluster analysis results are used to consider the location of the additional distribution center. Network analysis results show a more efficient process, referring to a shorter distance in the distribution process.
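
    The clustering stage can be illustrated with a standard k-means run on outlet coordinates, as in the hedged sketch below; the coordinates, the number of clusters and the outlet count are invented, and the study's network analysis on road distances is omitted.

      # k-means on (made-up) outlet coordinates to propose distribution-centre sites.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(8)
      outlets = rng.uniform(low=[106.6, -6.4], high=[107.0, -6.1], size=(80, 2))  # lon, lat

      kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(outlets)
      for i, centre in enumerate(kmeans.cluster_centers_):
          print(f"candidate DC {i}: lon={centre[0]:.3f}, lat={centre[1]:.3f}, "
                f"outlets served={np.sum(kmeans.labels_ == i)}")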

  10. The Role of Experience in Location Estimation: Target Distributions Shift Location Memory Biases

    ERIC Educational Resources Information Center

    Lipinski, John; Simmering, Vanessa R.; Johnson, Jeffrey S.; Spencer, John P.

    2010-01-01

    Research based on the Category Adjustment model concluded that the spatial distribution of target locations does not influence location estimation responses [Huttenlocher, J., Hedges, L., Corrigan, B., & Crawford, L. E. (2004). Spatial categories and the estimation of location. "Cognition, 93", 75-97]. This conflicts with earlier results showing…

  11. The role of experience in location estimation: Target distributions shift location memory biases.

    PubMed

    Lipinski, John; Simmering, Vanessa R; Johnson, Jeffrey S; Spencer, John P

    2010-04-01

    Research based on the Category Adjustment model concluded that the spatial distribution of target locations does not influence location estimation responses [Huttenlocher, J., Hedges, L., Corrigan, B., & Crawford, L. E. (2004). Spatial categories and the estimation of location. Cognition, 93, 75-97]. This conflicts with earlier results showing that location estimation is biased relative to the spatial distribution of targets [Spencer, J. P., & Hund, A. M. (2002). Prototypes and particulars: Geometric and experience-dependent spatial categories. Journal of Experimental Psychology: General, 131, 16-37]. Here, we resolve this controversy by using a task based on Huttenlocher et al. (Experiment 4) with minor modifications to enhance our ability to detect experience-dependent effects. Results after the first block of trials replicate the pattern reported in Huttenlocher et al. After additional experience, however, participants showed biases that significantly shifted according to the target distributions. These results are consistent with the Dynamic Field Theory, an alternative theory of spatial cognition that integrates long-term memory traces across trials relative to the perceived structure of the task space. Copyright 2009 Elsevier B.V. All rights reserved.

  12. Protection of Location Privacy Based on Distributed Collaborative Recommendations

    PubMed Central

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause the disclosure of users’ locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendation of the neighboring users’ location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server by constructing a k-anonymous data set with the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users’ location information profiles and used generalization and encryption to ensure the safety of the users’ location information privacy. Finally, we used a real location data set to carry out theoretical and experimental analyses. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services and better protecting the user’s location privacy. PMID:27649308

  13. Protection of Location Privacy Based on Distributed Collaborative Recommendations.

    PubMed

    Wang, Peng; Yang, Jing; Zhang, Jian-Pei

    2016-01-01

    In the existing centralized location services system structure, the server is easily attacked and becomes the communication bottleneck, which can cause the disclosure of users' locations. To address this, we present a new distributed collaborative recommendation strategy based on a distributed system. In this strategy, each node establishes a profile of its own location information. When a request for location services appears, the user can obtain the corresponding location services according to the recommendation of the neighboring users' location information profiles. If no suitable recommended location service results are obtained, then the user can send a service request to the server by constructing a k-anonymous data set with the centroid position of the neighbors. In this strategy, we designed a new model of distributed collaborative recommendation location service based on the users' location information profiles and used generalization and encryption to ensure the safety of the users' location information privacy. Finally, we used a real location data set to carry out theoretical and experimental analyses. The results show that the strategy proposed in this paper is capable of reducing the frequency of access to the location server, providing better location services and better protecting the user's location privacy.

  14. Chromatin position in human HepG2 cells: although being non-random, significantly changed in daughter cells.

    PubMed

    Cvacková, Zuzana; Masata, Martin; Stanĕk, David; Fidlerová, Helena; Raska, Ivan

    2009-02-01

    Mammalian chromosomes occupy chromosome territories within the nuclear space, the positions of which are generally accepted as non-random. However, it is still controversial whether the position of chromosome territories/chromatin is maintained in daughter cells. We addressed this issue and investigated the maintenance of various chromatin regions of unknown composition as well as of nucleolus-associated chromatin, a significant part of which is composed of nucleolus organizer region-bearing chromosomes. The photoconvertible histone H4-Dendra2 was used to label such regions in transfected HepG2 cells, and its position was followed up to the next interphase. The distribution of labeled chromatin in daughter cells exhibited a non-random character. However, in the vast majority of daughter cells its distribution differed extensively from the original one, and the labeled nucleolus-associated chromatin relocated to the vicinity of different nucleoli. Therefore, our results were not consistent with a concept of preserved chromatin position. This conclusion was supported by the finding that the numbers of nucleoli significantly differed between the two daughter cells. Our results support a view that while transfected daughter HepG2 cells maintain some features of the parental cell chromosome organization, there is also a significant stochastic component associated with reassortment of chromosome territories/chromatin that results in their positional rearrangements.

  15. Online location of a break in water distribution systems

    NASA Astrophysics Data System (ADS)

    Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei

    2003-08-01

    Breaks often occur in urban water distribution systems under severely cold weather, or due to pipe corrosion, ground deformation, etc., and they cannot easily be located, especially immediately after the event. This paper develops a methodology to locate a break in a water distribution system by monitoring water pressure online at selected nodes of the system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can be used. A neural network-based inverse analysis method is constructed for locating the break based on the variation of water pressure. The neural network is trained using analytically simulated data from the water distribution system, and validated using a set of data that have never been used in the training. It is found that the methodology provides a quick, effective, and practical way to locate a break in a water distribution system.
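
    A hedged sketch of the inverse-analysis idea is given below: a small neural network is trained to map monitored node pressures to the index of the broken pipe. The training data are synthetic stand-ins for the hydraulic simulations used in the paper, and all sizes are assumptions.

      # Train a classifier mapping node pressures to a (synthetic) break location.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(9)
      n_monitored, n_pipes, n_samples = 8, 20, 4000

      # Synthetic "simulation" data: each break produces a characteristic pressure
      # pattern at the monitored nodes, plus measurement noise.
      patterns = rng.uniform(0.0, 1.0, size=(n_pipes, n_monitored))
      labels = rng.integers(0, n_pipes, size=n_samples)
      pressures = patterns[labels] + 0.05 * rng.normal(size=(n_samples, n_monitored))

      X_tr, X_te, y_tr, y_te = train_test_split(pressures, labels, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out break-location accuracy:", clf.score(X_te, y_te))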

  16. Place field assembly distribution encodes preferred locations

    PubMed Central

    Mamad, Omar; Stumpp, Lars; McNamara, Harold M.; Ramakrishnan, Charu; Deisseroth, Karl; Reilly, Richard B.

    2017-01-01

    The hippocampus is the main locus of episodic memory formation and the neurons there encode the spatial map of the environment. Hippocampal place cells represent location, but their role in the learning of preferential location remains unclear. The hippocampus may encode locations independently from the stimuli and events that are associated with these locations. We have discovered a unique population code for the experience-dependent value of the context. The degree of reward-driven navigation preference highly correlates with the spatial distribution of the place fields recorded in the CA1 region of the hippocampus. We show place field clustering towards rewarded locations. Optogenetic manipulation of the ventral tegmental area demonstrates that the experience-dependent place field assembly distribution is directed by tegmental dopaminergic activity. The ability of the place cells to remap parallels the acquisition of reward context. Our findings present key evidence that the hippocampal neurons are not merely mapping the static environment but also store the concurrent context reward value, enabling episodic memory for past experience to support future adaptive behavior. PMID:28898248

  17. All optical mode controllable Er-doped random fiber laser with distributed Bragg gratings.

    PubMed

    Zhang, W L; Ma, R; Tang, C H; Rao, Y J; Zeng, X P; Yang, Z J; Wang, Z N; Gong, Y; Wang, Y S

    2015-07-01

    An all-optical method to control the lasing modes of Er-doped random fiber lasers (RFLs) is proposed and demonstrated. In the RFL, an Er-doped fiber (EDF) recorded with randomly separated fiber Bragg gratings (FBGs) is used as the gain medium and randomly distributed reflectors, as well as the controllable element. By combining random feedback of the FBG array and Fresnel feedback of a cleaved fiber end, multi-mode coherent random lasing is obtained with a threshold of 14 mW and a power efficiency of 14.4%. Moreover, a laterally-injected control light is used to induce local gain perturbation, providing additional gain for certain random resonance modes. As a result, active mode selection of the RFL is realized by changing the locations of the laser cavity that are exposed to the control light.

  18. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. The Bayesian method involves two distributions: the prior and the posterior. The posterior distribution is influenced by the selection of the prior distribution. Jeffreys’ prior distribution is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys’ prior distribution is combined with the sample information, resulting in the posterior distribution. The posterior distribution is used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys’ prior distribution. Based on the results and discussion, the parameter estimates of β and Σ are obtained as the expected values of the corresponding marginal posterior distributions. The marginal posterior distributions for β and Σ are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals that are difficult to evaluate analytically. Therefore, an approach is needed that generates random samples according to the posterior distribution of each parameter, using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
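
    A minimal Gibbs-sampling sketch for this kind of model is given below, assuming the standard conditional forms under a Jeffreys prior: B | Σ is matrix normal around the least-squares estimate and Σ | B is inverse Wishart with the residual cross-product as scale. The simulated data, degrees of freedom and chain length are illustrative assumptions.

      # Gibbs sampler for Y = X B + E with a non-informative (Jeffreys-type) prior.
      import numpy as np
      from scipy.stats import invwishart

      rng = np.random.default_rng(10)
      n, p, m = 200, 3, 2                       # observations, predictors, responses

      X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
      B_true = rng.normal(size=(p, m))
      Y = X @ B_true + rng.normal(scale=0.5, size=(n, m))

      XtX_inv = np.linalg.inv(X.T @ X)
      B_hat = XtX_inv @ X.T @ Y

      Sigma = np.eye(m)
      draws = []
      for it in range(2000):
          # B | Sigma, Y ~ matrix normal(B_hat, (X'X)^-1, Sigma)
          chol_row = np.linalg.cholesky(XtX_inv)
          chol_col = np.linalg.cholesky(Sigma)
          B = B_hat + chol_row @ rng.normal(size=(p, m)) @ chol_col.T
          # Sigma | B, Y ~ inverse Wishart(df = n, scale = residual cross-product)
          resid = Y - X @ B
          Sigma = invwishart.rvs(df=n, scale=resid.T @ resid, random_state=rng)
          if it >= 500:                         # discard burn-in
              draws.append(B)

      print("posterior mean of B:\n", np.mean(draws, axis=0))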

  19. Laser absorption of carbon fiber reinforced polymer with randomly distributed carbon fibers

    NASA Astrophysics Data System (ADS)

    Hu, Jun; Xu, Hebing; Li, Chao

    2018-03-01

    Laser processing of carbon fiber reinforced polymer (CFRP) is a non-traditional machining method which has many prospective applications. The laser absorption characteristics of CFRP are analyzed in this paper. A ray tracing model describing the interaction of the laser spot with CFRP is established. The material model contains randomly distributed carbon fibers which are generated using an improved carbon fiber placement method. It was found that CFRP has good laser absorption due to multiple reflections of the light rays in the material’s microstructure. The randomly distributed carbon fibers make the absorptivity of the light rays change randomly in the laser spot. Meanwhile, the average absorptivity fluctuation is obvious during movement of the laser. The experimental measurements agree well with the values predicted by the ray tracing model.

  20. Steinhaus’ Geometric Location Problem for Random Samples in the Plane.

    DTIC Science & Technology

    1982-05-11

    Steinhaus’ Geometric Location Problem for Random Samples in the Plane, by Dorit Hochbaum and J. Michael Steele. Technical report, Department of Statistics, Stanford University, Stanford, California. I. Introduction: The work of H. Steinhaus was apparently the first explicit …

  1. Solute location in a nanoconfined liquid depends on charge distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Jacob A.; Thompson, Ward H., E-mail: wthompson@ku.edu

    2015-07-28

    Nanostructured materials that can confine liquids have attracted increasing attention for their diverse properties and potential applications. Yet, significant gaps remain in our fundamental understanding of such nanoconfined liquids. Using replica exchange molecular dynamics simulations of a nanoscale, hydroxyl-terminated silica pore system, we determine how the locations explored by a coumarin 153 (C153) solute in ethanol depend on its charge distribution, which can be changed through a charge transfer electronic excitation. The solute position change is driven by the internal energy, which favors C153 at the pore surface compared to the pore interior, but less so for the more polar, excited-state molecule. This is attributed to more favorable non-specific solvation of the large dipole moment excited-state C153 by ethanol at the expense of hydrogen-bonding with the pore. It is shown that a change in molecule location resulting from shifts in the charge distribution is a general result, though how the solute position changes will depend upon the specific system. This has important implications for interpreting measurements and designing applications of mesoporous materials.

  2. All-Direction Random Routing for Source-Location Privacy Protecting against Parasitic Sensor Networks.

    PubMed

    Wang, Na; Zeng, Jiwen

    2017-03-17

    Wireless sensor networks are deployed to monitor the surrounding physical environments and they also act as the physical environments of parasitic sensor networks, whose purpose is analyzing the contextual privacy and obtaining valuable information from the original wireless sensor networks. Recently, contextual privacy issues associated with wireless communication in open spaces have not been thoroughly addressed and one of the most important challenges is protecting the source locations of the valuable packages. In this paper, we design an all-direction random routing algorithm (ARR) for source-location protecting against parasitic sensor networks. For each package, the routing process of ARR is divided into three stages, i.e., selecting a proper agent node, delivering the package to the agent node from the source node, and sending it to the final destination from the agent node. In ARR, the agent nodes are randomly chosen in all directions by the source nodes using only local decisions, rather than knowing the whole topology of the networks. ARR can control the distributions of the routing paths in a very flexible way and it can guarantee that the routing paths with the same source and destination are totally different from each other. Therefore, it is extremely difficult for the parasitic sensor nodes to trace the packages back to the source nodes. Simulation results illustrate that ARR perfectly confuses the parasitic nodes and obviously outperforms traditional routing-based schemes in protecting source-location privacy, with a marginal increase in the communication overhead and energy consumption. In addition, ARR also requires much less energy than the cloud-based source-location privacy protection schemes.

  3. All-Direction Random Routing for Source-Location Privacy Protecting against Parasitic Sensor Networks

    PubMed Central

    Wang, Na; Zeng, Jiwen

    2017-01-01

    Wireless sensor networks are deployed to monitor the surrounding physical environments and they also act as the physical environments of parasitic sensor networks, whose purpose is analyzing the contextual privacy and obtaining valuable information from the original wireless sensor networks. Recently, contextual privacy issues associated with wireless communication in open spaces have not been thoroughly addressed and one of the most important challenges is protecting the source locations of the valuable packages. In this paper, we design an all-direction random routing algorithm (ARR) for source-location protecting against parasitic sensor networks. For each package, the routing process of ARR is divided into three stages, i.e., selecting a proper agent node, delivering the package to the agent node from the source node, and sending it to the final destination from the agent node. In ARR, the agent nodes are randomly chosen in all directions by the source nodes using only local decisions, rather than knowing the whole topology of the networks. ARR can control the distributions of the routing paths in a very flexible way and it can guarantee that the routing paths with the same source and destination are totally different from each other. Therefore, it is extremely difficult for the parasitic sensor nodes to trace the packages back to the source nodes. Simulation results illustrate that ARR perfectly confuses the parasitic nodes and obviously outperforms traditional routing-based schemes in protecting source-location privacy, with a marginal increase in the communication overhead and energy consumption. In addition, ARR also requires much less energy than the cloud-based source-location privacy protection schemes. PMID:28304367

  4. Non-random dispersal in the butterfly Maniola jurtina: implications for metapopulation models.

    PubMed Central

    Conradt, L; Bodsworth, E J; Roper, T J; Thomas, C D

    2000-01-01

    The dispersal patterns of animals are important in metapopulation ecology because they affect the dynamics and survival of populations. Theoretical models assume random dispersal but little is known in practice about the dispersal behaviour of individual animals or the strategy by which dispersers locate distant habitat patches. In the present study, we released individual meadow brown butterflies (Maniola jurtina) in a non-habitat and investigated their ability to return to a suitable habitat. The results provided three reasons for supposing that meadow brown butterflies do not seek habitat by means of random flight. First, when released within the range of their normal dispersal distances, the butterflies orientated towards suitable habitat at a higher rate than expected at random. Second, when released at larger distances from their habitat, they used a non-random, systematic, search strategy in which they flew in loops around the release point and returned periodically to it. Third, butterflies returned to a familiar habitat patch rather than a non-familiar one when given a choice. If dispersers actively orientate towards or search systematically for distant habitat, this may be problematic for existing metapopulation models, including models of the evolution of dispersal rates in metapopulations. PMID:11007325

  5. Work distributions for random sudden quantum quenches

    NASA Astrophysics Data System (ADS)

    Łobejko, Marcin; Łuczka, Jerzy; Talkner, Peter

    2017-05-01

    The statistics of work performed on a system by a sudden random quench is investigated. Considering systems with finite dimensional Hilbert spaces, we model a sudden random quench by randomly choosing elements from a Gaussian unitary ensemble (GUE) consisting of Hermitian matrices with identically Gaussian-distributed matrix elements. A probability density function (pdf) of work in terms of initial and final energy distributions is derived and evaluated for a two-level system. Explicit results are obtained for quenches with a sharply given initial Hamiltonian, while the work pdfs for quenches between Hamiltonians from two independent GUEs can only be determined in explicit form in the limits of zero and infinite temperature. The same work distribution as for a sudden random quench is obtained for an adiabatic, i.e., infinitely slow, protocol connecting the same initial and final Hamiltonians.
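
    A numerical sketch of the scenario described above (not the paper's analytical derivation): a two-level system prepared thermally in a fixed H0 and suddenly quenched to a random 2x2 GUE Hamiltonian, with work sampled by two-point energy measurements. The inverse temperature and H0 below are assumptions.

    ```python
    # Monte Carlo estimate of the work distribution for a sudden GUE quench
    # of a two-level system; parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    beta = 1.0                                  # inverse temperature (assumed)
    H0 = np.diag([-0.5, 0.5])                   # fixed initial Hamiltonian (assumed)
    E0, V0 = np.linalg.eigh(H0)
    p0 = np.exp(-beta * E0); p0 /= p0.sum()     # initial Gibbs populations

    works = []
    for _ in range(50_000):
        A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        H1 = (A + A.conj().T) / 2               # 2x2 GUE matrix
        E1, V1 = np.linalg.eigh(H1)
        i = rng.choice(2, p=p0)                 # first energy measurement
        overlaps = np.abs(V1.conj().T @ V0[:, i])**2
        f = rng.choice(2, p=overlaps)           # second measurement after the quench
        works.append(E1[f] - E0[i])

    works = np.array(works)
    print("mean work:", works.mean(), " variance:", works.var())
    ```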

  6. Random Distribution Pattern and Non-adaptivity of Genome Size in a Highly Variable Population of Festuca pallens

    PubMed Central

    Šmarda, Petr; Bureš, Petr; Horová, Lucie

    2007-01-01

    Background and Aims The spatial and statistical distribution of genome sizes and the adaptivity of genome size to some types of habitat, vegetation or microclimatic conditions were investigated in a tetraploid population of Festuca pallens. The population was previously documented to vary highly in genome size and is assumed as a model for the study of the initial stages of genome size differentiation. Methods Using DAPI flow cytometry, samples were measured repeatedly with diploid Festuca pallens as the internal standard. Altogether 172 plants from 57 plots (2·25 m2), distributed in contrasting habitats over the whole locality in South Moravia, Czech Republic, were sampled. The differences in DNA content were confirmed by the double peaks of simultaneously measured samples. Key Results At maximum, a 1·115-fold difference in genome size was observed. The statistical distribution of genome sizes was found to be continuous and best fits the extreme (Gumbel) distribution with rare occurrences of extremely large genomes (positive-skewed), as it is similar for the log-normal distribution of the whole Angiosperms. Even plants from the same plot frequently varied considerably in genome size and the spatial distribution of genome sizes was generally random and unautocorrelated (P > 0·05). The observed spatial pattern and the overall lack of correlations of genome size with recognized vegetation types or microclimatic conditions indicate the absence of ecological adaptivity of genome size in the studied population. Conclusions These experimental data on intraspecific genome size variability in Festuca pallens argue for the absence of natural selection and the selective non-significance of genome size in the initial stages of genome size differentiation, and corroborate the current hypothetical model of genome size evolution in Angiosperms (Bennetzen et al., 2005, Annals of Botany 95: 127–132). PMID:17565968

  7. Contact Time in Random Walk and Random Waypoint: Dichotomy in Tail Distribution

    NASA Astrophysics Data System (ADS)

    Zhao, Chen; Sichitiu, Mihail L.

    Contact time (or link duration) is a fundamental factor that affects performance in Mobile Ad Hoc Networks. Previous theoretical analyses of the contact time distribution for random walk models (RW) assume that contact events can be modeled as either consecutive random walks or direct traversals, two extreme cases of random walk that lead to two different conclusions. In this paper we conduct a comprehensive study of this topic in the hope of bridging the gap between the two extremes. The two extreme cases yield a power-law or an exponential tail in the contact time distribution, respectively. However, we show that the actual distribution varies between the two extremes: a power-law-sub-exponential dichotomy, whose transition point depends on the average flight duration. Through simulation results we show that this conclusion also applies to random waypoint.

  8. A location-based multiple point statistics method: modelling the reservoir with non-stationary characteristics

    NASA Astrophysics Data System (ADS)

    Yin, Yanshu; Feng, Wenjie

    2017-12-01

    In this paper, a location-based multiple point statistics method is developed to model a non-stationary reservoir. The proposed method characterizes the relationship between the sedimentary pattern and the deposit location using the relative central position distance function, which alleviates the requirement that the training image and the simulated grids have the same dimension. The weights in every direction of the distance function can be changed to characterize the reservoir heterogeneity in various directions. The local integral replacements of data events, structured random path, distance tolerance and multi-grid strategy are applied to reproduce the sedimentary patterns and obtain a more realistic result. This method is compared with the traditional Snesim method using a synthesized 3-D training image of Poyang Lake and a reservoir model of Shengli Oilfield in China. The results indicate that the new method can reproduce the non-stationary characteristics better than the traditional method and is more suitable for simulation of delta-front deposits. These results show that the new method is a powerful tool for modelling a reservoir with non-stationary characteristics.

  9. Joint location, inventory, and preservation decisions for non-instantaneous deterioration items under delay in payments

    NASA Astrophysics Data System (ADS)

    Tsao, Yu-Chung

    2016-02-01

    This study models a joint location, inventory and preservation decision-making problem for non-instantaneous deteriorating items under delay in payments. An outside supplier provides a credit period to the wholesaler, which has a distribution system with distribution centres (DCs). Non-instantaneous deterioration means that no deterioration occurs in the earlier stage, which is very useful for items such as fresh food and fruits. This paper also considers that the deterioration rate will decrease and the preservation cost will increase as the preservation effort increases. Therefore, how much preservation effort should be made is a crucial decision. The objective of this paper is to determine the optimal locations and number of DCs, the optimal replenishment cycle time at DCs, and the optimal preservation effort simultaneously, such that the total network profit is maximised. The problem is formulated as piecewise nonlinear functions and has three different cases. Algorithms based on piecewise nonlinear optimisation are provided to solve the joint location and inventory problem for all cases. Computational analysis illustrates the solution procedures and the impacts of the related parameters on decisions and profits. The results of this study can serve as references for business managers or administrators.

  10. Distribution of blood types in a sample of 245 New Zealand non-purebred cats.

    PubMed

    Cattin, R P

    2016-05-01

    To determine the distribution of feline blood types in a sample of non-pedigree, domestic cats in New Zealand, whether a difference exists in this distribution between domestic short haired and domestic long haired cats, and between the North and South Islands of New Zealand; and to calculate the risk of a random blood transfusion causing a severe transfusion reaction, and the risk of a random mating producing kittens susceptible to neonatal isoerythrolysis. The results of 245 blood typing tests in non-pedigree cats performed at the New Zealand Veterinary Pathology (NZVP) and Gribbles Veterinary Pathology laboratories between the beginning of 2009 and the end of 2014 were retrospectively collated and analysed. Cats that were identified as domestic short or long haired were included. For the cats tested at Gribbles Veterinary Pathology 62 were from the North Island, and 27 from the South Island. The blood type distribution differed between samples from the two laboratories (p=0.029), but not between domestic short and long haired cats (p=0.50), or between the North and South Islands (p=0.76). Of the 89 cats tested at Gribbles Veterinary Pathology, 70 (79%) were type A, 18 (20%) type B, and 1 (1%) type AB; for NZVP 139/156 (89.1%) cats were type A, 16 (10.3%) type B, and 1 (0.6%) type AB. It was estimated that 18.3-31.9% of random blood transfusions would be at risk of a transfusion reaction, and neonatal isoerythrolysis would be a risk in 9.2-16.1% of random matings between non-pedigree cats. The results from this study suggest that there is a high risk of complications for a random blood transfusion between non-purebred cats in New Zealand. Neonatal isoerythrolysis should be considered an important differential diagnosis in illness or mortality in kittens during the first days of life.

  11. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
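
    A compact sketch of the two-step recipe described above: spectrally shape white Gaussian noise, then map it point-wise onto a target marginal via the Gaussian CDF and the inverse CDF of the desired distribution. The power spectrum, grid size and gamma marginal are illustrative choices, and this simple version ignores the spectral distortion introduced by the nonlinear amplitude transform, which a careful implementation would correct for.

    ```python
    # Two-step simulation of a 2D random field: prescribed spectrum, then
    # point-wise mapping to a non-Gaussian marginal.  All parameters illustrative.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    N = 256
    kx = np.fft.fftfreq(N); ky = np.fft.fftfreq(N)
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    k = np.hypot(KX, KY)

    psd = 1.0 / (1.0 + (k / 0.05)**2)**1.5         # example power spectral density
    white = rng.normal(size=(N, N))
    field = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
    field = (field - field.mean()) / field.std()   # colored, zero-mean, unit-variance Gaussian

    # Point-wise transform: Gaussian CDF -> inverse CDF of the desired marginal.
    u = stats.norm.cdf(field)
    target = stats.gamma(a=2.0, scale=1.0)         # desired non-Gaussian amplitude pdf
    signal = target.ppf(u)

    print("skewness of transformed field:", stats.skew(signal.ravel()))
    ```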

  12. Locating dayside magnetopause reconnection with exhaust ion distributions

    NASA Astrophysics Data System (ADS)

    Broll, J. M.; Fuselier, S. A.; Trattner, K. J.

    2017-05-01

    Magnetic reconnection at Earth's dayside magnetopause is essential to magnetospheric dynamics. Determining where reconnection takes place is important to understanding the processes involved, and many questions about reconnection location remain unanswered. We present a method for locating the magnetic reconnection X line at Earth's dayside magnetopause under southward interplanetary magnetic field conditions using only ion velocity distribution measurements. Particle-in-cell simulations based on Cluster magnetopause crossings produce ion velocity distributions that we propagate through a model magnetosphere, allowing us to calculate the field-aligned distance between an exhaust observation and its associated reconnection line. We demonstrate this procedure for two events and compare our results with those of the Maximum Magnetic Shear Model; we find good agreement with its results and show that when our method is applicable, it produces more precise locations than the Maximum Shear Model.

  13. Landscape-scale spatial abundance distributions discriminate core from random components of boreal lake bacterioplankton.

    PubMed

    Niño-García, Juan Pablo; Ruiz-González, Clara; Del Giorgio, Paul A

    2016-12-01

    Aquatic bacterial communities harbour thousands of coexisting taxa. To meet the challenge of discriminating between a 'core' and a sporadically occurring 'random' component of these communities, we explored the spatial abundance distribution of individual bacterioplankton taxa across 198 boreal lakes and their associated fluvial networks (188 rivers). We found that all taxa could be grouped into four distinct categories based on model statistical distributions (normal like, bimodal, logistic and lognormal). The distribution patterns across lakes and their associated river networks showed that lake communities are composed of a core of taxa whose distribution appears to be linked to in-lake environmental sorting (normal-like and bimodal categories), and a large fraction of mostly rare bacteria (94% of all taxa) whose presence appears to be largely random and linked to downstream transport in aquatic networks (logistic and lognormal categories). These rare taxa are thus likely to reflect species sorting at upstream locations, providing a perspective of the conditions prevailing in entire aquatic networks rather than only in lakes. © 2016 John Wiley & Sons Ltd/CNRS.

  14. Cover estimation and payload location using Markov random fields

    NASA Astrophysics Data System (ADS)

    Quach, Tu-Thach

    2014-02-01

    Payload location is an approach to find the message bits hidden in steganographic images, but not necessarily their logical order. Its success relies primarily on the accuracy of the underlying cover estimators and can be improved if more estimators are used. This paper presents an approach based on Markov random field to estimate the cover image given a stego image. It uses pairwise constraints to capture the natural two-dimensional statistics of cover images and forms a basis for more sophisticated models. Experimental results show that it is competitive against current state-of-the-art estimators and can locate payload embedded by simple LSB steganography and group-parity steganography. Furthermore, when combined with existing estimators, payload location accuracy improves significantly.

  15. Modulation of early cortical processing during divided attention to non-contiguous locations

    PubMed Central

    Frey, Hans-Peter; Schmid, Anita M.; Murphy, Jeremy W.; Molholm, Sophie; Lalor, Edmund C.; Foxe, John J.

    2015-01-01

    We often face the challenge of simultaneously attending to multiple non-contiguous regions of space. There is ongoing debate as to how spatial attention is divided under these situations. While for several years the predominant view was that humans could divide the attentional spotlight, several recent studies argue in favor of a unitary spotlight that rhythmically samples relevant locations. Here, this issue was addressed using high-density electrophysiology in concert with the multifocal m-sequence technique to examine visual evoked responses to multiple simultaneous streams of stimulation. Concurrently, we assayed the topographic distribution of alpha-band oscillatory mechanisms, a measure of attentional suppression. Participants performed a difficult detection task that required simultaneous attention to two stimuli in contiguous (undivided) or non-contiguous parts of space. In the undivided condition, the classical pattern of attentional modulation was observed, with increased amplitude of the early visual evoked response and increased alpha amplitude ipsilateral to the attended hemifield. For the divided condition, early visual responses to attended stimuli were also enhanced and the observed multifocal topographic distribution of alpha suppression was in line with the divided attention hypothesis. These results support the existence of divided attentional spotlights, providing evidence that the corresponding modulation occurs during initial sensory processing timeframes in hierarchically early visual regions and that suppressive mechanisms of visual attention selectively target distracter locations during divided spatial attention. PMID:24606564

  16. Stationary Random Metrics on Hierarchical Graphs Via (min,+)-type Recursive Distributional Equations

    NASA Astrophysics Data System (ADS)

    Khristoforov, Mikhail; Kleptsyn, Victor; Triestino, Michele

    2016-07-01

    This paper is inspired by the problem of understanding in a mathematical sense the Liouville quantum gravity on surfaces. Here we show how to define a stationary random metric on self-similar spaces which are the limit of nice finite graphs: these are the so-called hierarchical graphs. They possess a well-defined level structure and any level is built using a simple recursion. Stopping the construction at any finite level, we have a discrete random metric space when we set the edges to have random length (using a multiplicative cascade with fixed law m). We introduce a tool, the cut-off process, by means of which one finds that renormalizing the sequence of metrics by an exponential factor, they converge in law to a non-trivial metric on the limit space. Such limit law is stationary, in the sense that glueing together a certain number of copies of the random limit space, according to the combinatorics of the brick graph, the obtained random metric has the same law when rescaled by a random factor of law m. In other words, the stationary random metric is the solution of a distributional equation. When the measure m has continuous positive density on R+, the stationary law is unique up to rescaling and any other distribution tends to a rescaled stationary law under the iterations of the hierarchical transformation. We also investigate topological and geometric properties of the random space when m is log-normal, detecting a phase transition influenced by the branching random walk associated to the multiplicative cascade.

  17. Transfer of location-specific control to untrained locations.

    PubMed

    Weidler, Blaire J; Bugg, Julie M

    2016-11-01

    Recent research highlights a seemingly flexible and automatic form of cognitive control that is triggered by potent contextual cues, as exemplified by the location-specific proportion congruence effect--reduced compatibility effects in locations associated with a high as compared to low likelihood of conflict. We investigated just how flexible location-specific control is by examining whether novel locations effectively cue control for congruency-unbiased stimuli. In two experiments, biased (mostly compatible or mostly incompatible) training stimuli appeared in distinct locations. During a final block, unbiased (50% compatible) stimuli appeared in novel untrained locations spatially linked to biased locations. The flanker compatibility effect was reduced for unbiased stimuli in novel locations linked to a mostly incompatible compared to a mostly compatible location, indicating transfer. Transfer was observed when stimuli appeared along a linear function (Experiment 1) or in rings of a bullseye (Experiment 2). The novel transfer effects imply that location-specific control is more flexible than previously reported and further counter the complex stimulus-response learning account of location-specific proportion congruence effects. We propose that the representation and retrieval of control settings in untrained locations may depend on environmental support and the presentation of stimuli in novel locations that fall within the same categories of space as trained locations.

  18. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    NASA Astrophysics Data System (ADS)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  19. Inheritance on processes, exemplified on distributed termination detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomsen, K.S.

    1987-02-01

    A multiple inheritance mechanism on processes is designed and presented within the framework of a small object oriented language. Processes are described in classes, and the different action parts of a process inherited from different classes are executed in a coroutine-like style called alternation. The inheritance mechanism is a useful tool for factorizing the description of common aspects of processes. This is demonstrated within the domain of distributed programming by using the inheritance mechanism to factorize the description of distributed termination detection algorithms from the description of the distributed main computations for which termination is to be detected. A clear separation of concerns is obtained, and arbitrary combinations of termination detection algorithms and main computations can be formed. The same termination detection classes can also be used for more general purposes within distributed programming, such as detecting termination of each phase in a multi-phase main computation.

  20. Modulation of early cortical processing during divided attention to non-contiguous locations.

    PubMed

    Frey, Hans-Peter; Schmid, Anita M; Murphy, Jeremy W; Molholm, Sophie; Lalor, Edmund C; Foxe, John J

    2014-05-01

    We often face the challenge of simultaneously attending to multiple non-contiguous regions of space. There is ongoing debate as to how spatial attention is divided under these situations. Whereas, for several years, the predominant view was that humans could divide the attentional spotlight, several recent studies argue in favor of a unitary spotlight that rhythmically samples relevant locations. Here, this issue was addressed by the use of high-density electrophysiology in concert with the multifocal m-sequence technique to examine visual evoked responses to multiple simultaneous streams of stimulation. Concurrently, we assayed the topographic distribution of alpha-band oscillatory mechanisms, a measure of attentional suppression. Participants performed a difficult detection task that required simultaneous attention to two stimuli in contiguous (undivided) or non-contiguous parts of space. In the undivided condition, the classic pattern of attentional modulation was observed, with increased amplitude of the early visual evoked response and increased alpha amplitude ipsilateral to the attended hemifield. For the divided condition, early visual responses to attended stimuli were also enhanced, and the observed multifocal topographic distribution of alpha suppression was in line with the divided attention hypothesis. These results support the existence of divided attentional spotlights, providing evidence that the corresponding modulation occurs during initial sensory processing time-frames in hierarchically early visual regions, and that suppressive mechanisms of visual attention selectively target distracter locations during divided spatial attention. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring

    PubMed Central

    Miner, Daniel; Triesch, Jochen

    2016-01-01

    Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring. PMID:26866369

  2. Distribution of shortest cycle lengths in random networks

    NASA Astrophysics Data System (ADS)

    Bonneau, Haggai; Hassid, Aviv; Biham, Ofer; Kühn, Reimer; Katzav, Eytan

    2017-12-01

    We present analytical results for the distribution of shortest cycle lengths (DSCL) in random networks. The approach is based on the relation between the DSCL and the distribution of shortest path lengths (DSPL). We apply this approach to configuration model networks, for which analytical results for the DSPL were obtained before. We first calculate the fraction of nodes in the network which reside on at least one cycle. Conditioning on being on a cycle, we provide the DSCL over ensembles of configuration model networks with degree distributions which follow a Poisson distribution (Erdős-Rényi network), degenerate distribution (random regular graph), and a power-law distribution (scale-free network). The mean and variance of the DSCL are calculated. The analytical results are found to be in very good agreement with the results of computer simulations.
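
    For a quick empirical counterpart to the analytical results above, the distribution of shortest cycle lengths over nodes can be estimated by brute force on a sampled Erdős-Rényi graph; this sketch uses networkx with illustrative parameters and is not the paper's analytical method.

    ```python
    # Brute-force estimate of the distribution of shortest cycle lengths (DSCL)
    # over nodes of an Erdos-Renyi graph; parameters are illustrative.
    import collections
    import networkx as nx

    G = nx.erdos_renyi_graph(n=500, p=3.0 / 500, seed=0)   # mean degree c = 3

    def shortest_cycle_length(G, v):
        """Length of the shortest cycle through v, or None if v lies on no cycle."""
        best = None
        for u in list(G.neighbors(v)):
            G.remove_edge(v, u)
            try:
                d = nx.shortest_path_length(G, v, u)
                best = d + 1 if best is None else min(best, d + 1)
            except nx.NetworkXNoPath:
                pass
            G.add_edge(v, u)
        return best

    lengths = [shortest_cycle_length(G, v) for v in G]
    on_cycle = [l for l in lengths if l is not None]
    print("fraction of nodes on a cycle:", len(on_cycle) / G.number_of_nodes())
    print("DSCL histogram:", collections.Counter(on_cycle))
    ```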

  3. Narrow-band generation in random distributed feedback fiber laser.

    PubMed

    Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V

    2013-07-15

    Narrow-band emission with a spectral width down to ~0.05 nm is achieved in a random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times narrower than that of other random distributed feedback fiber lasers demonstrated to date. The random DFB laser with a Fabry-Perot interferometer filter provides simultaneously multi-wavelength and narrow-band (within each line) generation, with the possibility of further wavelength tuning.

  4. A comparison of methods for estimating the random effects distribution of a linear mixed model.

    PubMed

    Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert

    2010-12-01

    This article reviews various recently suggested approaches to estimating the random effects distribution in a linear mixed model, i.e. (1) the smoothing-by-roughening approach of Shen and Louis,(1) (2) the semi-non-parametric approach of Zhang and Davidian,(2) (3) the heterogeneity model of Verbeke and Lesaffre,(3) and (4) a flexible approach of Ghidey et al.(4) These four approaches are compared via an extensive simulation study. We conclude that, for the considered cases, the approach of Ghidey et al.(4) often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.

  5. Random parameter models of interstate crash frequencies by severity, number of vehicles involved, collision and location type.

    PubMed

    Venkataraman, Narayan; Ulfarsson, Gudmundur F; Shankar, Venky N

    2013-10-01

    A nine-year (1999-2007) continuous panel of crash histories on interstates in Washington State, USA, was used to estimate random parameter negative binomial (RPNB) models for various aggregations of crashes. A total of 21 different models were assessed in terms of four ways to aggregate crashes, by: (a) severity, (b) number of vehicles involved, (c) crash type, and by (d) location characteristics. The models within these aggregations include specifications for all severities (property damage only, possible injury, evident injury, disabling injury, and fatality), number of vehicles involved (one-vehicle to five-or-more-vehicle), crash type (sideswipe, same direction, overturn, head-on, fixed object, rear-end, and other), and location types (urban interchange, rural interchange, urban non-interchange, rural non-interchange). A total of 1153 directional road segments comprising the seven Washington State interstates were analyzed, yielding statistical models of crash frequency based on 10,377 observations. These results suggest that in general there was a significant improvement in log-likelihood when using RPNB compared to a fixed parameter negative binomial baseline model. Heterogeneity effects are most noticeable for lighting type, road curvature, and traffic volume (ADT). Median lighting or right-side lighting are linked to increased crash frequencies in many models for more than half of the road segments compared to both-sides lighting. Both-sides lighting thereby appears to generally lead to a safety improvement. Traffic volume has a random parameter but the effect is always toward increasing crash frequencies as expected. However, the fact that the effect is random shows that the effect of traffic volume on crash frequency is complex and varies by road segment. The number of lanes has a random parameter effect only in the interchange type models. The results show that road segment-specific insights into crash frequency occurrence can lead to improved design policy and

  6. Hierarchy of evidence: differences in results between non-randomized studies and randomized trials in patients with femoral neck fractures.

    PubMed

    Bhandari, Mohit; Tornetta, Paul; Ellis, Thomas; Audige, Laurent; Sprague, Sheila; Kuo, Jonathann C; Swiontkowski, Marc F

    2004-01-01

    There have been a number of non-randomized studies comparing arthroplasty with internal fixation in patients with femoral neck fractures. However, there remains considerable debate about whether the results of non-randomized studies are consistent with the results of randomized, controlled trials. Given the economic burden of hip fractures, it remains essential to identify therapies to improve outcomes; however, whether data from non-randomized studies of an intervention should be used to guide patient care remains unclear. We aimed to determine whether the pooled results of mortality and revision surgery among non-randomized studies were similar to those of randomized trials in studies comparing arthroplasty with internal fixation in patients with femoral neck fractures. We conducted a Medline search from 1969 to June 2002, identifying both randomized and non-randomized studies comparing internal fixation with arthroplasty in patients with femoral neck fractures. Additional strategies to identify relevant articles included Cochrane database, SCISEARCH, textbooks, annual meeting programs, and content experts. We abstracted information on mortality and revision rates in each study and compared the pooled results between non-randomized and randomized studies. In addition, we explored potential reasons for dissimilar results between the two study designs. We identified 140 citations that addressed the general topic of comparison of arthroplasty and internal fixation for hip fracture. Of these, 27 studies met the eligibility criteria, 13 of which were non-randomized studies and 14 of which were randomized trials. Mortality data was available in all 13 non-randomized studies ( n=3108 patients) and in 12 randomized studies ( n=1767 patients). Non-randomized studies overestimated the risk of mortality by 40% when compared with the results of randomized trials (relative risk 1.44 vs 1.04, respectively). Information on revision risk was available in 9 non-randomized studies

  7. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This

  8. Multivariate non-normally distributed random variables in climate research - introduction to the copula approach

    NASA Astrophysics Data System (ADS)

    Schölzel, C.; Friederichs, P.

    2008-10-01

    Probability distributions of multivariate random variables are generally more complex than their univariate counterparts, owing to possible nonlinear dependence between the random variables. One approach to this problem is the use of copulas, which have become popular over recent years, especially in fields like econometrics, finance, risk management, and insurance. Since this newly emerging field includes various practices, a controversial discussion, and a vast body of literature, it is difficult to get an overview. The aim of this paper is therefore to provide a brief overview of copulas for application in meteorology and climate research. We examine the advantages and disadvantages compared to alternative approaches such as mixture models, summarize the current problem of goodness-of-fit (GOF) tests for copulas, and discuss the connection with multivariate extremes. An application to station data shows the simplicity and the capabilities, as well as the limitations, of this approach. Observations of daily precipitation and temperature are fitted to a bivariate model and demonstrate that copulas are a valuable complement to the commonly used methods.
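
    A bare-bones illustration of the copula idea for bivariate station data: separate the marginals from the dependence structure. The Gaussian copula, the synthetic "temperature" and "precipitation" series, and all parameters below are placeholders, not the paper's data or fitted model.

    ```python
    # Gaussian-copula sketch: ranks -> pseudo-uniforms -> normal scores -> correlation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n = 1000
    temp = rng.normal(10, 5, n)                          # fake daily temperature
    precip = stats.gamma(a=0.8, scale=4).rvs(n, random_state=rng) * (rng.random(n) < 0.4)

    # 1) transform each margin to pseudo-uniforms via ranks (empirical CDF)
    u1 = stats.rankdata(temp) / (n + 1)
    u2 = stats.rankdata(precip) / (n + 1)

    # 2) map to normal scores and estimate the copula correlation
    z = stats.norm.ppf(np.column_stack([u1, u2]))
    rho = np.corrcoef(z.T)[0, 1]
    print("fitted Gaussian-copula correlation:", rho)

    # 3) simulate new pairs with the same dependence; any marginals can be re-applied
    z_sim = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    u_sim = stats.norm.cdf(z_sim)                        # uniforms with fitted dependence
    ```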

  9. Geographic location, network patterns and population distribution of rural settlements in Greece

    NASA Astrophysics Data System (ADS)

    Asimakopoulos, Avraam; Mogios, Emmanuel; Xenikos, Dimitrios G.

    2016-10-01

    Our work addresses the problem of how social networks are embedded in space, by studying the spread of human population over complex geomorphological terrain. We focus on villages or small cities up to a few thousand inhabitants located in mountainous areas in Greece. This terrain presents a familiar tree-like structure of valleys and land plateaus. Cities are found more often at lower altitudes and exhibit a preference for southern orientation. Furthermore, the population generally avoids flat land plateaus and river beds, preferring locations slightly uphill, away from the plateau edge. Despite the location diversity regarding geomorphological parameters, we find certain quantitative norms when we examine location and population distributions relative to the (man-made) transportation network. In particular, settlements at radial distance ℓ away from road network junctions have the same mean altitude, practically independent of ℓ ranging from a few meters to 10 km. Similarly, the distribution of the settlement population at any given ℓ is the same for all ℓ. Finally, the cumulative distribution of the number of rural cities n(ℓ) is fitted to the Weibull distribution, suggesting that human decisions for creating settlements may parallel mechanisms typically attributed to this particular statistical distribution.
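
    A sketch of the kind of fit mentioned in the last sentence: a Weibull distribution fitted to settlement distances from road-network junctions with scipy. The distances below are synthetic placeholders, not the Greek settlement data.

    ```python
    # Weibull fit of (synthetic) settlement-to-junction distances.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    distances_km = stats.weibull_min(c=1.4, scale=3.0).rvs(2000, random_state=rng)  # placeholder data

    c, loc, scale = stats.weibull_min.fit(distances_km, floc=0)   # fix location at 0
    print(f"shape c = {c:.2f}, scale = {scale:.2f}")

    # Compare empirical and fitted cumulative distributions at a few distances
    for x in (1, 2, 5, 10):
        emp = np.mean(distances_km <= x)
        fit = stats.weibull_min.cdf(x, c, loc=0, scale=scale)
        print(f"l = {x:>2} km  empirical CDF = {emp:.3f}  Weibull CDF = {fit:.3f}")
    ```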

  10. Non-urban mobile radio market demand forecast

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Cooper, J.

    1982-01-01

    A national nonmetropolitan land mobile traffic model for 1990-2000 addresses user classes, density classes, traffic mix statistics, distance distribution, geographic distribution, price elasticity, and service quality elasticity. Traffic demands for business, special industrial, and police were determined on the basis of surveys in 73 randomly selected nonurban counties. The selected services represent 69% of total demand. The results were extrapolated to all services in the non-SMSA areas of the contiguous United States. Radiotelephone services were considered separately. Total non-SMSA mobile radio demand (one way) estimates are given. General functional requirements include: hand portability, privacy, reduction of blind spots, two way data transmission, position location, slow scan imagery.

  11. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.

  12. Non-Random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Westphal, Andrew J.; Bastien, Ronald K.; Borg, Janet; Bridges, John; Brownlee, Donald E.; Burchell, Mark J.; Cheng, Andrew F.; Clark, Benton C.; Djouadi, Zahia; Floss, Christine

    2007-01-01

    In January 2004, the Stardust spacecraft flew through the coma of comet 81P/Wild 2 at a relative speed of 6.1 km/s. Cometary dust was collected in a 0.1 sq m collector consisting of aerogel tiles and aluminum foils. Two years later, the samples were successfully returned to Earth and recovered. We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than approximately 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a noncometary impact on the spacecraft bus just forward of the collector. Here we summarize the observations, and review the evidence for and against three scenarios that we have considered for explaining the impact clustering found on the Stardust aerogel and foil collectors.

  13. Discrete Wavelet Transform for Fault Locations in Underground Distribution System

    NASA Astrophysics Data System (ADS)

    Apisit, C.; Ngaopitakkul, A.

    2010-10-01

    In this paper, a technique for detecting faults in underground distribution systems is presented. The Discrete Wavelet Transform (DWT), based on travelling waves, is employed to detect the high frequency components and to identify fault locations in the underground distribution system. The first peak time obtained from the faulty bus is employed for calculating the distance of the fault from the sending end. The validity of the proposed technique is tested with various fault inception angles, fault locations and faulty phases. The results show that the proposed technique provides satisfactory accuracy and will be very useful in the development of power system protection schemes.
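
    A heavily hedged sketch of the detection step only: first-level DWT detail coefficients are used to time the arrival of a fault-generated high-frequency transient. The test signal, wavelet choice, propagation speed, and the distance formula (d = v · t_peak, with time referenced to fault inception) are assumptions for illustration and do not reproduce the paper's exact procedure.

    ```python
    # DWT-based timing of a travelling-wave arrival; all parameters illustrative.
    import numpy as np
    import pywt

    fs = 1.0e6                                  # sampling rate, 1 MHz (assumed)
    v = 1.8e8                                   # wave speed in the cable, m/s (assumed)
    t = np.arange(0, 2e-3, 1 / fs)

    signal = np.sin(2 * np.pi * 50 * t)         # 50 Hz voltage at the measuring bus
    arrival = 8.0e-5                            # surge arrives 80 us after fault inception
    signal += 0.3 * np.exp(-(t - arrival) * 2e4) * (t >= arrival)   # injected surge

    # One-level DWT: the detail coefficients isolate the high-frequency transient
    approx, detail = pywt.dwt(signal, "db4")
    peak_index = np.argmax(np.abs(detail))
    t_peak = peak_index * 2 / fs                # each detail sample spans 2 raw samples
    print(f"detected arrival ≈ {t_peak * 1e6:.0f} us, "
          f"estimated distance ≈ {v * t_peak:.0f} m")
    ```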

  14. Robustness of location estimators under t-distributions: a literature review

    NASA Astrophysics Data System (ADS)

    Sumarni, C.; Sadik, K.; Notodiputro, K. A.; Sartono, B.

    2017-03-01

    The assumption of normality is commonly used in parameter estimation in statistical modelling, but this assumption is very sensitive to outliers. The t-distribution is more robust than the normal distribution since t-distributions have longer tails. The robustness measures of location estimators under t-distributions are reviewed and discussed in this paper. For the purpose of illustration, we use onion yield data that include outliers as a case study and show that the t model produces a better fit than the normal model.
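
    A quick illustration of the robustness point: the sample mean versus the location estimate from a maximum-likelihood t fit on data containing outliers. The data are synthetic stand-ins (the onion yield data are not reproduced here).

    ```python
    # Compare location estimators on contaminated data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    clean = rng.normal(loc=50.0, scale=5.0, size=95)      # synthetic "yield"-like values
    outliers = np.array([120.0, 130.0, 140.0, 150.0, 160.0])
    y = np.concatenate([clean, outliers])

    mean_est = y.mean()
    median_est = np.median(y)
    df, loc_t, scale_t = stats.t.fit(y)                   # ML fit of a t location-scale model

    print(f"sample mean : {mean_est:.2f}")
    print(f"median      : {median_est:.2f}")
    print(f"t-model loc : {loc_t:.2f}  (df = {df:.1f})")
    ```

    The t-based location estimate typically stays close to the bulk of the data, while the sample mean is pulled toward the outliers.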

  15. Probabilistic SSME blades structural response under random pulse loading

    NASA Technical Reports Server (NTRS)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
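
    A schematic Monte Carlo rendering of the random pulse model described above: Poisson pulse arrivals, zero-mean normal amplitudes, and a location chosen with equal probability among three points near the blade tip. The structural response is reduced to a weighted amplitude with invented influence coefficients, so this only illustrates how a response distribution is built up, not the blade's actual structural analysis.

    ```python
    # Monte Carlo simulation of random pulse loading; all parameters illustrative.
    import numpy as np

    rng = np.random.default_rng(7)
    rate = 20.0            # mean pulse arrival rate, pulses per second (assumed)
    sigma_amp = 1.0        # std of the normal pulse amplitude (assumed)
    T = 1.0                # observation window in seconds
    influence = {0: 1.0, 1: 0.7, 2: 0.4}   # invented stress influence per location

    peak_stress = []
    for _ in range(10_000):
        n_pulses = rng.poisson(rate * T)
        amps = rng.normal(0.0, sigma_amp, size=n_pulses)
        locs = rng.integers(0, 3, size=n_pulses)
        stresses = np.abs(amps) * np.array([influence[l] for l in locs])
        peak_stress.append(stresses.max() if n_pulses else 0.0)

    peak_stress = np.array(peak_stress)
    print("mean peak stress:", peak_stress.mean())
    print("99th percentile :", np.percentile(peak_stress, 99))
    ```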

  16. Probability Distributions for Random Quantum Operations

    NASA Astrophysics Data System (ADS)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.

  17. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats

    PubMed Central

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A.; Bortolotti, Gary R.; Tella, José L.

    2015-01-01

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute among habitats according to their tolerance to human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were correlated to FIDs. Survival was twice as high in urban as in rural birds and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked CORTf-survival relationship in rural ones. These results evidence that urban life does not constitute an additional source of stress for urban individuals, as shown by their near identical CORTf values compared with rural conspecifics supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes. PMID:26348294

  18. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats.

    PubMed

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A; Bortolotti, Gary R; Tella, José L

    2015-09-08

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute among habitats according to their tolerance to human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were correlated to FIDs. Survival was twice as high in urban as in rural birds and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked CORTf-survival relationship in rural ones. These results evidence that urban life does not constitute an additional source of stress for urban individuals, as shown by their near identical CORTf values compared with rural conspecifics supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes.

  19. Super Generalized Central Limit Theorem —Limit Distributions for Sums of Non-identical Random Variables with Power Laws—

    NASA Astrophysics Data System (ADS)

    Shintani, Masaru; Umeno, Ken

    2018-04-01

    The power law is present ubiquitously in nature and in our societies. Therefore, it is important to investigate the characteristics of power laws in the current era of big data. In this paper we prove that the superposition of non-identical stochastic processes with power laws converges in density to a unique stable distribution. This property can be used to explain the universality of stable laws, namely that the sums of the logarithmic returns of non-identical stock price fluctuations follow stable distributions.

  20. Multiobjective assessment of distributed energy storage location in electricity networks

    NASA Astrophysics Data System (ADS)

    Ribeiro Gonçalves, José António; Neves, Luís Pires; Martins, António Gomes

    2017-07-01

    This paper presents a methodology to provide information to a decision maker on the associated impacts, both economic and technical, of possible management schemes of storage units, in order to choose the best location of distributed storage devices, using a multiobjective optimisation approach based on genetic algorithms. The methodology was applied to a case study, a known distribution network model in which the installation of distributed storage units was tested, using lithium-ion batteries. The obtained results show a significant influence of the charging/discharging profile of batteries on the choice of their best location, as well as the relevance that these choices may have for the different network management objectives, for example, for reducing network energy losses or minimising voltage deviations. Results also show that an energy-only service is difficult to make cost-effective with the tested systems, owing both to capital cost and to conversion efficiency.

  1. High-resolution characterization of sequence signatures due to non-random cleavage of cell-free DNA.

    PubMed

    Chandrananda, Dineika; Thorne, Natalie P; Bahlo, Melanie

    2015-06-17

    High-throughput sequencing of cell-free DNA fragments found in human plasma has been used to non-invasively detect fetal aneuploidy, monitor organ transplants and investigate tumor DNA. However, many biological properties of this extracellular genetic material remain unknown. Research that further characterizes circulating DNA could substantially increase its diagnostic value by allowing the application of more sophisticated bioinformatics tools that lead to an improved signal to noise ratio in the sequencing data. In this study, we investigate various features of cell-free DNA in plasma using deep-sequencing data from two pregnant women (>70X, >50X) and compare them with matched cellular DNA. We utilize a descriptive approach to examine how the biological cleavage of cell-free DNA affects different sequence signatures such as fragment lengths, sequence motifs at fragment ends and the distribution of cleavage sites along the genome. We show that the size distributions of these cell-free DNA molecules are dependent on their autosomal and mitochondrial origin as well as the genomic location within chromosomes. DNA mapping to particular microsatellites and alpha repeat elements display unique size signatures. We show how cell-free fragments occur in clusters along the genome, localizing to nucleosomal arrays and are preferentially cleaved at linker regions by correlating the mapping locations of these fragments with ENCODE annotation of chromatin organization. Our work further demonstrates that cell-free autosomal DNA cleavage is sequence dependent. The region spanning up to 10 positions on either side of the DNA cleavage site show a consistent pattern of preference for specific nucleotides. This sequence motif is present in cleavage sites localized to nucleosomal cores and linker regions but is absent in nucleosome-free mitochondrial DNA. These background signals in cell-free DNA sequencing data stem from the non-random biological cleavage of these fragments. This

  2. Randomness determines practical security of BB84 quantum key distribution.

    PubMed

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-10

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but the practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed by utilizing imperfect devices, but a general security analysis model against all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, strong randomness attack and weak randomness attack. We prove security of the BB84 protocol under these randomness attacking models, and these results can be applied to guarantee the security of the practical quantum key distribution system.

  3. Randomness determines practical security of BB84 quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but the practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed by utilizing imperfect devices, but a general security analysis model against all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, strong randomness attack and weak randomness attack. We prove security of the BB84 protocol under these randomness attacking models, and these results can be applied to guarantee the security of the practical quantum key distribution system.

  4. Randomness determines practical security of BB84 quantum key distribution

    PubMed Central

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-01-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but the practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed by utilizing imperfect devices, but a general security analysis model against all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, strong randomness attack and weak randomness attack. We prove security of the BB84 protocol under these randomness attacking models, and these results can be applied to guarantee the security of the practical quantum key distribution system. PMID:26552359

  5. Simulation and analysis of scalable non-Gaussian statistically anisotropic random functions

    NASA Astrophysics Data System (ADS)

    Riva, Monica; Panzeri, Marco; Guadagnini, Alberto; Neuman, Shlomo P.

    2015-12-01

    Many earth and environmental (as well as other) variables, Y, and their spatial or temporal increments, ΔY, exhibit non-Gaussian statistical scaling. Previously we were able to capture some key aspects of such scaling by treating Y or ΔY as standard sub-Gaussian random functions. We were however unable to reconcile two seemingly contradictory observations, namely that whereas sample frequency distributions of Y (or its logarithm) exhibit relatively mild non-Gaussian peaks and tails, those of ΔY display peaks that grow sharper and tails that become heavier with decreasing separation distance or lag. Recently we overcame this difficulty by developing a new generalized sub-Gaussian model which captures both behaviors in a unified and consistent manner, exploring it on synthetically generated random functions in one dimension (Riva et al., 2015). Here we extend our generalized sub-Gaussian model to multiple dimensions, present an algorithm to generate corresponding random realizations of statistically isotropic or anisotropic sub-Gaussian functions and illustrate it in two dimensions. We demonstrate the accuracy of our algorithm by comparing ensemble statistics of Y and ΔY (such as, mean, variance, variogram and probability density function) with those of Monte Carlo generated realizations. We end by exploring the feasibility of estimating all relevant parameters of our model by analyzing jointly spatial moments of Y and ΔY obtained from a single realization of Y.

  6. Mobile agent location in distributed environments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Argyropoulos, I. P.

    2012-12-01

    An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they can be capable of both behaving autonomously and showing an elementary decision ability regarding movement and some specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Also, agents are used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id to discriminate itself from other agents and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for the purpose of position location of mobile agents. The most basic of all employs a fixed computing node, which acts as agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node, responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. A location method, which is investigated in this paper, attempts to exploit this model.

  7. Logistics Distribution Center Location Evaluation Based on Genetic Algorithm and Fuzzy Neural Network

    NASA Astrophysics Data System (ADS)

    Shao, Yuxiang; Chen, Qing; Wei, Zhenhua

    Logistics distribution center location evaluation is a dynamic, fuzzy, open and complicated nonlinear system, which makes it difficult to evaluate the distribution center location by traditional analysis methods. The paper proposes a distribution center location evaluation system which uses a fuzzy neural network combined with a genetic algorithm. In this model, the neural network is adopted to construct the fuzzy system. By using the genetic algorithm, the parameters of the neural network are optimized and trained so as to improve the fuzzy system's abilities of self-learning and self-adaptation. Finally, the sampled data are trained and tested in Matlab. The simulation results indicate that the proposed identification model has very small errors.

  8. Response measurement by laser Doppler vibrometry in vibration qualification tests with non-Gaussian random excitation

    NASA Astrophysics Data System (ADS)

    Troncossi, M.; Di Sante, R.; Rivola, A.

    2016-10-01

    In the field of vibration qualification testing, random excitations are typically imposed on the tested system in terms of a power spectral density (PSD) profile. This is one of the most popular ways to control the shaker or slip table for durability tests. However, these excitations (and the corresponding system responses) exhibit a Gaussian probability distribution, whereas not all real-life excitations are Gaussian, causing the response to be also non-Gaussian. In order to introduce non-Gaussian peaks, a further parameter, i.e., kurtosis, has to be controlled in addition to the PSD. However, depending on the specimen behaviour and input signal characteristics, the use of non-Gaussian excitations with high kurtosis and a given PSD does not automatically imply a non-Gaussian stress response. For an experimental investigation of these coupled features, suitable measurement methods need to be developed in order to estimate the stress amplitude response at critical failure locations and consequently evaluate the input signals most representative of real-life, non-Gaussian excitations. In this paper, a simple test rig with a notched cantilevered specimen was developed to measure the response and examine the kurtosis values in the case of stationary Gaussian, stationary non-Gaussian, and burst non-Gaussian excitation signals. The laser Doppler vibrometry technique was used in this type of test for the first time, in order to estimate the specimen stress amplitude response as proportional to the differential displacement measured at the notch section ends. A method based on the use of measurements using accelerometers to correct for the occasional signal dropouts occurring during the experiment is described. The results demonstrate the ability of the test procedure to evaluate the output signal features and therefore to select the most appropriate input signal for the fatigue test.

  9. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-dimensional random variables if their joint probability distribution is known.
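
    The recursive construction described above is, in modern terms, a conditional-distribution (inverse-transform) method. A minimal two-dimensional sketch, assuming a toy joint law (X1 ~ Exp(1) and X2 given X1 = x ~ Exp(1 + x)) chosen only because both inverse conditional CDFs are available in closed form; it is not an example from the report itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Independent uniforms on (0, 1), as in the construction above.
    u1 = rng.random(n)
    u2 = rng.random(n)

    # x1 = f1(u1): inverse CDF of the marginal law Exp(1).
    x1 = -np.log1p(-u1)

    # x2 = f2(u1, u2): inverse CDF of the conditional law Exp(1 + x1).
    x2 = -np.log1p(-u2) / (1.0 + x1)

    # Sanity checks against the closed-form moments of this toy joint law:
    # E[X1] = 1 and E[X2] = E[1/(1 + X1)] = e * E1(1), roughly 0.596.
    print("E[X1] ~", x1.mean())
    print("E[X2] ~", x2.mean())
    ```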

  10. Non-linear continuous time random walk models★

    NASA Astrophysics Data System (ADS)

    Stage, Helena; Fedotov, Sergei

    2017-11-01

    A standard assumption of continuous time random walk (CTRW) processes is that there are no interactions between the random walkers, such that we obtain the celebrated linear fractional equation either for the probability density function of the walker at a certain position and time, or the mean number of walkers. The question arises how one can extend this equation to the non-linear case, where the random walkers interact. The aim of this work is to take into account this interaction under a mean-field approximation where the statistical properties of the random walker depend on the mean number of walkers. The implementation of these non-linear effects within the CTRW integral equations or fractional equations poses difficulties, leading to the alternative methodology we present in this work. We are concerned with non-linear effects which may either inhibit anomalous effects or induce them where they otherwise would not arise. Inhibition of these effects corresponds to a decrease in the waiting times of the random walkers, be this due to overcrowding, competition between walkers or an inherent carrying capacity of the system. Conversely, induced anomalous effects present longer waiting times and are consistent with symbiotic, collaborative or social walkers, or indirect pinpointing of favourable regions by their attractiveness. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.

  11. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method to generate planning scenarios, based on a data-driven K-means clustering analysis algorithm, for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto-optimal front is obtained through a self-adaptive genetic algorithm (GA) and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China and the results are discussed.
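
    The TOPSIS ranking step mentioned above can be written compactly. A minimal sketch under stated assumptions: the objective matrix, equal weights, and the cost/benefit split below are hypothetical stand-ins, not values from the paper.

    ```python
    import numpy as np

    def topsis(scores, weights, benefit):
        """Rank alternatives (rows) over criteria (columns) by closeness to the ideal point."""
        scores = np.asarray(scores, dtype=float)
        # Vector-normalize each criterion, then apply the weights.
        v = scores / np.linalg.norm(scores, axis=0) * weights
        # Ideal and anti-ideal points depend on whether a criterion is a benefit.
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)
        return np.argsort(-closeness), closeness

    # Hypothetical Pareto-front plans: [network losses, cost, voltage offset, PV profit].
    plans = [[1.2, 300, 0.04, 90],
             [1.0, 340, 0.05, 95],
             [1.4, 280, 0.03, 85]]
    weights = np.array([0.25, 0.25, 0.25, 0.25])      # equal weights, for illustration
    benefit = np.array([False, False, False, True])   # only profit is to be maximized

    order, closeness = topsis(plans, weights, benefit)
    print("ranking (best first):", order, "closeness:", np.round(closeness, 3))
    ```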

  12. Random diffusivity from stochastic equations: comparison of two models for Brownian yet non-Gaussian diffusion

    NASA Astrophysics Data System (ADS)

    Sposini, Vittoria; Chechkin, Aleksei V.; Seno, Flavio; Pagnini, Gianni; Metzler, Ralf

    2018-04-01

    A considerable number of systems have recently been reported in which Brownian yet non-Gaussian dynamics was observed. These are processes characterised by a linear growth in time of the mean squared displacement, yet the probability density function of the particle displacement is distinctly non-Gaussian, and often of exponential (Laplace) shape. This apparently ubiquitous behaviour observed in very different physical systems has been interpreted as resulting from diffusion in inhomogeneous environments and mathematically represented through a variable, stochastic diffusion coefficient. Indeed different models describing a fluctuating diffusivity have been studied. Here we present a new view of the stochastic basis describing time-dependent random diffusivities within a broad spectrum of distributions. Concretely, our study is based on the very generic class of the generalised Gamma distribution. Two models for the particle spreading in such random diffusivity settings are studied. The first belongs to the class of generalised grey Brownian motion while the second follows from the idea of diffusing diffusivities. The two processes exhibit significant characteristics which reproduce experimental results from different biological and physical systems. We promote these two physical models for the description of stochastic particle motion in complex environments.
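
    A toy numerical illustration of the "diffusing diffusivity" picture discussed above (not the authors' generalised-Gamma analysis): each particle carries a diffusivity drawn from a Gamma distribution, so the ensemble displacement distribution develops exponential (Laplace-like) tails even though each trajectory is Brownian and the MSD stays linear in time.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    n_particles = 50_000
    t = 1.0

    # Superstatistical toy model: diffusivity D varies between particles.
    D = rng.gamma(shape=1.0, scale=1.0, size=n_particles)  # exponential mixture of diffusivities

    # One-dimensional Brownian displacement at time t, conditional on D.
    x = rng.normal(0.0, np.sqrt(2.0 * D * t))

    # The MSD still grows linearly (Brownian), but the displacement PDF is non-Gaussian:
    # for an exponential mixture of Gaussian variances the marginal is Laplace,
    # with excess kurtosis exactly 3.
    msd = np.mean(x**2)
    kurt = np.mean(x**4) / msd**2 - 3.0
    print(f"MSD ~ {msd:.3f} (theory 2*E[D]*t = 2), excess kurtosis ~ {kurt:.2f} (theory 3)")
    ```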

  13. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  14. Optimization of pressure gauge locations for water distribution systems using entropy theory.

    PubMed

    Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon

    2012-12-01

    It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective and quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes as a result of demand change at a specific node using entropy theory. Two demand-change cases are considered: one in which demand at all nodes is at peak load (via a peak factor), and one in which demand changes follow a normal distribution whose mean is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes practically at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information it gives to (giving entropy) and receives from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing a sensitivity analysis based on the study results. These analysis results support the following two conclusions. Firstly, the installation priority of pressure gauges in water distribution networks can be determined with a more objective standard through entropy theory. Secondly, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
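
    A rough sketch of the entropy-ranking idea described above, not the paper's exact formulation: given a matrix of pressure changes, each node's outgoing ("giving") and incoming ("receiving") information content is scored with Shannon entropy and the two scores are combined to rank candidate gauge locations. The pressure-change matrix here is a random hypothetical stand-in; in practice it would come from an EPANET emitter-based simulation.

    ```python
    import numpy as np

    def shannon_entropy(weights):
        """Entropy of a non-negative weight vector, normalized to a probability mass."""
        p = np.abs(weights) / np.abs(weights).sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    # Hypothetical pressure-change matrix: dP[i, j] = pressure change at node j
    # when demand changes at node i (would come from a hydraulic simulation).
    rng = np.random.default_rng(3)
    n_nodes = 6
    dP = np.abs(rng.normal(size=(n_nodes, n_nodes)))

    giving = np.array([shannon_entropy(dP[i, :]) for i in range(n_nodes)])     # info a node gives
    receiving = np.array([shannon_entropy(dP[:, j]) for j in range(n_nodes)])  # info a node receives
    total = giving + receiving

    ranking = np.argsort(-total)
    print("gauge-installation priority (node indices, best first):", ranking)
    ```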

  15. Spatial Distribution of Phase Singularities in Optical Random Vector Waves.

    PubMed

    De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L

    2016-08-26

    Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.

  16. Non-randomized response model for sensitive survey with noncompliance.

    PubMed

    Wu, Qin; Tang, Man-Lai

    2016-12-01

    Collecting representative data on sensitive issues has long been problematic and challenging in public health prevalence investigation (e.g. non-suicidal self-injury), medical research (e.g. drug habits), social issue studies (e.g. history of child abuse), and their interdisciplinary studies (e.g. premarital sexual intercourse). Alternative data collection techniques that can be adopted to study sensitive questions validly become more important and necessary. As an alternative to the famous Warner randomized response model, the non-randomized response triangular model has recently been developed to encourage participants to provide truthful responses in surveys involving sensitive questions. Unfortunately, both randomized and non-randomized response models could underestimate the proportion of subjects with the sensitive characteristic, as some respondents do not believe that these techniques can protect their anonymity. As a result, some authors hypothesized that lack of trust and noncompliance should be highest among those who have the most to lose and the least use for the anonymity provided by these techniques. Some researchers noticed the existence of noncompliance and proposed new models to measure noncompliance in order to obtain reliable information. However, all proposed methods were based on randomized response models, which require randomizing devices, restrict the survey to face-to-face interviews, and lack reproducibility. Taking noncompliance into consideration, we introduce new non-randomized response techniques in which no covariate is required. Asymptotic properties of the proposed estimates for the sensitive characteristic as well as noncompliance probabilities are developed. Our proposed techniques are empirically shown to yield accurate estimates for both sensitive and noncompliance probabilities. A real example about premarital sex among university students is used to demonstrate our methodologies. © The Author(s) 2014.

  17. Distributed Detection with Collisions in a Random, Single-Hop Wireless Sensor Network

    DTIC Science & Technology

    2013-05-26

    Approved for public release; distribution is unlimited. Report title: Distributed Detection with Collisions in a Random, Single-Hop Wireless Sensor Network. Authors: Gene T. Whipps (U.S. Army Research Laboratory, Adelphi, MD 20783, and The Ohio State University), Emre Ertin and Randolph L. Moses (The Ohio State University). The report considers the problem of distributed detection with collisions in a random, single-hop wireless sensor network.

  18. Estimating Genomic Distance from DNA Sequence Location in Cell Nuclei by a Random Walk Model

    NASA Astrophysics Data System (ADS)

    van den Engh, Ger; Sachs, Rainer; Trask, Barbara J.

    1992-09-01

    The folding of chromatin in interphase cell nuclei was studied by fluorescent in situ hybridization with pairs of unique DNA sequence probes. The sites of DNA sequences separated by 100 to 2000 kilobase pairs (kbp) are distributed in interphase chromatin according to a random walk model. This model provides the basis for calculating the spacing of sequences along the linear DNA molecule from interphase distance measurements. An interphase mapping strategy based on this model was tested with 13 probes from a 4-megabase pair (Mbp) region of chromosome 4 containing the Huntington disease locus. The results confirmed the locations of the probes and showed that the remaining gap in the published maps of this region is negligible in size. Interphase distance measurements should facilitate construction of chromosome maps with an average marker density of one per 100 kbp, approximately ten times greater than that achieved by hybridization to metaphase chromosomes.
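
    The interphase mapping strategy described above rests on a simple random-walk (ideal-chain) scaling; a minimal statement of the assumed relation, with the proportionality constant treated as an empirical calibration parameter rather than a value taken from the paper:

    ```latex
    % Random-walk scaling assumed for interphase chromatin in the ~100-2000 kbp regime:
    % the mean-square physical distance between two probes grows linearly with their
    % genomic separation L (in kbp), with an empirically calibrated constant c.
    \[
      \langle r^{2}(L) \rangle = c\,L ,
      \qquad\text{hence}\qquad
      \hat{L} = \frac{\overline{r^{2}}}{c},
    \]
    % where \overline{r^{2}} is the measured mean-square interphase distance for a
    % probe pair and \hat{L} is the inferred genomic separation used to order markers.
    ```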

  19. A biorthogonal decomposition for the identification and simulation of non-stationary and non-Gaussian random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zentner, I.; Ferré, G., E-mail: gregoire.ferre@ponts.org; Poirion, F.

    2016-06-01

    In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aiming at representing spatio-temporal stochastic fields. The proposed double expansion allows one to build the model even in the case of large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).

  20. Is the Non-Dipole Magnetic Field Random?

    NASA Technical Reports Server (NTRS)

    Walker, Andrew D.; Backus, George E.

    1996-01-01

    Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. In this paper, we examined the radial magnetic field B_r produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.

  1. On the generation of log-Lévy distributions and extreme randomness

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2011-10-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Lévy distributions. The log-Lévy distributions are the Lévy counterparts of the log-normal distribution, they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Lévy distributions emerge universally—the former in the case of deterministic underlying setting, and the latter in the case of stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot’s extreme randomness.

  2. The non-equilibrium allele frequency spectrum in a Poisson random field framework.

    PubMed

    Kaj, Ingemar; Mugal, Carina F

    2016-10-01

    In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Continuous Time Random Walks with memory and financial distributions

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Masoliver, Jaume

    2017-11-01

    We study financial distributions from the perspective of Continuous Time Random Walks with memory. We review some of our previous developments and apply them to financial problems. We also present some new models with memory that can be useful in characterizing tendency effects which are inherent in most markets. We also briefly study the effect on return distributions of fractional behaviors in the distribution of pausing times between successive transactions.

  4. Raney Distributions and Random Matrix Theory

    NASA Astrophysics Data System (ADS)

    Forrester, Peter J.; Liu, Dang-Zheng

    2015-03-01

    Recent works have shown that the family of probability distributions with moments given by the Fuss-Catalan numbers permit a simple parameterized form for their density. We extend this result to the Raney distribution which by definition has its moments given by a generalization of the Fuss-Catalan numbers. Such computations begin with an algebraic equation satisfied by the Stieltjes transform, which we show can be derived from the linear differential equation satisfied by the characteristic polynomial of random matrix realizations of the Raney distribution. For the Fuss-Catalan distribution, an equilibrium problem characterizing the density is identified. The Stieltjes transform for the limiting spectral density of the singular values squared of the matrix product formed from inverse standard Gaussian matrices, and standard Gaussian matrices, is shown to satisfy a variant of the algebraic equation relating to the Raney distribution. Supported on , we show that it too permits a simple functional form upon the introduction of an appropriate choice of parameterization. As an application, the leading asymptotic form of the density as the endpoints of the support are approached is computed, and is shown to have some universal features.

  5. Fatigue assessment of vibrating rail vehicle bogie components under non-Gaussian random excitations using power spectral densities

    NASA Astrophysics Data System (ADS)

    Wolfsteiner, Peter; Breuer, Werner

    2013-10-01

    The assessment of fatigue load under random vibrations is usually based on load spectra. Typically they are computed with counting methods (e.g. Rainflow) based on a time domain signal. Alternatively methods are available (e.g. Dirlik) enabling the estimation of load spectra directly from power spectral densities (PSDs) of the corresponding time signals; the knowledge of the time signal is then not necessary. These PSD based methods have the enormous advantage that if for example the signal to assess results from a finite element method based vibration analysis, the computation time of the simulation of PSDs in the frequency domain outmatches by far the simulation of time signals in the time domain. This is especially true for random vibrations with very long signals in the time domain. The disadvantage of the PSD based simulation of vibrations and also the PSD based load spectra estimation is their limitation to Gaussian distributed time signals. Deviations from this Gaussian distribution cause relevant deviations in the estimated load spectra. In these cases usually only computation time intensive time domain calculations produce accurate results. This paper presents a method dealing with non-Gaussian signals with real statistical properties that is still able to use the efficient PSD approach with its computation time advantages. Essentially it is based on a decomposition of the non-Gaussian signal in Gaussian distributed parts. The PSDs of these rearranged signals are then used to perform usual PSD analyses. In particular, detailed methods are described for the decomposition of time signals and the derivation of PSDs and cross power spectral densities (CPSDs) from multiple real measurements without using inaccurate standard procedures. Furthermore the basic intention is to design a general and integrated method that is not just able to analyse a certain single load case for a small time interval, but to generate representative PSD and CPSD spectra replacing

  6. Application of the LSQR algorithm in non-parametric estimation of aerosol size distribution

    NASA Astrophysics Data System (ADS)

    He, Zhenzong; Qi, Hong; Lew, Zhongyuan; Ruan, Liming; Tan, Heping; Luo, Kun

    2016-05-01

    Based on the Least Squares QR decomposition (LSQR) algorithm, the aerosol size distribution (ASD) is retrieved in a non-parametric approach. The direct problem is solved by the Anomalous Diffraction Approximation (ADA) and the Lambert-Beer Law. An optimal wavelength selection method is developed to improve the retrieval accuracy of the ASD. The optimal wavelength set is selected so that the measurement signals are sensitive to wavelength and the coefficient matrix of the linear system is less ill-conditioned, which enhances the robustness of the retrieval results to interference. Two common kinds of monomodal and bimodal ASDs, log-normal (L-N) and Gamma distributions, are estimated, respectively. Numerical tests show that the LSQR algorithm can be successfully applied to retrieve the ASD with high stability in the presence of random noise and low susceptibility to the shape of distributions. Finally, the experimentally measured ASD over Harbin, China, is recovered reasonably well. All the results confirm that the LSQR algorithm combined with the optimal wavelength selection method is an effective and reliable technique for non-parametric estimation of the ASD.
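
    A minimal sketch of the non-parametric retrieval step described above, using SciPy's LSQR on a toy discretized system: the smooth hypothetical kernel below stands in for the ADA kernel, and the log-normal test distribution, grids, damping, and noise level are all arbitrary choices for the illustration.

    ```python
    import numpy as np
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(4)

    # Discretize particle radii and measurement wavelengths (arbitrary grids).
    r = np.linspace(0.1, 2.0, 40)        # particle radius, micrometres
    lam = np.linspace(0.3, 1.8, 80)      # wavelength, micrometres

    # Toy smooth kernel K[i, j]: response of size r_j at wavelength lam_i
    # (a stand-in for the ADA extinction kernel used in the paper).
    K = np.exp(-(r[None, :] - lam[:, None]) ** 2) * (r[None, :] ** 2)

    # "True" log-normal size distribution and simulated noisy measurements.
    f_true = np.exp(-0.5 * ((np.log(r) - np.log(0.6)) / 0.35) ** 2)
    b = K @ f_true
    b_noisy = b * (1.0 + 0.02 * rng.standard_normal(b.shape))  # 2% random noise

    # LSQR with mild damping acts as a simple regularizer for this ill-posed system.
    f_est = lsqr(K, b_noisy, damp=1e-2)[0]

    rel_err = np.linalg.norm(f_est - f_true) / np.linalg.norm(f_true)
    print(f"relative retrieval error: {rel_err:.2%}")
    ```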

  7. Optimal sensor placement for leak location in water distribution networks using genetic algorithms.

    PubMed

    Casillas, Myrna V; Puig, Vicenç; Garza-Castañón, Luis E; Rosich, Albert

    2013-11-04

    This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed using a time horizon analysis, a distance-based scoring and considering different leaks sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach.

  8. Optimal Sensor Placement for Leak Location in Water Distribution Networks Using Genetic Algorithms

    PubMed Central

    Casillas, Myrna V.; Puig, Vicenç; Garza-Castañón, Luis E.; Rosich, Albert

    2013-01-01

    This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed using a time horizon analysis, a distance-based scoring and considering different leaks sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach. PMID:24193099

  9. Development of requirements for environmental specimen banking in ecological monitoring (exemplified by the Chernobyl NPP accident area).

    PubMed

    Borzilov, V A

    1993-11-01

    Development of requirements for a data bank for natural media as a system of intercorrelated parameters to estimate system states are determined. The problems of functional agreement between experimental and calculation methods are analysed when organizing the ecological monitoring. The methods of forming the environmental specimen bank to estimate and forecast radioactive contamination and exposure dose are considered to be exemplified by the peculiarities of the spatial distribution of radioactive contamination in fields. Analysed is the temporal dynamics of contamination for atmospheric air, soil and water.

  10. Spatial distribution of random velocity inhomogeneities in the western part of Nankai subduction zone

    NASA Astrophysics Data System (ADS)

    Takahashi, T.; Obana, K.; Yamamoto, Y.; Nakanishi, A.; Kodaira, S.; Kaneda, Y.

    2011-12-01

    In the Nankai trough, there are three seismogenic zones of megathrust earthquakes (Tokai, Tonankai and Nankai earthquakes). Lithospheric structures in and around these seismogenic zones are important for the studies on mutual interactions and synchronization of their fault ruptures. Recent studies on seismic wave scattering at high frequencies (>1Hz) make it possible to estimate 3D distributions of random inhomogeneities (or scattering coefficient) in the lithosphere, and clarified that random inhomogeneity is one of the important medium properties related to microseismicity and damaged structure near the fault zone [Asano & Hasegawa, 2004; Takahashi et al. 2009]. This study estimates the spatial distribution of the power spectral density function (PSDF) of random inhomogeneities the western part of Nankai subduction zone, and examines the relations with crustal velocity structure and seismic activity. Seismic waveform data used in this study are those recorded at seismic stations of Hi-net & F-net operated by NIED, and 160 ocean bottom seismographs (OBSs) deployed at Hyuga-nada region from Dec. 2008 to Jan. 2009. This OBS observation was conducted by JAMSTEC as a part of "Research concerning Interaction Between the Tokai, Tonankai and Nankai Earthquakes" funded by Ministry of Education, Culture, Sports, Science and Technology, Japan. Spatial distribution of random inhomogeneities is estimated by the inversion analysis of the peak delay time of small earthquakes [Takahashi et al. 2009], where the peak delay time is defined as the time lag from the S-wave onset to its maximal amplitude arrival. We assumed the von Karman type functional form for the PSDF. Peak delay times are measured from root mean squared envelopes at 4-8Hz, 8-16Hz and 16-32Hz. Inversion result can be summarized as follows. Random inhomogeneities beneath the Quaternary volcanoes are characterized by strong inhomogeneities at small spatial scale (~ a few hundreds meter) and weak spectral gradient

  11. Randomly displaced phase distribution design and its advantage in page-data recording of Fourier transform holograms.

    PubMed

    Emoto, Akira; Fukuda, Takashi

    2013-02-20

    For Fourier transform holography, an effective random phase distribution with randomly displaced phase segments is proposed for obtaining a smooth finite optical intensity distribution in the Fourier transform plane. Since unitary phase segments are randomly distributed in-plane, the blanks give various spatial frequency components to an image, and thus smooth the spectrum. Moreover, by randomly changing the phase segment size, spike generation from the unitary phase segment size in the spectrum can be reduced significantly. As a result, a smooth spectrum including sidebands can be formed at a relatively narrow extent. The proposed phase distribution sustains the primary functions of a random phase mask for holographic-data recording and reconstruction. Therefore, this distribution is expected to find applications in high-density holographic memory systems, replacing conventional random phase mask patterns.

  12. Acoustic Emission Source Location Using a Distributed Feedback Fiber Laser Rosette

    PubMed Central

    Huang, Wenzhu; Zhang, Wentao; Li, Fang

    2013-01-01

    This paper proposes an approach for acoustic emission (AE) source localization in a large marble stone using distributed feedback (DFB) fiber lasers. The aim of this study is to detect damage in structures such as those found in civil applications. The directional sensitivity of the DFB fiber laser is investigated by calculating a location coefficient using a method of digital signal analysis. In this, autocorrelation is used to extract the location coefficient from the periodic AE signal, and wavelet packet energy is calculated to obtain the location coefficient of a burst AE source. Normalization is applied to eliminate the influence of the distance and intensity of the AE source. Then a new location algorithm based on the location coefficient is presented and tested to determine the location of the AE source using a Delta (Δ) DFB fiber laser rosette configuration. The advantages of the proposed algorithm over traditional methods based on fiber Bragg gratings (FBGs) include higher strain resolution for AE detection and the ability to take two different types of AE source into account for location. PMID:24141266

  13. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  14. Role of non-traditional locations for seasonal flu vaccination: Empirical evidence and evaluation.

    PubMed

    Kim, Namhoon; Mountain, Travis P

    2017-05-19

    This study investigated the role of non-traditional locations in the decision to vaccinate for seasonal flu. We measured individuals' preferred location for seasonal flu vaccination by examining the National H1N1 Flu Survey (NHFS) conducted from late 2009 to early 2010. Our econometric model estimated the probabilities of possible choices by varying individual characteristics, and predicted the way in which the probabilities are expected to change given the specific covariates of interest. From this estimation, we observed that non-traditional locations significantly influenced the vaccination of certain individuals, such as those who are high-income, educated, White, employed, and living in a metropolitan statistical area (MSA), by increasing the coverage. Thus, based on the empirical evidence, our study suggested that supporting non-traditional locations for vaccination could be effective in increasing vaccination coverage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Scalable randomized benchmarking of non-Clifford gates

    NASA Astrophysics Data System (ADS)

    Cross, Andrew; Magesan, Easwar; Bishop, Lev; Smolin, John; Gambetta, Jay

    Randomized benchmarking is a widely used experimental technique to characterize the average error of quantum operations. Benchmarking procedures that scale to enable characterization of n-qubit circuits rely on efficient procedures for manipulating those circuits and, as such, have been limited to subgroups of the Clifford group. However, universal quantum computers require additional, non-Clifford gates to approximate arbitrary unitary transformations. We define a scalable randomized benchmarking procedure over n-qubit unitary matrices that correspond to protected non-Clifford gates for a class of stabilizer codes. We present efficient methods for representing and composing group elements, sampling them uniformly, and synthesizing corresponding poly (n) -sized circuits. The procedure provides experimental access to two independent parameters that together characterize the average gate fidelity of a group element. We acknowledge support from ARO under Contract W911NF-14-1-0124.

  16. On the minimum of independent geometrically distributed random variables

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Leemis, Lawrence M.; Nicol, David

    1994-01-01

    The expectations E(X_(1)), E(Z_(1)), and E(Y_(1)) of the minimum of n independent geometric, modified geometric, or exponential random variables with matching expectations differ. We show how this is accounted for by stochastic variability and how E(X_(1))/E(Y_(1)) equals the expected number of ties at the minimum for the geometric random variables. We then introduce the 'shifted geometric distribution' and show that there is a unique value of the shift for which the individual shifted geometric and exponential random variables match expectations both individually and in the minimums.
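
    A short derivation making the quoted ratio and tie count explicit, under the convention that each geometric variable counts trials (support {1, 2, ...}, success probability p, q = 1 - p) and that the matching exponential has rate p so that individual expectations agree:

    ```latex
    % Minimum of n i.i.d. geometrics on {1,2,...}:
    \[
      P\bigl(X_{(1)} > m\bigr) = q^{\,nm}
      \;\Longrightarrow\;
      E\bigl(X_{(1)}\bigr) = \sum_{m \ge 0} q^{\,nm} = \frac{1}{1 - q^{\,n}} .
    \]
    % Minimum of n i.i.d. exponentials with matching mean 1/p (rate p):
    \[
      E\bigl(Y_{(1)}\bigr) = \frac{1}{np}
      \qquad\Longrightarrow\qquad
      \frac{E\bigl(X_{(1)}\bigr)}{E\bigl(Y_{(1)}\bigr)} = \frac{np}{1 - q^{\,n}} .
    \]
    % Expected number of ties at the geometric minimum:
    \[
      E\,\#\{i : X_i = X_{(1)}\}
      = n \sum_{m \ge 1} p\,q^{\,m-1}\bigl(q^{\,m-1}\bigr)^{n-1}
      = \frac{np}{1 - q^{\,n}} ,
    \]
    % which reproduces the ratio quoted in the abstract.
    ```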

  17. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  18. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    PubMed

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they only adopt a simple strategy, that is, transforming the multi-location proteins to multiple proteins with a single location, which does not take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross-validation test on a benchmark dataset, we demonstrate that our proposed method, which takes label correlations into consideration, clearly outperforms the baseline BR method that ignores them, indicating that correlations among different subcellular locations really exist and contribute to improved prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public usage.

  19. Disparities in the Population Distribution of African American and Non-Hispanic White Smokers along the Quitting Continuum

    ERIC Educational Resources Information Center

    Trinidad, Dennis R.; Xie, Bin; Fagan, Pebbles; Pulvers, Kim; Romero, Devan R.; Blanco, Lyzette; Sakuma, Kari-Lyn K.

    2015-01-01

    Purpose: To examine disparities and changes over time in the population-level distribution of smokers along a cigarette quitting continuum among African American smokers compared with non-Hispanic Whites. Methods: Secondary data analyses of the 1999, 2002, 2005, and 2008 California Tobacco Surveys (CTS). The CTS are large, random-digit-dialed,…

  20. Micromechanical analysis of composites with fibers distributed randomly over the transverse cross-section

    NASA Astrophysics Data System (ADS)

    Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo

    2018-06-01

    A new method to generate the random distribution of fibers in the transverse cross-section of fiber reinforced composites with high fiber volume fraction is presented in this paper. Based on microscopy observation of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial configuration and each fiber is given an arbitrary initial velocity in an arbitrary direction; the micro-scale representative volume element (RVE) is then established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of composite materials can be predicted. The predicted properties show reasonable agreement with experimental results. By comparing the stress fields of the RVE with randomly distributed fibers and the RVE with periodically distributed fibers, the predicted elastic modulus of the RVE with randomly distributed fibers is found to be greater than that of the RVE with periodically distributed fibers.

  1. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^{-ψ(k)}, where k ≡ S/ln(L/L_0), the large deviation function ψ(k) is found explicitly, and L_0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  2. Random forests in non-invasive sensorimotor rhythm brain-computer interfaces: a practical and convenient non-linear classifier.

    PubMed

    Steyrl, David; Scherer, Reinhold; Faller, Josef; Müller-Putz, Gernot R

    2016-02-01

    There is general agreement in the brain-computer interface (BCI) community that although non-linear classifiers can provide better results in some cases, linear classifiers are preferable, particularly because non-linear classifiers often involve a number of parameters that must be carefully chosen. However, new non-linear classifiers were developed over the last decade. One of them is the random forest (RF) classifier. Although popular in other fields of science, RFs are not common in BCI research. In this work, we address three open questions regarding RFs in sensorimotor rhythm (SMR) BCIs: parametrization, online applicability, and performance compared to regularized linear discriminant analysis (LDA). We found that the performance of RF is constant over a large range of parameter values. We demonstrate, for the first time, that RFs are applicable online in SMR-BCIs. Further, we show in an offline BCI simulation that RFs statistically significantly outperform regularized LDA by about 3%. These results confirm that RFs are practical and convenient non-linear classifiers for SMR-BCIs. Taking into account further properties of RFs, such as independence from feature distributions, maximum margin behavior, multiclass and advanced data mining capabilities, we argue that RFs should be taken into consideration for future BCIs.
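
    A minimal sketch of the kind of offline comparison described above, using scikit-learn's random forest and a shrinkage-regularized LDA; the two-class "band-power-like" features below are purely synthetic stand-ins for SMR features, and the parameter values are arbitrary.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)

    # Synthetic stand-in for log band-power features from two motor-imagery classes.
    n_per_class, n_features = 100, 12
    X0 = rng.normal(0.0, 1.0, size=(n_per_class, n_features))
    X1 = rng.normal(0.4, 1.2, size=(n_per_class, n_features))  # shifted/rescaled class
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(n_per_class), np.ones(n_per_class)]

    # Random forest: the abstract reports stable performance over a wide parameter
    # range, so default-like settings are used here.
    rf = RandomForestClassifier(n_estimators=300, random_state=0)

    # Regularized (shrinkage) LDA baseline.
    lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

    for name, clf in [("random forest", rf), ("shrinkage LDA", lda)]:
        acc = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean CV accuracy {acc.mean():.3f}")
    ```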

  3. Does prism width from the shell prismatic layer have a random distribution?

    NASA Astrophysics Data System (ADS)

    Vancolen, Séverine; Verrecchia, Eric

    2008-10-01

    A study of the distribution of the prism width inside the prismatic layer of Unio tumidus (Philipsson 1788, Diss Hist-Nat, Berling, Lundæ) from Lake Neuchâtel, Switzerland, has been conducted in order to determine whether or not this distribution is random. Measurements of 954 to 1,343 prism widths (depending on shell sample) have been made using a scanning electron microscope in backscattered electron mode. A white noise test has been applied to the distribution of prism sizes (i.e. width). It shows that there is no temporal cycle that could potentially influence their formation and growth. These results suggest that prism widths are randomly distributed, and related neither to external rings nor to environmental constraints.

  4. Drop Spreading with Random Viscosity

    NASA Astrophysics Data System (ADS)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show, for example, that the variability in the drop location is a non-monotonic function of the solute correlation length. Engineering and Physical Sciences Research Council.
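
    A small sketch of the uncertainty-input step described above: drawing a one-dimensional Gaussian random field with a prescribed correlation length (via a Cholesky factorization of an exponential covariance), which could then serve as the random solute/viscosity profile in a Monte Carlo run. The grid, correlation length, variance, and linear viscosity law are arbitrary choices, not values from the study.

    ```python
    import numpy as np

    def gaussian_random_field_1d(x, corr_length, sigma=1.0, rng=None):
        """Sample a zero-mean Gaussian field with exponential covariance on grid x."""
        rng = np.random.default_rng() if rng is None else rng
        cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)
        # A small jitter keeps the Cholesky factorization numerically stable.
        L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))
        return L @ rng.standard_normal(len(x))

    x = np.linspace(0.0, 10.0, 400)
    rng = np.random.default_rng(6)

    # One random solute-concentration realization, mapped linearly to viscosity
    # (mirroring the linear concentration-viscosity dependence assumed in the model).
    c = gaussian_random_field_1d(x, corr_length=1.5, sigma=0.2, rng=rng)
    mu = 1.0 * (1.0 + c)

    print("viscosity range over the film:", mu.min(), "to", mu.max())
    ```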

  5. Random distributed feedback fiber laser at 2.1  μm.

    PubMed

    Jin, Xiaoxi; Lou, Zhaokai; Zhang, Hanwei; Xu, Jiangming; Zhou, Pu; Liu, Zejin

    2016-11-01

    We demonstrate a random distributed feedback fiber laser at 2.1 μm. A high-power pulsed Tm-doped fiber laser operating at 1.94 μm with a temporal duty ratio of 30% was employed as the pump laser to increase the equivalent incident pump power. A piece of 150 m highly GeO2-doped silica fiber that provides strong Raman gain and random distributed feedback was used as the gain medium. The maximum output power reached 0.5 W with an optical efficiency of 9%, which could be further improved with more pump power and an optimized fiber length. To the best of our knowledge, this is the first demonstration of a random distributed feedback fiber laser in the 2 μm band based on Raman gain.

  6. Random attractor of non-autonomous stochastic Boussinesq lattice system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Min, E-mail: zhaomin1223@126.com; Zhou, Shengfan, E-mail: zhoushengfan@yahoo.com

    2015-09-15

    In this paper, we first consider the existence of a tempered random attractor for a second-order non-autonomous stochastic lattice dynamical system of nonlinear Boussinesq equations affected by time-dependent coupling coefficients, deterministic forces, and multiplicative white noise. Then, we establish the upper semicontinuity of random attractors as the intensity of the noise approaches zero.

  7. Discovering non-random segregation of sister chromatids: the naïve treatment of a premature discovery

    PubMed Central

    Lark, Karl G.

    2013-01-01

    The discovery of non-random chromosome segregation (Figure 1) is discussed from the perspective of what was known in 1965 and 1966. The distinction between daughter, parent, or grandparent strands of DNA was developed in a bacterial system and led to the discovery that multiple copies of DNA elements of bacteria are not distributed randomly with respect to the age of the template strand. Experiments with higher eukaryotic cells demonstrated that during mitosis Mendel’s laws were violated; and the initial serendipitous choice of eukaryotic cell system led to the striking example of non-random segregation of parent and grandparent DNA template strands in primary cultures of cells derived from mouse embryos. Attempts to extrapolate these findings to established tissue culture lines demonstrated that the property could be lost. Experiments using plant root tips demonstrated that the phenomenon exists in plants and that it was, at some level, under genetic control. Despite publication in major journals and symposia (Lark et al., 1966, 1967; Lark, 1967, 1969a,b,c) the potential implications of these findings were ignored for several decades. Here we explore possible reasons for the pre-maturity (Stent, 1972) of this discovery. PMID:23378946

  8. Estimation of distributed Fermat-point location for wireless sensor networking.

    PubMed

    Huang, Po-Hsian; Chen, Jiann-Liang; Larosa, Yanuarius Teofilus; Chiang, Tsui-Lien

    2011-01-01

    This work presents a localization scheme for use in wireless sensor networks (WSNs) that is based on a proposed connectivity-based RF localization strategy called the distributed Fermat-point location estimation algorithm (DFPLE). DFPLE estimates location from the triangular area formed by the intersections of three neighboring beacon nodes. The Fermat point is determined as the point that minimizes the total distance to the three vertices of the triangle. The estimated location area is then refined using the Fermat point to minimize the error in estimating sensor node locations. DFPLE solves the problems of large errors and poor performance encountered by localization schemes that are based on a bounding box algorithm. Performance analysis of a 200-node development environment reveals that, when the number of sensor nodes is below 150, the mean error decreases rapidly as the node density increases, and when the number of sensor nodes exceeds 170, the mean error remains below 1% as the node density increases. Second, when the number of beacon nodes is less than 60, normal nodes lack sufficient beacon nodes to enable their locations to be estimated. However, the mean error changes only slightly as the number of beacon nodes increases above 60. Simulation results revealed that the proposed algorithm for estimating sensor positions is more accurate than existing algorithms, and improves upon conventional bounding box strategies.
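
    The geometric primitive behind such a scheme is the Fermat point of the beacon triangle. The sketch below computes it by Weiszfeld iteration (the geometric median coincides with the Fermat point when all triangle angles are below 120 degrees); it is an illustration of that primitive, not the full DFPLE algorithm:

    ```python
    import numpy as np

    def fermat_point(vertices, iters=200, eps=1e-9):
        """Weiszfeld iteration for the point minimizing total distance to the vertices."""
        p = vertices.mean(axis=0)                  # start from the centroid
        for _ in range(iters):
            d = np.maximum(np.linalg.norm(vertices - p, axis=1), eps)  # avoid division by zero
            w = 1.0 / d
            p = (w[:, None] * vertices).sum(axis=0) / w.sum()
        return p

    beacons = np.array([[0.0, 0.0], [4.0, 0.0], [1.0, 3.0]])   # hypothetical beacon positions
    print("estimated Fermat point:", fermat_point(beacons))
    ```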

  9. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were obtained. Their one-dimensional counterparts are widely used in probability theory. Generating functions of these multidimensional distributions were also derived.

  10. Exact posterior computation in non-conjugate Gaussian location-scale parameters models

    NASA Astrophysics Data System (ADS)

    Andrade, J. A. A.; Rathie, P. N.

    2017-12-01

    In Bayesian analysis the class of conjugate models allows exact posterior distributions to be obtained; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application quite time demanding: for example, with heavy-tailed distributions convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required to choose efficient candidate generator distributions. In this work, we draw attention to special functions as tools for Bayesian computation and propose an alternative method for obtaining the posterior distribution in Gaussian non-conjugate models in an exact form. We use complex integration methods based on the H-function in order to obtain the posterior distribution and some of its posterior quantities in an explicitly computable form. Two examples are provided in order to illustrate the theory.

  11. Distributed Synchronization in Networks of Agent Systems With Nonlinearities and Random Switchings.

    PubMed

    Tang, Yang; Gao, Huijun; Zou, Wei; Kurths, Jürgen

    2013-02-01

    In this paper, the distributed synchronization problem of networks of agent systems with controllers and nonlinearities subject to Bernoulli switchings is investigated. Controllers and adaptive updating laws injected in each vertex of networks depend on the state information of its neighborhood. Three sets of Bernoulli stochastic variables are introduced to describe the occurrence probabilities of distributed adaptive controllers, updating laws and nonlinearities, respectively. By the Lyapunov functions method, we show that the distributed synchronization of networks composed of agent systems with multiple randomly occurring nonlinearities, multiple randomly occurring controllers, and multiple randomly occurring updating laws can be achieved in mean square under certain criteria. The conditions derived in this paper can be solved by semi-definite programming. Moreover, by mathematical analysis, we find that the coupling strength, the probabilities of the Bernoulli stochastic variables, and the form of nonlinearities have great impacts on the convergence speed and the terminal control strength. The synchronization criteria and the observed phenomena are demonstrated by several numerical simulation examples. In addition, the advantage of distributed adaptive controllers over conventional adaptive controllers is illustrated.

  12. Survival distributions impact the power of randomized placebo-phase design and parallel groups randomized clinical trials.

    PubMed

    Abrahamyan, Lusine; Li, Chuan Silvia; Beyene, Joseph; Willan, Andrew R; Feldman, Brian M

    2011-03-01

    The study evaluated the power of the randomized placebo-phase design (RPPD)-a new design of randomized clinical trials (RCTs), compared with the traditional parallel groups design, assuming various response time distributions. In the RPPD, at some point, all subjects receive the experimental therapy, and the exposure to placebo is for only a short fixed period of time. For the study, an object-oriented simulation program was written in R. The power of the simulated trials was evaluated using six scenarios, where the treatment response times followed the exponential, Weibull, or lognormal distributions. The median response time was assumed to be 355 days for the placebo and 42 days for the experimental drug. Based on the simulation results, the sample size requirements to achieve the same level of power were different under different response time to treatment distributions. The scenario where the response times followed the exponential distribution had the highest sample size requirement. In most scenarios, the parallel groups RCT had higher power compared with the RPPD. The sample size requirement varies depending on the underlying hazard distribution. The RPPD requires more subjects to achieve a similar power to the parallel groups design. Copyright © 2011 Elsevier Inc. All rights reserved.
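
    A hedged sketch of one building block of such a simulation: drawing response times from the three distribution families, each calibrated to a given median (the Weibull shape and lognormal log-scale SD below are assumed, not taken from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_times(median, dist, n):
        """Draw response times with the requested median from an assumed family."""
        if dist == "exponential":
            return rng.exponential(scale=median / np.log(2), size=n)
        if dist == "weibull":
            shape = 1.5                                      # assumed shape parameter
            return (median / np.log(2) ** (1 / shape)) * rng.weibull(shape, size=n)
        if dist == "lognormal":
            return rng.lognormal(mean=np.log(median), sigma=0.8, size=n)  # assumed sigma
        raise ValueError(dist)

    for dist in ("exponential", "weibull", "lognormal"):
        t = sample_times(42.0, dist, 10_000)                 # experimental-drug median of 42 days
        print(dist, "empirical median:", round(float(np.median(t)), 1))
    ```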

  13. A Distribution-class Locational Marginal Price (DLMP) Index for Enhanced Distribution Systems

    NASA Astrophysics Data System (ADS)

    Akinbode, Oluwaseyi Wemimo

    The smart grid initiative is the impetus behind changes that are expected to culminate into an enhanced distribution system with the communication and control infrastructure to support advanced distribution system applications and resources such as distributed generation, energy storage systems, and price responsive loads. This research proposes a distribution-class analog of the transmission LMP (DLMP) as an enabler of the advanced applications of the enhanced distribution system. The DLMP is envisioned as a control signal that can incentivize distribution system resources to behave optimally in a manner that benefits economic efficiency and system reliability and that can optimally couple the transmission and the distribution systems. The DLMP is calculated from a two-stage optimization problem; a transmission system OPF and a distribution system OPF. An iterative framework that ensures accurate representation of the distribution system's price sensitive resources for the transmission system problem and vice versa is developed and its convergence problem is discussed. As part of the DLMP calculation framework, a DCOPF formulation that endogenously captures the effect of real power losses is discussed. The formulation uses piecewise linear functions to approximate losses. This thesis explores, with theoretical proofs, the breakdown of the loss approximation technique when non-positive DLMPs/LMPs occur and discusses a mixed integer linear programming formulation that corrects the breakdown. The DLMP is numerically illustrated in traditional and enhanced distribution systems and its superiority to contemporary pricing mechanisms is demonstrated using price responsive loads. Results show that the impact of the inaccuracy of contemporary pricing schemes becomes significant as flexible resources increase. At high elasticity, aggregate load consumption deviated from the optimal consumption by up to about 45 percent when using a flat or time-of-use rate. Individual load

  14. Mean first-passage times of non-Markovian random walkers in confinement.

    PubMed

    Guérin, T; Levernier, N; Bénichou, O; Voituriez, R

    2016-06-16

    The first-passage time, defined as the time a random walker takes to reach a target point in a confining domain, is a key quantity in the theory of stochastic processes. Its importance comes from its crucial role in quantifying the efficiency of processes as varied as diffusion-limited reactions, target search processes or the spread of diseases. Most methods of determining the properties of first-passage time in confined domains have been limited to Markovian (memoryless) processes. However, as soon as the random walker interacts with its environment, memory effects cannot be neglected: that is, the future motion of the random walker does not depend only on its current position, but also on its past trajectory. Examples of non-Markovian dynamics include single-file diffusion in narrow channels, or the motion of a tracer particle either attached to a polymeric chain or diffusing in simple or complex fluids such as nematics, dense soft colloids or viscoelastic solutions. Here we introduce an analytical approach to calculate, in the limit of a large confining volume, the mean first-passage time of a Gaussian non-Markovian random walker to a target. The non-Markovian features of the dynamics are encompassed by determining the statistical properties of the fictitious trajectory that the random walker would follow after the first-passage event takes place, which are shown to govern the first-passage time kinetics. This analysis is applicable to a broad range of stochastic processes, which may be correlated at long times. Our theoretical predictions are confirmed by numerical simulations for several examples of non-Markovian processes, including the case of fractional Brownian motion in one and higher dimensions. These results reveal, on the basis of Gaussian processes, the importance of memory effects in first-passage statistics of non-Markovian random walkers in confinement.

  15. Mean first-passage times of non-Markovian random walkers in confinement

    NASA Astrophysics Data System (ADS)

    Guérin, T.; Levernier, N.; Bénichou, O.; Voituriez, R.

    2016-06-01

    The first-passage time, defined as the time a random walker takes to reach a target point in a confining domain, is a key quantity in the theory of stochastic processes. Its importance comes from its crucial role in quantifying the efficiency of processes as varied as diffusion-limited reactions, target search processes or the spread of diseases. Most methods of determining the properties of first-passage time in confined domains have been limited to Markovian (memoryless) processes. However, as soon as the random walker interacts with its environment, memory effects cannot be neglected: that is, the future motion of the random walker does not depend only on its current position, but also on its past trajectory. Examples of non-Markovian dynamics include single-file diffusion in narrow channels, or the motion of a tracer particle either attached to a polymeric chain or diffusing in simple or complex fluids such as nematics, dense soft colloids or viscoelastic solutions. Here we introduce an analytical approach to calculate, in the limit of a large confining volume, the mean first-passage time of a Gaussian non-Markovian random walker to a target. The non-Markovian features of the dynamics are encompassed by determining the statistical properties of the fictitious trajectory that the random walker would follow after the first-passage event takes place, which are shown to govern the first-passage time kinetics. This analysis is applicable to a broad range of stochastic processes, which may be correlated at long times. Our theoretical predictions are confirmed by numerical simulations for several examples of non-Markovian processes, including the case of fractional Brownian motion in one and higher dimensions. These results reveal, on the basis of Gaussian processes, the importance of memory effects in first-passage statistics of non-Markovian random walkers in confinement.

  16. Distribution of randomly diffusing particles in inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Li, Yiwei; Kahraman, Osman; Haselwandter, Christoph A.

    2017-09-01

    Diffusion can be conceptualized, at microscopic scales, as the random hopping of particles between neighboring lattice sites. In the case of diffusion in inhomogeneous media, distinct spatial domains in the system may yield distinct particle hopping rates. Starting from the master equations (MEs) governing diffusion in inhomogeneous media we derive here, for arbitrary spatial dimensions, the deterministic lattice equations (DLEs) specifying the average particle number at each lattice site for randomly diffusing particles in inhomogeneous media. We consider the case of free (Fickian) diffusion with no steric constraints on the maximum particle number per lattice site as well as the case of diffusion under steric constraints imposing a maximum particle concentration. We find, for both transient and asymptotic regimes, excellent agreement between the DLEs and kinetic Monte Carlo simulations of the MEs. The DLEs provide a computationally efficient method for predicting the (average) distribution of randomly diffusing particles in inhomogeneous media, with the number of DLEs associated with a given system being independent of the number of particles in the system. From the DLEs we obtain general analytic expressions for the steady-state particle distributions for free diffusion and, in special cases, diffusion under steric constraints in inhomogeneous media. We find that, in the steady state of the system, the average fraction of particles in a given domain is independent of most system properties, such as the arrangement and shape of domains, and only depends on the number of lattice sites in each domain, the particle hopping rates, the number of distinct particle species in the system, and the total number of particles of each particle species in the system. Our results provide general insights into the role of spatially inhomogeneous particle hopping rates in setting the particle distributions in inhomogeneous media.
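
    A toy version of the setting above, assuming non-interacting (free) particles hopping on a one-dimensional periodic lattice with two domains of different hopping rates; under free diffusion the steady-state concentration scales inversely with the local hopping rate, so the slow domain holds proportionally more particles:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sites, n_particles, n_sweeps = 40, 2000, 5000
    rates = np.where(np.arange(n_sites) < 20, 1.0, 0.25)   # fast domain (left), slow domain (right)
    pos = rng.integers(0, n_sites, size=n_particles)

    for _ in range(n_sweeps):
        hop = rng.random(n_particles) < rates[pos]         # hop attempt succeeds at the local rate
        step = rng.choice([-1, 1], size=n_particles)
        pos = np.where(hop, (pos + step) % n_sites, pos)

    # Expected fraction in the slow domain: (20/0.25) / (20/0.25 + 20/1.0) = 0.8
    print("fraction in slow domain:", round(float(np.mean(pos >= 20)), 3))
    ```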

  17. A Pearson Random Walk with Steps of Uniform Orientation and Dirichlet Distributed Lengths

    NASA Astrophysics Data System (ADS)

    Le Caër, Gérard

    2010-08-01

    A constrained diffusive random walk of n steps in ℝ d and a random flight in ℝ d , which are equivalent, were investigated independently in recent papers (J. Stat. Phys. 127:813, 2007; J. Theor. Probab. 20:769, 2007, and J. Stat. Phys. 131:1039, 2008). The n steps of the walk are independent and identically distributed random vectors of exponential length and uniform orientation. Conditioned on the sum of their lengths being equal to a given value l, closed-form expressions for the distribution of the endpoint of the walk were obtained altogether for any n for d=1,2,4. Uniform distributions of the endpoint inside a ball of radius l were evidenced for a walk of three steps in 2D and of two steps in 4D. The previous walk is generalized by considering step lengths which have independent and identical gamma distributions with a shape parameter q>0. Given the total walk length being equal to 1, the step lengths have a Dirichlet distribution whose parameters are all equal to q. The walk and the flight above correspond to q=1. Simple analytical expressions are obtained for any d≥2 and n≥2 for the endpoint distributions of two families of walks whose q are integers or half-integers which depend solely on d. These endpoint distributions have a simple geometrical interpretation. Expressed for a two-step planar walk whose q=1, it means that the distribution of the endpoint on a disc of radius 1 is identical to the distribution of the projection on the disc of a point M uniformly distributed over the surface of the 3D unit sphere. Five additional walks, with a uniform distribution of the endpoint in the inside of a ball, are found from known finite integrals of products of powers and Bessel functions of the first kind. They include four different walks in ℝ3, two of two steps and two of three steps, and one walk of two steps in ℝ4. Pearson-Liouville random walks, obtained by distributing the total lengths of the previous Pearson-Dirichlet walks according to some

  18. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

    A comparative study of two azimuth-based non-standard location methods. R. Jih, U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington, DC. The so-called “Yin Zhong Xian” (“引中线” in Chinese) algorithm, hereafter the YZX method, is an Oriental version of an IPB-based procedure.

  19. Individual complex Dirac eigenvalue distributions from random matrix theory and comparison to quenched lattice QCD with a quark chemical potential.

    PubMed

    Akemann, G; Bloch, J; Shifrin, L; Wettig, T

    2008-01-25

    We analyze how individual eigenvalues of the QCD Dirac operator at nonzero quark chemical potential are distributed in the complex plane. Exact and approximate analytical results for both quenched and unquenched distributions are derived from non-Hermitian random matrix theory. When comparing these to quenched lattice QCD spectra close to the origin, excellent agreement is found for zero and nonzero topology at several values of the quark chemical potential. Our analytical results are also applicable to other physical systems in the same symmetry class.

  20. Distribution of Schmidt-like eigenvalues for Gaussian ensembles of the random matrix theory

    NASA Astrophysics Data System (ADS)

    Pato, Mauricio P.; Oshanin, Gleb

    2013-03-01

    We study the probability distribution function P_n^(β)(w) of the Schmidt-like random variable w = x_1²/((1/n)∑_{j=1}^n x_j²), where the x_j (j = 1, 2, …, n) are unordered eigenvalues of a given n × n β-Gaussian random matrix, β being the Dyson symmetry index. This variable, by definition, can be considered as a measure of how any individual (randomly chosen) eigenvalue deviates from the arithmetic mean value of all eigenvalues of a given random matrix, and its distribution is calculated with respect to the ensemble of such β-Gaussian random matrices. We show that in the asymptotic limit n → ∞ and for arbitrary β the distribution P_n^(β)(w) converges to the Marčenko-Pastur form, i.e. P_n^(β)(w) ∼ √((4 − w)/w) for w ∈ [0, 4] and equals zero outside of the support, despite the fact that formally w is defined on the interval [0, n]. Furthermore, for Gaussian unitary ensembles (β = 2) we present exact explicit expressions for P_n^(β=2)(w) which are valid for arbitrary n and analyse their behaviour.
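
    A quick numerical check of the quoted limit, sampling w from moderately sized GUE matrices (the 1/(2π) normalization below is assumed so that the limiting density integrates to one over [0, 4]):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, trials, ws = 50, 2000, []
    for _ in range(trials):
        A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        H = (A + A.conj().T) / 2                       # GUE-type Hermitian matrix
        x = np.linalg.eigvalsh(H)
        j = rng.integers(n)                            # a randomly chosen (unordered) eigenvalue
        ws.append(x[j] ** 2 / np.mean(x ** 2))         # w = x_j^2 / ((1/n) sum_k x_k^2)

    hist, edges = np.histogram(ws, bins=20, range=(0, 4), density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    mp = np.sqrt((4 - mid) / mid) / (2 * np.pi)        # Marcenko-Pastur-type limit
    print(np.round(hist[:5], 3), np.round(mp[:5], 3))  # empirical vs limiting density, first bins
    ```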

  1. Selecting Random Distributed Elements for HIFU using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Yufeng

    2011-09-01

    As an effective and noninvasive therapeutic modality for tumor treatment, high-intensity focused ultrasound (HIFU) has attracted attention from both physicians and patients. New generations of HIFU systems with the ability to electrically steer the HIFU focus using phased array transducers have been under development. The presence of side and grating lobes may cause undesired thermal accumulation at the interface of the coupling medium (i.e. water) and skin, or in the intervening tissue. Although sparse randomly distributed piston elements could reduce the amplitude of grating lobes, there are theoretically no grating lobes with the use of concave elements in the new phased array HIFU. A new HIFU transmission strategy is proposed in this study, firing a number of but not all elements for a certain period and then changing to another group for the next firing sequence. The advantages are: 1) the asymmetric position of active elements may reduce the side lobes, and 2) each element has some resting time during the entire HIFU ablation (up to several hours for some clinical applications) so that the decreasing efficiency of the transducer due to thermal accumulation is minimized. Genetic algorithm was used for selecting randomly distributed elements in a HIFU array. Amplitudes of the first side lobes at the focal plane were used as the fitness value in the optimization. Overall, it is suggested that the proposed new strategy could reduce the side lobe and the consequent side-effects, and the genetic algorithm is effective in selecting those randomly distributed elements in a HIFU array.

  2. Locational Marginal Pricing in the Campus Power System at the Power Distribution Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, Jun; Gu, Yi; Zhang, Yingchen

    2016-11-14

    In the development of the smart grid at the distribution level, the realization of real-time nodal pricing is one of the key challenges. The research work in this paper implements and studies the methodology of locational marginal pricing at the distribution level based on a real-world distribution power system. The pricing mechanism utilizes optimal power flow to calculate the corresponding distribution nodal prices. Both Direct Current Optimal Power Flow and Alternating Current Optimal Power Flow are utilized to calculate and analyze the nodal prices. The University of Denver campus power grid is used as the power distribution system test bed to demonstrate the pricing methodology.

  3. Location priority for non-formal early childhood education school based on promethee method and map visualization

    NASA Astrophysics Data System (ADS)

    Ayu Nurul Handayani, Hemas; Waspada, Indra

    2018-05-01

    Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the District Government of Banyumas and supported by Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program for extending ECE to all villages in Indonesia; however, the locations at which ECE schools will be constructed in the coming years have not yet been determined. To support that program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected with Brown's Double Exponential Smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate the priority order. The system presents its recommendations as a map visualization colored according to the priority level of each sub-district and village. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm worked properly, and users were satisfied.

  4. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods

    PubMed Central

    Shara, Nawar; Yassin, Sayf A.; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V.; Wang, Wenyu; Lee, Elisa T.; Umans, Jason G.

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989–1991), 2 (1993–1995), and 3 (1998–1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results. PMID:26414328

  5. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
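
    An illustrative, entirely synthetic contrast between listwise deletion and a regression-based imputer when renal-function data are missing not at random; scikit-learn's IterativeImputer is used only as a convenient stand-in for multiple imputation, and the pattern-mixture approach found best in the study is not shown:

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    n = 2000
    egfr1 = rng.normal(90, 15, n)                      # synthetic renal function, exam 1
    egfr2 = egfr1 - rng.normal(5, 8, n)                # exam 2 correlated with exam 1
    p_miss = 1 / (1 + np.exp((egfr2 - 70) / 5))        # worse values more likely missing (MNAR)
    egfr2_obs = np.where(rng.random(n) < p_miss, np.nan, egfr2)

    print("true mean, exam 2:        ", round(float(egfr2.mean()), 1))
    print("listwise-deletion mean:   ", round(float(np.nanmean(egfr2_obs)), 1))
    imputed = IterativeImputer(random_state=0).fit_transform(np.column_stack([egfr1, egfr2_obs]))
    print("iterative-imputation mean:", round(float(imputed[:, 1].mean()), 1))
    ```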

  6. Football fever: goal distributions and non-Gaussian statistics

    NASA Astrophysics Data System (ADS)

    Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.

    2009-02-01

    Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopical point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows us to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
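
    A small sketch of the key statistical point, on synthetic overdispersed "goals per match" counts rather than the historical data: a negative binomial with method-of-moments parameters tracks an overdispersed sample more closely than a Poisson with the same mean.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    goals = rng.negative_binomial(n=4, p=0.7, size=5000)   # synthetic overdispersed counts

    m, v = goals.mean(), goals.var()
    print("mean:", round(float(m), 2), " variance:", round(float(v), 2))  # variance > mean

    r, p = m ** 2 / (v - m), m / v                         # method-of-moments NB parameters
    ks_pois = stats.kstest(goals, stats.poisson(m).cdf).statistic
    ks_nb = stats.kstest(goals, stats.nbinom(r, p).cdf).statistic
    print("KS distance  Poisson:", round(float(ks_pois), 3), " neg. binomial:", round(float(ks_nb), 3))
    ```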

  7. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…

  8. Viscoelastic flow past mono- and bidisperse random arrays of cylinders: flow resistance, topology and normal stress distribution.

    PubMed

    De, S; Kuipers, J A M; Peters, E A J F; Padding, J T

    2017-12-13

    We investigate creeping viscoelastic fluid flow through two-dimensional porous media consisting of random arrangements of monodisperse and bidisperse cylinders, using our finite volume-immersed boundary method introduced in S. De, et al., J. Non-Newtonian Fluid Mech., 2016, 232, 67-76. The viscoelastic fluid is modeled with a FENE-P model. The simulations show an increased flow resistance with increase in flow rate, even though the bulk response of the fluid to shear flow is shear thinning. We show that if the square root of the permeability is chosen as the characteristic length scale in the determination of the dimensionless Deborah number (De), then all flow resistance curves collapse to a single master curve, irrespective of the pore geometry. Our study reveals how viscoelastic stresses and flow topologies (rotation, shear and extension) are distributed through the porous media, and how they evolve with increasing De. We correlate the local viscoelastic first normal stress differences with the local flow topology and show that the largest normal stress differences are located in shear flow dominated regions and not in extensional flow dominated regions at higher viscoelasticity. The study shows that normal stress differences in shear flow regions may play a crucial role in the increase of flow resistance for viscoelastic flow through such porous media.

  9. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
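
    A generic sketch of the spectral-synthesis step such a method relies on: filtering white noise in Fourier space to obtain a random field with a prescribed power spectral density. The power-law spectrum and the amplitude rescaling below are placeholders, not the report's earthquake-derived spectrum:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n, dx = 4096, 10.0                                   # samples and spacing in metres (assumed)
    k = np.fft.rfftfreq(n, d=dx)
    psd = np.zeros_like(k)
    psd[1:] = k[1:] ** -2.0                              # assumed power-law spectrum; zero-mean field

    noise = np.fft.rfft(rng.standard_normal(n))
    stress = np.fft.irfft(noise * np.sqrt(psd), n=n)
    stress *= 3e6 / stress.std()                         # rescale to ~3 MPa standard deviation (assumed)
    print("stress std (MPa):", round(float(stress.std() / 1e6), 2))
    ```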

  10. Functions of Japanese Exemplifying Particles in Spoken and Written Discourse

    ERIC Educational Resources Information Center

    Taylor, Yuki Io

    2010-01-01

    This dissertation examines how the Japanese particles "nado", "toka", and "tari" which all may be translated as "such as", "etc.", or "like" behave differently in written and spoken discourse. According to traditional analyses (e.g. Martin, 1987), these particles are assumed to be Exemplifying Particles (EP) used to provide concrete examples to…

  11. Radiation breakage of DNA: a model based on random-walk chromatin structure

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Sachs, R. K.

    2001-01-01

    Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action; an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.

  12. A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco

    2016-04-01

    We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags) with sharp peaks and heavy tails which tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to hydrogeological sciences, e.g. porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, measured and simulated turbulent fluid velocity, and others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG) which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by analyzing jointly spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random fields, and explore them on one- and two-dimensional synthetic test cases.

  13. Online distribution channel increases article usage on Mendeley: a randomized controlled trial.

    PubMed

    Kudlow, Paul; Cockerill, Matthew; Toccalino, Danielle; Dziadyk, Devin Bissky; Rutledge, Alan; Shachak, Aviv; McIntyre, Roger S; Ravindran, Arun; Eysenbach, Gunther

    2017-01-01

    Prior research shows that article reader counts (i.e. saves) on the online reference manager, Mendeley, correlate to future citations. There are currently no evidence-based distribution strategies that have been shown to increase article saves on Mendeley. We conducted a 4-week randomized controlled trial to examine how promotion of article links in a novel online cross-publisher distribution channel (TrendMD) affects article saves on Mendeley. Four hundred articles published in the Journal of Medical Internet Research were randomized to either the TrendMD arm (n = 200) or the control arm (n = 200) of the study. Our primary outcome compares the 4-week mean Mendeley saves of articles randomized to TrendMD versus control. Articles randomized to TrendMD showed a 77% increase in article saves on Mendeley relative to control. The difference in mean Mendeley saves for TrendMD articles versus control was 2.7, 95% CI (2.63, 2.77), and statistically significant (p < 0.01). There was a positive correlation between pageviews driven by TrendMD and article saves on Mendeley (Spearman's rho r = 0.60). This is the first randomized controlled trial to show how an online cross-publisher distribution channel (TrendMD) enhances article saves on Mendeley. While replication and further study are needed, these data suggest that cross-publisher article recommendations via TrendMD may enhance citations of scholarly articles.

  14. Challenges in Locating Microseismic Events Using Distributed Acoustic Sensors

    NASA Astrophysics Data System (ADS)

    Williams, A.; Kendall, J. M.; Clarke, A.; Verdon, J.

    2017-12-01

    Microseismic monitoring is an important method of assessing the behaviour of subsurface fluid processes, and is commonly acquired using geophone arrays in boreholes or on the surface. A new alternative technology has been recently developed - fibre-optic Distributed Acoustic Sensing (DAS) - using strain along a fibre-optic cable as a measure of seismic signals. DAS can offer high density arrays and full-well coverage from the surface to bottom, with less overall disruption to operations, so there are many exciting possible applications in monitoring both petroleum and other subsurface industries. However, there are challenges in locating microseismic events recorded using current DAS systems, which only record seismic data in one-component and consequently omit the azimuthal information provided by a three-component geophone. To test the impact of these limitations we used finite difference modelling to generate one-component synthetic DAS datasets and investigated the impact of picking solely P-wave or both P- and S-wave arrivals and the impact of different array geometries. These are then compared to equivalent 3-component synthetic geophone datasets. In simple velocity models, P-wave arrivals along linear arrays cannot be used to constrain locations using DAS, without further a priori information. We then tested the impact of straight cables vs. L-shaped arrays and found improved locations when the cable is deviated, especially when both P- and S-wave picks are included. There is a trade-off between the added coverage of DAS cables versus sparser 3C geophone arrays where particle motion helps constrains locations, which cannot be assessed without forward modelling.

  15. Non-volatile magnetic random access memory

    NASA Technical Reports Server (NTRS)

    Katti, Romney R. (Inventor); Stadler, Henry L. (Inventor); Wu, Jiin-Chuan (Inventor)

    1994-01-01

    Improvements are made in a non-volatile magnetic random access memory. Such a memory comprises an array of unit cells, each having a Hall-effect sensor and a thin-film magnetic element made of material having an in-plane, uniaxial anisotropy and in-plane, bipolar remanent magnetization states. The Hall-effect sensor is made more sensitive by using a 1 μm thick molecular beam epitaxy grown InAs layer on a silicon substrate by employing a GaAs/AlGaAs/InAlAs superlattice buffering layer. One improvement avoids current shunting problems of matrix architecture. Another improvement reduces the required magnetizing current for the micromagnets. Another improvement relates to the use of GaAs technology wherein high electron-mobility GaAs MESFETs provide faster switching times. Still another improvement relates to a method for configuring the invention as a three-dimensional random access memory.

  16. A lattice Boltzmann simulation of coalescence-induced droplet jumping on superhydrophobic surfaces with randomly distributed structures

    NASA Astrophysics Data System (ADS)

    Zhang, Li-Zhi; Yuan, Wu-Zhi

    2018-04-01

    The motion of coalescence-induced condensate droplets on superhydrophobic surfaces (SHS) has attracted increasing attention in energy-related applications. Previous research has focused on regularly structured rough surfaces. Here a new approach, a mesoscale lattice Boltzmann method (LBM), is proposed and used to model the dynamic behavior of coalescence-induced droplet jumping on SHS with randomly distributed rough structures. A Fast Fourier Transformation (FFT) method is used to generate non-Gaussian randomly distributed rough surfaces with the skewness (Sk), kurtosis (K) and root mean square (Rq) obtained from real surfaces. Three typical spreading states of coalesced droplets are observed through LBM modeling on various rough surfaces, which are found to significantly influence the jumping ability of the coalesced droplet. The coalesced droplets spreading in the Cassie state or in a composite state will jump off the rough surfaces, while the ones spreading in the Wenzel state would eventually remain on the rough surfaces. It is demonstrated that rough surfaces with smaller Sk, larger Rq and a K of 3.0 are beneficial to coalescence-induced droplet jumping. The new approach gives more detailed insights into the design of SHS.

  17. School Choice in Colorado Springs: The Relationship between Parental Decisions, Location and Neighbourhood Characteristics

    ERIC Educational Resources Information Center

    Theobald, Rebecca

    2005-01-01

    The influence of location as exemplified by neighbourhood factors and school characteristics on primary education is examined in the context of the school choice movement of the last two decades. The analysis incorporates statistical information about schools and population data from Census 2000 describing neighbourhoods and schools in one…

  18. Stimulated luminescence emission from localized recombination in randomly distributed defects.

    PubMed

    Jain, Mayank; Guralnik, Benny; Andersen, Martin Thalbitzer

    2012-09-26

    We present a new kinetic model describing localized electronic recombination through the excited state of the donor (d) to an acceptor (a) centre in luminescent materials. In contrast to the existing models based on the localized transition model (LTM) of Halperin and Braner (1960 Phys. Rev. 117 408-15) which assumes a fixed d → a tunnelling probability for the entire crystal, our model is based on nearest-neighbour recombination within randomly distributed centres. Such a random distribution can occur through the entire volume or within the defect complexes of the dosimeter, and implies that the tunnelling probability varies with the donor-acceptor (d-a) separation distance. We first develop an 'exact kinetic model' that incorporates this variation in tunnelling probabilities, and evolves both in spatial as well as temporal domains. We then develop a simplified one-dimensional, semi-analytical model that evolves only in the temporal domain. An excellent agreement is observed between thermally and optically stimulated luminescence (TL and OSL) results produced from the two models. In comparison to the first-order kinetic behaviour of the LTM of Halperin and Braner (1960 Phys. Rev. 117 408-15), our model results in a highly asymmetric TL peak; this peak can be understood to derive from a continuum of several first-order TL peaks. Our model also shows an extended power law behaviour for OSL (or prompt luminescence), which is expected from localized recombination mechanisms in materials with random distribution of centres.

  19. A novel framework to evaluate pedestrian safety at non-signalized locations.

    PubMed

    Fu, Ting; Miranda-Moreno, Luis; Saunier, Nicolas

    2018-02-01

    This paper proposes a new framework to evaluate pedestrian safety at non-signalized crosswalk locations. In the proposed framework, the yielding maneuver of a driver in response to a pedestrian is split into the reaction and braking time. Hence, the relationship of the distance required for a yielding maneuver and the approaching vehicle speed depends on the reaction time of the driver and deceleration rate that the vehicle can achieve. The proposed framework is represented in the distance-velocity (DV) diagram and referred to as the DV model. The interactions between approaching vehicles and pedestrians showing the intention to cross are divided into three categories: i) situations where the vehicle cannot make a complete stop, ii) situations where the vehicle's ability to stop depends on the driver reaction time, and iii) situations where the vehicle can make a complete stop. Based on these classifications, non-yielding maneuvers are classified as "non-infraction non-yielding" maneuvers, "uncertain non-yielding" maneuvers and "non-yielding" violations, respectively. From the pedestrian perspective, crossing decisions are classified as dangerous crossings, risky crossings and safe crossings accordingly. The yielding compliance and yielding rate, as measures of the yielding behavior, are redefined based on these categories. Time to crossing and deceleration rate required for the vehicle to stop are used to measure the probability of collision. Finally, the framework is demonstrated through a case study in evaluating pedestrian safety at three different types of non-signalized crossings: a painted crosswalk, an unprotected crosswalk, and a crosswalk controlled by stop signs. Results from the case study suggest that the proposed framework works well in describing pedestrian-vehicle interactions, which helps in evaluating pedestrian safety at non-signalized crosswalk locations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Functional Redundancy Patterns Reveal Non-Random Assembly Rules in a Species-Rich Marine Assemblage

    PubMed Central

    Guillemot, Nicolas; Kulbicki, Michel; Chabanet, Pascale; Vigliola, Laurent

    2011-01-01

    The relationship between species and the functional diversity of assemblages is fundamental in ecology because it contains key information on functional redundancy, and functionally redundant ecosystems are thought to be more resilient, resistant and stable. However, this relationship is poorly understood and undocumented for species-rich coastal marine ecosystems. Here, we used underwater visual censuses to examine the patterns of functional redundancy for one of the most diverse vertebrate assemblages, the coral reef fishes of New Caledonia, South Pacific. First, we found that the relationship between functional and species diversity displayed a non-asymptotic power-shaped curve, implying that rare functions and species mainly occur in highly diverse assemblages. Second, we showed that the distribution of species amongst possible functions was significantly different from a random distribution up to a threshold of ∼90 species/transect. Redundancy patterns for each function further revealed that some functions displayed fast rates of increase in redundancy at low species diversity, whereas others were only becoming redundant past a certain threshold. This suggested non-random assembly rules and the existence of some primordial functions that would need to be fulfilled in priority so that coral reef fish assemblages can gain a basic ecological structure. Last, we found little effect of habitat on the shape of the functional-species diversity relationship and on the redundancy of functions, although habitat is known to largely determine assemblage characteristics such as species composition, biomass, and abundance. Our study shows that low functional redundancy is characteristic of this highly diverse fish assemblage, and, therefore, that even species-rich ecosystems such as coral reefs may be vulnerable to the removal of a few keystone species. PMID:22039543

  1. Superdiffusion in a non-Markovian random walk model with a Gaussian memory profile

    NASA Astrophysics Data System (ADS)

    Borges, G. M.; Ferreira, A. S.; da Silva, M. A. A.; Cressoni, J. C.; Viswanathan, G. M.; Mariz, A. M.

    2012-09-01

    Most superdiffusive Non-Markovian random walk models assume that correlations are maintained at all time scales, e.g., fractional Brownian motion, Lévy walks, the Elephant walk and Alzheimer walk models. In the latter two models the random walker can always "remember" the initial times near t = 0. Assuming jump size distributions with finite variance, the question naturally arises: is superdiffusion possible if the walker is unable to recall the initial times? We give a conclusive answer to this general question, by studying a non-Markovian model in which the walker's memory of the past is weighted by a Gaussian centered at time t/2, at which time the walker had one half the present age, and with a standard deviation σt which grows linearly as the walker ages. For large widths we find that the model behaves similarly to the Elephant model, but for small widths this Gaussian memory profile model behaves like the Alzheimer walk model. We also report that the phenomenon of amnestically induced persistence, known to occur in the Alzheimer walk model, arises in the Gaussian memory profile model. We conclude that memory of the initial times is not a necessary condition for generating (log-periodic) superdiffusion. We show that the phenomenon of amnestically induced persistence extends to the case of a Gaussian memory profile.
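
    A rough sketch of a walk of this type: the remembered time is drawn from a Gaussian centred at t/2 whose width grows linearly with the walker's age, and the recalled step is repeated or reversed with a fixed persistence probability (the repeat-or-reverse rule and all parameter values below are assumptions for illustration, not the authors' exact model):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T, p, width_frac = 20_000, 0.7, 0.25        # horizon, persistence prob., sigma_t = width_frac * t
    steps = np.empty(T, dtype=int)
    steps[0] = 1
    for t in range(1, T):
        mu, sigma = t / 2, max(width_frac * t, 1e-6)
        tau = int(np.clip(np.rint(rng.normal(mu, sigma)), 0, t - 1))  # remembered time
        steps[t] = steps[tau] if rng.random() < p else -steps[tau]    # repeat or reverse that step
    x = np.cumsum(steps)
    print("final displacement:", int(x[-1]), " |x|/sqrt(T):", round(abs(int(x[-1])) / np.sqrt(T), 2))
    ```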

  2. Complementarity between entanglement-assisted and quantum distributed random access code

    NASA Astrophysics Data System (ADS)

    Hameedi, Alley; Saha, Debashis; Mironowicz, Piotr; Pawłowski, Marcin; Bourennane, Mohamed

    2017-05-01

    Collaborative communication tasks such as random access codes (RACs) employing quantum resources have manifested great potential in enhancing information processing capabilities beyond the classical limitations. The two quantum variants of RACs, namely, quantum random access code (QRAC) and the entanglement-assisted random access code (EARAC), have demonstrated equal prowess for a number of tasks. However, there do exist specific cases where one outperforms the other. In this article, we study a family of 3 →1 distributed RACs [J. Bowles, N. Brunner, and M. Pawłowski, Phys. Rev. A 92, 022351 (2015), 10.1103/PhysRevA.92.022351] and present its general construction of both the QRAC and the EARAC. We demonstrate that, depending on the function of inputs that is sought, if QRAC achieves the maximal success probability then EARAC fails to do so and vice versa. Moreover, a tripartite Bell-type inequality associated with the EARAC variants reveals the genuine multipartite nonlocality exhibited by our protocol. We conclude with an experimental realization of the 3 →1 distributed QRAC that achieves higher success probabilities than the maximum possible with EARACs for a number of tasks.

  3. Correction of confounding bias in non-randomized studies by appropriate weighting.

    PubMed

    Schmoor, Claudia; Gall, Christine; Stampf, Susanne; Graf, Erika

    2011-03-01

    In non-randomized studies, the assessment of a causal effect of treatment or exposure on outcome is hampered by possible confounding. Applying multiple regression models including the effects of treatment and covariates on outcome is the well-known classical approach to adjust for confounding. In recent years other approaches have been promoted. One of them is based on the propensity score and considers the effect of possible confounders on treatment as a relevant criterion for adjustment. Another proposal is based on using an instrumental variable. Here inference relies on a factor, the instrument, which affects treatment but is thought to be otherwise unrelated to outcome, so that it mimics randomization. Each of these approaches can basically be interpreted as a simple reweighting scheme, designed to address confounding. The procedures will be compared with respect to their fundamental properties, namely, which bias they aim to eliminate, which effect they aim to estimate, and which parameter is modelled. We will expand our overview of methods for analysis of non-randomized studies to methods for analysis of randomized controlled trials and show that analyses of both study types may target different effects and different parameters. The considerations will be illustrated using a breast cancer study with a so-called Comprehensive Cohort Study design, including a randomized controlled trial and a non-randomized study in the same patient population as sub-cohorts. This design offers ideal opportunities to discuss and illustrate the properties of the different approaches. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
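
    The reweighting view can be illustrated on synthetic data: a propensity score model for treatment given a confounder yields inverse-probability weights, and the weighted contrast recovers the assumed treatment effect that the naive comparison overstates (all data and effect sizes below are made up for illustration):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    n = 20_000
    x = rng.normal(size=n)                                              # confounder
    treat = (rng.random(n) < 1 / (1 + np.exp(-1.5 * x))).astype(int)    # treatment depends on x
    y = 2.0 * treat + 3.0 * x + rng.normal(size=n)                      # true treatment effect = 2.0

    ps = LogisticRegression().fit(x[:, None], treat).predict_proba(x[:, None])[:, 1]
    w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))                      # inverse-probability weights
    naive = y[treat == 1].mean() - y[treat == 0].mean()
    ipw = np.average(y, weights=w * (treat == 1)) - np.average(y, weights=w * (treat == 0))
    print("naive difference:", round(float(naive), 2), " IPW estimate:", round(float(ipw), 2))
    ```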

  4. Universal energy distribution for interfaces in a random-field environment

    NASA Astrophysics Data System (ADS)

    Fedorenko, Andrei A.; Stepanow, Semjon

    2003-11-01

    We study the energy distribution function ρ(E) for interfaces in a random-field environment at zero temperature by summing the leading terms in the perturbation expansion of ρ(E) in powers of the disorder strength, and by taking into account the nonperturbational effects of the disorder using the functional renormalization group. We have found that the average and the variance of the energy for a one-dimensional interface of length L behave as ⟨E⟩_R ∝ L ln L and ΔE_R ∝ L, while the distribution function of the energy tends for large L to the Gumbel distribution of extreme value statistics.

  5. Hessian eigenvalue distribution in a random Gaussian landscape

    NASA Astrophysics Data System (ADS)

    Yamada, Masaki; Vilenkin, Alexander

    2018-03-01

    The energy landscape of multiverse cosmology is often modeled by a multi-dimensional random Gaussian potential. The physical predictions of such models crucially depend on the eigenvalue distribution of the Hessian matrix at potential minima. In particular, the stability of vacua and the dynamics of slow-roll inflation are sensitive to the magnitude of the smallest eigenvalues. The Hessian eigenvalue distribution has been studied earlier, using the saddle point approximation, in the leading order of 1/N expansion, where N is the dimensionality of the landscape. This approximation, however, is insufficient for the small eigenvalue end of the spectrum, where sub-leading terms play a significant role. We extend the saddle point method to account for the sub-leading contributions. We also develop a new approach, where the eigenvalue distribution is found as an equilibrium distribution at the endpoint of a stochastic process (Dyson Brownian motion). The results of the two approaches are consistent in cases where both methods are applicable. We discuss the implications of our results for vacuum stability and slow-roll inflation in the landscape.
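
    A toy illustration of why the small-eigenvalue end is delicate: eigenvalues of GOE-like random symmetric matrices, used here only as a stand-in for the Hessian (the true ensemble at minima is shifted and conditioned), show a bulk semicircle whose edge is sparsely populated.

      import numpy as np

      def goe_eigenvalues(N, n_samples=200, rng=None):
          """Eigenvalues of GOE-type random symmetric matrices, normalized so
          the spectrum fills the semicircle on [-2, 2] for large N."""
          rng = np.random.default_rng() if rng is None else rng
          eigs = []
          for _ in range(n_samples):
              a = rng.normal(size=(N, N)) / np.sqrt(N)
              eigs.append(np.linalg.eigvalsh((a + a.T) / np.sqrt(2)))
          return np.concatenate(eigs)

      lam = goe_eigenvalues(N=100)
      print("smallest sampled eigenvalue:", lam.min())   # near the edge at -2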

  6. Distribution of 45S rDNA sites in chromosomes of plants: Structural and evolutionary implications

    PubMed Central

    2012-01-01

    Background 45S rDNA sites are the most widely documented chromosomal regions in eukaryotes. The analysis of the distribution of these sites along the chromosome in several genera has suggested some bias in their distribution. In order to evaluate if these loci are in fact non-randomly distributed and what is the influence of some chromosomal and karyotypic features on the distribution of these sites, a database was built with the position and number of 45S rDNA sites obtained by FISH together with other karyotypic data from 846 plant species. Results In angiosperms the most frequent numbers of sites per diploid karyotype were two and four, suggesting that in spite of the wide dispersion capacity of these sequences the number of rDNA sites tends to be restricted. The sites showed a preferential distribution on the short arms, mainly in the terminal regions. Curiously, these sites were frequently found on the short arms of acrocentric chromosomes where they usually occupy the whole arm. The trend to occupy the terminal region is especially evident in holokinetic chromosomes, where all of them were terminally located. In polyploids there is a trend towards reduction in the number of sites per monoploid complement. In gymnosperms, however, the distribution of rDNA sites varied strongly among the sampled families. Conclusions The location of 45S rDNA sites does not vary randomly, occurring preferentially on the short arm and in the terminal region of chromosomes in angiosperms. The meaning of this preferential location is not known, but some hypotheses are considered and the observed trends are discussed. PMID:23181612

  7. Narrow log-periodic modulations in non-Markovian random walks

    NASA Astrophysics Data System (ADS)

    Diniz, R. M. B.; Cressoni, J. C.; da Silva, M. A. A.; Mariz, A. M.; de Araújo, J. M.

    2017-12-01

    What are the necessary ingredients for log-periodicity to appear in the dynamics of a random walk model? Can they be subtle enough to be overlooked? Previous studies suggest that long-range damaged memory and negative feedback together are necessary conditions for the emergence of log-periodic oscillations. The role of negative feedback would then be crucial, forcing the system to change direction. In this paper we show that small-amplitude log-periodic oscillations can emerge when the system is driven by positive feedback. Due to their very small amplitude, these oscillations can easily be mistaken for numerical finite-size effects. The models we use consist of discrete-time random walks with strong memory correlations where the decision process is taken from memory profiles based either on a binomial distribution or on a delta distribution. Anomalous superdiffusive behavior and log-periodic modulations are shown to arise in the large time limit for convenient choices of the model parameters.

  8. A Permutation-Randomization Approach to Test the Spatial Distribution of Plant Diseases.

    PubMed

    Lione, G; Gonthier, P

    2016-01-01

    The analysis of the spatial distribution of plant diseases requires the availability of trustworthy geostatistical methods. The mean distance tests (MDT) are here proposed as a series of permutation and randomization tests to assess the spatial distribution of plant diseases when the variable of phytopathological interest is categorical. A user-friendly software to perform the tests is provided. Estimates of power and type I error, obtained with Monte Carlo simulations, showed the reliability of the MDT (power > 0.80; type I error < 0.05). A biological validation on the spatial distribution of spores of two fungal pathogens causing root rot on conifers was successfully performed by verifying the consistency between the MDT responses and previously published data. An application of the MDT was carried out to analyze the relation between the plantation density and the distribution of the infection of Gnomoniopsis castanea, an emerging fungal pathogen causing nut rot on sweet chestnut. Trees carrying nuts infected by the pathogen were randomly distributed in areas with different plantation densities, suggesting that the distribution of G. castanea was not related to the plantation density. The MDT could be used to analyze the spatial distribution of plant diseases both in agricultural and natural ecosystems.
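
    The MDT software itself is the authors'; the sketch below only illustrates the general permutation idea for categorical spatial data: compare the mean pairwise distance among infected trees against its distribution under random relabelling (positions and labels are invented).

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      def mean_distance_test(coords, infected, n_perm=4999, rng=None):
          """One-sided permutation test: is the mean pairwise distance among
          infected points smaller (i.e. clustered) than under random labels?"""
          rng = np.random.default_rng() if rng is None else rng
          d = squareform(pdist(coords))
          def stat(mask):
              idx = np.flatnonzero(mask)
              return d[np.ix_(idx, idx)][np.triu_indices(idx.size, k=1)].mean()
          obs = stat(infected)
          perms = np.array([stat(rng.permutation(infected)) for _ in range(n_perm)])
          return obs, (1 + np.sum(perms <= obs)) / (1 + n_perm)

      coords = np.random.default_rng(1).uniform(size=(60, 2))   # tree positions
      infected = np.zeros(60, dtype=bool)
      infected[:15] = True                                      # hypothetical labels
      print(mean_distance_test(coords, infected))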

  9. Location, sidedness, and sex distribution of intracranial arachnoid cysts in a population-based sample.

    PubMed

    Helland, Christian A; Lund-Johansen, Morten; Wester, Knut

    2010-11-01

    The aim of this study was to examine the distribution of intracranial arachnoid cysts in a large and unselected patient population with special emphasis on sidedness and sex distribution. In total, 299 patients with 305 arachnoid cysts were studied. These patients were consecutively referred to our department during a 20-year period from a well-defined geographical area with a stable population. There was a strong predilection (198 patients [66.2%]) for intracranial arachnoid cysts in the temporal fossa. Forty-two patients had cysts overlying the frontal convexity, 36 had cysts in the posterior fossa, and 23 patients had cysts in other, different locations. Of 269 cysts with clearly unilateral distribution, 163 were located on the left side and 106 on the right side. This difference resulted from the marked preponderance of temporal fossa cysts on the left side (left-to-right ratio 2.5:1; p < 0.0001 [adjusted < 0.0005]). For cysts in the cerebellopontine angle (CPA), there was preponderance on the right side (p = 0.001 [adjusted = 0.005]). Significantly more males than females had cysts in the temporal fossa (p = 0.002 [adjusted = 0.004]), whereas in the CPA a significant female preponderance was found (p = 0.016 [adjusted = 0.032]). For all other cyst locations, there was no difference between the 2 sexes. Arachnoid cysts have a strong predilection for the temporal fossa. There is a sex dependency for some intracranial locations of arachnoid cysts, with temporal cysts occurring more frequently in men, and CPA cysts found more frequently in women. Furthermore, there is a strong location-related sidedness for arachnoid cysts, independent of patient sex. These findings and reports from the literature suggest a possible genetic component in the development of some arachnoid cysts.

  10. Atom probe study of vanadium interphase precipitates and randomly distributed vanadium precipitates in ferrite.

    PubMed

    Nöhrer, M; Zamberger, S; Primig, S; Leitner, H

    2013-01-01

    Atom probe tomography and transmission electron microscopy were used to examine the precipitation reaction in the austenite and ferrite phases of a vanadium micro-alloyed steel after a thermo-mechanical process. Precipitates were observed only in the ferrite phase, where two different types were detected, and the aim was to reveal the difference between them. The first type consisted of randomly distributed precipitates formed from V-supersaturated ferrite; the second type consisted of V interphase precipitates. Not only the arrangement of the particles differed, but also their chemical composition: the randomly distributed precipitates consisted of V, C and N, whereas the interphase precipitates showed a composition of V, C and Mn. Furthermore, the randomly distributed precipitates had a maximum size of 20 nm and the interphase precipitates a maximum size of 15 nm. These differences were attributed to the site in which the precipitates formed. The randomly distributed precipitates were formed in a matrix consisting mainly of 0.05 at% C, 0.68 at% Si, 0.03 at% N, 0.145 at% V and 1.51 at% Mn, whereas the interphase precipitates were formed in a region with a much higher C, Mn and V content. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. On intra-supply chain system with an improved distribution plan, multiple sales locations and quality assurance.

    PubMed

    Chiu, Singa Wang; Huang, Chao-Chih; Chiang, Kuo-Wei; Wu, Mei-Fang

    2015-01-01

    Transnational companies, operating in extremely competitive global markets, always seek to lower different operating costs, such as inventory holding costs in their intra-supply chain system. This paper incorporates a cost-reducing product distribution policy into an intra-supply chain system with multiple sales locations and quality assurance studied by [Chiu et al., Expert Syst Appl, 40:2669-2676, (2013)]. Under the proposed cost-reducing distribution policy, an added initial delivery of end items is distributed to multiple sales locations to meet their demand during the production unit's uptime and rework time. After rework, when the remaining production lot has passed quality assurance, n fixed-quantity installments of finished items are transported to the sales locations at a fixed time interval. Mathematical modeling and optimization techniques are used to derive closed-form optimal operating policies for the proposed system. Furthermore, the study demonstrates significant savings in stock holding costs for both the production unit and the sales locations. The alternative of outsourcing the product delivery task to an external distributor is also analyzed, to assist managerial decision making on potential outsourcing and to facilitate further reductions in operating costs.

  12. Distributed fiber sensing system with wide frequency response and accurate location

    NASA Astrophysics Data System (ADS)

    Shi, Yi; Feng, Hao; Zeng, Zhoumo

    2016-02-01

    A distributed fiber sensing system merging a Mach-Zehnder interferometer and a phase-sensitive optical time-domain reflectometer (Φ-OTDR) is demonstrated for vibration measurement, which requires wide frequency response and accurate location. Two narrow-linewidth lasers with slightly different wavelengths are used to constitute the interferometer and the reflectometer, respectively. A narrow-band fiber Bragg grating is responsible for separating the two wavelengths. In addition, heterodyne detection is applied to maintain the signal-to-noise ratio of the locating signal. Experimental results show that the system has a wide frequency response from 1 Hz to 50 MHz, limited by the sampling frequency of the data acquisition card, and a spatial resolution of 20 m, corresponding to the 200 ns pulse width, along a 2.5 km fiber link.
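
    The quoted 20 m follows from the usual pulse-limited OTDR relation Δz = c·τ/(2n); a quick check, assuming a fiber group index of about 1.5:

      c = 3.0e8      # speed of light in vacuum, m/s
      n = 1.5        # assumed fiber group refractive index
      tau = 200e-9   # probe pulse width, s
      print("spatial resolution:", c * tau / (2 * n), "m")   # 20.0 m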

  13. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    PubMed

    Graham, John H; Robb, Daniel T; Poe, Amy R

    2012-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of
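
    The double Pareto-lognormal density is not shipped with SciPy, but the model-selection step itself (maximum-likelihood fits compared by AIC) can be sketched with stand-in candidate distributions and synthetic data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      data = rng.lognormal(mean=0.0, sigma=0.6, size=2000)   # synthetic stand-in

      candidates = {"lognormal": stats.lognorm, "normal": stats.norm,
                    "exponential": stats.expon, "gamma": stats.gamma}
      for name, dist in candidates.items():
          params = dist.fit(data)                    # maximum-likelihood fit
          loglik = np.sum(dist.logpdf(data, *params))
          aic = 2 * len(params) - 2 * loglik         # lower AIC = preferred model
          print(f"{name:12s} AIC = {aic:.1f}")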

  14. Averaging in SU(2) open quantum random walk

    NASA Astrophysics Data System (ADS)

    Clement, Ampadu

    2014-03-01

    We study the average position and the symmetry of the distribution in the SU(2) open quantum random walk (OQRW). We show that the average position in the central limit theorem (CLT) is non-uniform compared with the average position in the non-CLT. The symmetry of distribution is shown to be even in the CLT.

  15. Ion exhaust distributions and reconnection location with Magnetospheric Multiscale and global MHD test particles

    NASA Astrophysics Data System (ADS)

    Broll, J. M.; Fuselier, S. A.; Trattner, K. J.; Steven, P. M.; Burch, J. L.; Giles, B. L.

    2017-12-01

    Magnetic reconnection at Earth's dayside magnetopause is an essential process in magnetospheric physics. Under southward IMF conditions, reconnection occurs along a thin ribbon across the dayside magnetopause. The location of this ribbon has been studied extensively in terms of global optimization of quantities like reconnecting field energy or magnetic shear, but with expected errors of 1-2 Earth radii these global models give limited context for cases where an observation is near the reconnection line. Building on previous results, which established the cutoff contour method for locating reconnection using in-situ velocity measurements, we examine the effects of MHD-scale waves on reconnection exhaust distributions. We use a test-particle exhaust distribution propagated through global magnetohydrodynamic model fields and compare it with Magnetospheric Multiscale observations of reconnection exhausts.

  16. Spatio-temporal modelling of wind speed variations and extremes in the Caribbean and the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Rychlik, Igor; Mao, Wengang

    2018-02-01

    The wind speed variability in the North Atlantic has been successfully modelled using a spatio-temporal transformed Gaussian field. However, this type of model does not correctly describe the extreme wind speeds attributed to tropical storms and hurricanes. In this study, the transformed Gaussian model is further developed to include the occurrence of severe storms. In this new model, random components are added to the transformed Gaussian field to model rare events with extreme wind speeds. The resulting random field is locally stationary and homogeneous. The localized dependence structure is described by time- and space-dependent parameters. The parameters have a natural physical interpretation. To exemplify its application, the model is fitted to the ECMWF ERA-Interim reanalysis data set. The model is applied to compute long-term wind speed distributions and return values, e.g., 100- or 1000-year extreme wind speeds, and to simulate random wind speed time series at a fixed location or spatio-temporal wind fields around that location.

  17. Distributed computing feasibility in a non-dedicated homogeneous distributed system

    NASA Technical Reports Server (NTRS)

    Leutenegger, Scott T.; Sun, Xian-He

    1993-01-01

    The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. It was proposed that task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.

  18. Continuous-time random-walk model for financial distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Weiss, George H.

    2003-02-01

    We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability densities for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar deutsche mark future exchange, finding good agreement between theory and the observed data.
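
    A minimal CTRW sketch in the spirit of the record, assuming exponential pausing times and Gaussian jump magnitudes purely for illustration (the paper estimates both densities from market data):

      import numpy as np

      def ctrw_increment(t_horizon, mean_wait=1.0, jump_scale=0.01, rng=None):
          """One realization of a continuous-time random walk up to t_horizon."""
          rng = np.random.default_rng() if rng is None else rng
          t, x = 0.0, 0.0
          while True:
              t += rng.exponential(mean_wait)        # pausing time between jumps
              if t > t_horizon:
                  return x
              x += rng.normal(scale=jump_scale)      # jump magnitude

      rng = np.random.default_rng(42)
      increments = np.array([ctrw_increment(10.0, rng=rng) for _ in range(20_000)])
      print("std of the price change over the horizon:", increments.std())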

  19. Non-Fickian dispersion of groundwater age

    PubMed Central

    Engdahl, Nicholas B.; Ginn, Timothy R.; Fogg, Graham E.

    2014-01-01

    We expand the governing equation of groundwater age to account for non-Fickian dispersive fluxes using continuous random walks. Groundwater age is included as an additional (fifth) dimension on which the volumetric mass density of water is distributed and we follow the classical random walk derivation now in five dimensions. The general solution of the random walk recovers the previous conventional model of age when the low order moments of the transition density functions remain finite at their limits and describes non-Fickian age distributions when the transition densities diverge. Previously published transition densities are then used to show how the added dimension in age affects the governing differential equations. Depending on which transition densities diverge, the resulting models may be nonlocal in time, space, or age and can describe asymptotic or pre-asymptotic dispersion. A joint distribution function of time and age transitions is developed as a conditional probability and a natural result of this is that time and age must always have identical transition rate functions. This implies that a transition density defined for age can substitute for a density in time and this has implications for transport model parameter estimation. We present examples of simulated age distributions from a geologically based, heterogeneous domain that exhibit non-Fickian behavior and show that the non-Fickian model provides better descriptions of the distributions than the Fickian model. PMID:24976651

  20. Models for the hotspot distribution

    NASA Technical Reports Server (NTRS)

    Jurdy, Donna M.; Stefanick, Michael

    1990-01-01

    Published hotspot catalogs all show a hemispheric concentration beyond what can be expected by chance. Cumulative distributions about the center of concentration are described by a power law with a fractal dimension closer to 1 than 2. Random sets of the corresponding sizes do not show this effect. A simple shift of the random sets away from a point would produce distributions similar to those of hotspot sets. The possible relation of the hotspots to the locations of ridges and subduction zones is tested using large sets of randomly-generated points to estimate areas within given distances of the plate boundaries. The probability of finding the observed number of hotspots within 10 deg of the ridges is about what is expected.

  1. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
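
    The two-gamma construction itself is the paper's result and is not reproduced here; a closely related superstatistical illustration is that an exponential variable whose rate is gamma distributed has a q-exponential (Lomax) marginal with q = (k + 2)/(k + 1):

      import numpy as np

      rng = np.random.default_rng(0)
      k, theta = 3.0, 1.0                        # gamma shape and scale of the rate
      lam = rng.gamma(k, theta, size=200_000)    # fluctuating rate (superstatistics)
      x = rng.exponential(1.0 / lam)             # conditionally exponential samples

      # marginal survival function of the Lomax / q-exponential: (1 + theta*x)^(-k)
      xs = np.linspace(0.0, 10.0, 50)
      empirical = np.array([(x > s).mean() for s in xs])
      analytic = (1 + theta * xs) ** (-k)
      print("max deviation:", np.max(np.abs(empirical - analytic)))   # small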

  2. Patterns of particle distribution in multiparticle systems by random walks with memory enhancement and decay

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-Jie; Zou, Xian-Wu; Huang, Sheng-You; Zhang, Wei; Jin, Zhun-Zhi

    2002-07-01

    We investigate the pattern of particle distribution and its evolution with time in multiparticle systems using the model of random walks with memory enhancement and decay. This model describes some biological intelligent walks. With decrease in the memory decay exponent α, the distribution of particles changes from a random dispersive pattern to a locally dense one, and then returns to the random one. Correspondingly, the fractal dimension D_{f,p} characterizing the distribution of particle positions increases from a low value to a maximum and then decreases to the low one again. This is determined by the degree of overlap of regions consisting of sites with remanent information. The second moment of the density ρ^(2) was introduced to investigate the inhomogeneity of the particle distribution. The dependence of ρ^(2) on α is similar to that of D_{f,p} on α. ρ^(2) increases with time as a power law in the process of adjusting the particle distribution, and then ρ^(2) tends to a stable equilibrium value.

  3. Saddlepoint approximation to the distribution of the total distance of the continuous time random walk

    NASA Astrophysics Data System (ADS)

    Gatto, Riccardo

    2017-12-01

    This article considers the random walk over R^p, with p ≥ 2, where a given particle starts at the origin and moves stepwise with uniformly distributed step directions and step lengths following a common distribution. Step directions and step lengths are independent. The case where the number of steps of the particle is fixed and the more general case where it follows an independent continuous time inhomogeneous counting process are considered. Saddlepoint approximations to the distribution of the distance from the position of the particle to the origin are provided. Despite the p-dimensional nature of the random walk, the computations of the saddlepoint approximations are one-dimensional and thus simple. Explicit formulae are derived with dimension p = 3: for uniformly and exponentially distributed step lengths, for fixed and for Poisson distributed number of steps. In these situations, the high accuracy of the saddlepoint approximations is illustrated by numerical comparisons with Monte Carlo simulation. Contribution to the "Topical Issue: Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.

  4. Network Location-Aware Service Recommendation with Random Walk in Cyber-Physical Systems

    PubMed Central

    Yin, Yuyu; Yu, Fangzheng; Xu, Yueshen; Yu, Lifeng; Mu, Jinglong

    2017-01-01

    Cyber-physical systems (CPS) have received much attention from both academia and industry. An increasing number of functions in CPS are provided in the way of services, which gives rise to an urgent task, that is, how to recommend the suitable services in a huge number of available services in CPS. In traditional service recommendation, collaborative filtering (CF) has been studied in academia, and used in industry. However, there exist several defects that limit the application of CF-based methods in CPS. One is that under the case of high data sparsity, CF-based methods are likely to generate inaccurate prediction results. In this paper, we discover that mining the potential similarity relations among users or services in CPS is really helpful to improve the prediction accuracy. Besides, most of traditional CF-based methods are only capable of using the service invocation records, but ignore the context information, such as network location, which is a typical context in CPS. In this paper, we propose a novel service recommendation method for CPS, which utilizes network location as context information and contains three prediction models using random walking. We conduct sufficient experiments on two real-world datasets, and the results demonstrate the effectiveness of our proposed methods and verify that the network location is indeed useful in QoS prediction. PMID:28885602

  5. Network Location-Aware Service Recommendation with Random Walk in Cyber-Physical Systems.

    PubMed

    Yin, Yuyu; Yu, Fangzheng; Xu, Yueshen; Yu, Lifeng; Mu, Jinglong

    2017-09-08

    Cyber-physical systems (CPS) have received much attention from both academia and industry. An increasing number of functions in CPS are provided in the way of services, which gives rise to an urgent task, that is, how to recommend the suitable services in a huge number of available services in CPS. In traditional service recommendation, collaborative filtering (CF) has been studied in academia, and used in industry. However, there exist several defects that limit the application of CF-based methods in CPS. One is that under the case of high data sparsity, CF-based methods are likely to generate inaccurate prediction results. In this paper, we discover that mining the potential similarity relations among users or services in CPS is really helpful to improve the prediction accuracy. Besides, most of traditional CF-based methods are only capable of using the service invocation records, but ignore the context information, such as network location, which is a typical context in CPS. In this paper, we propose a novel service recommendation method for CPS, which utilizes network location as context information and contains three prediction models using random walking. We conduct sufficient experiments on two real-world datasets, and the results demonstrate the effectiveness of our proposed methods and verify that the network location is indeed useful in QoS prediction.
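
    The abstract does not specify the three prediction models, so the sketch below only shows a generic random-walk-with-restart scoring step over a toy user-service invocation graph (graph, edges and parameters are all invented):

      import numpy as np

      def random_walk_with_restart(adj, start, restart=0.15, n_iter=100):
          """Visiting probabilities of a walk that jumps back to 'start'
          with probability 'restart'; rows of adj are out-links."""
          p = adj / adj.sum(axis=1, keepdims=True)     # row-stochastic transitions
          r = np.zeros(adj.shape[0]); r[start] = 1.0
          v = r.copy()
          for _ in range(n_iter):
              v = (1 - restart) * (p.T @ v) + restart * r
          return v

      # toy bipartite graph: users 0-2, services 3-5, edges = past invocations
      adj = np.zeros((6, 6))
      for u, s in [(0, 3), (0, 4), (1, 4), (1, 5), (2, 5)]:
          adj[u, s] = adj[s, u] = 1.0
      scores = random_walk_with_restart(adj, start=0)
      print("service scores for user 0:", scores[3:])   # rank unseen services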

  6. Inbreeding avoidance through non-random mating in sticklebacks

    PubMed Central

    Frommen, Joachim G; Bakker, Theo C.M

    2006-01-01

    Negative effects of inbreeding are well documented in a wide range of animal taxa. Hatching success and survival of inbred offspring is reduced in many species and inbred progeny are often less attractive to potential mates. Thus, individuals should avoid mating with close kin. However, experimental evidence for inbreeding avoidance through non-random mating in vertebrates is scarce. Here, we show that gravid female three-spined sticklebacks (Gasterosteus aculeatus) when given the choice between a courting familiar brother and a courting unfamiliar non-sib prefer to mate with the non-sib and thus avoid the disadvantages of incest. We controlled for differences in males' body size and red intensity of nuptial coloration. Thus, females adjust their courting behaviour to the risk of inbreeding. PMID:17148370

  7. Inbreeding avoidance through non-random mating in sticklebacks.

    PubMed

    Frommen, Joachim G; Bakker, Theo C M

    2006-06-22

    Negative effects of inbreeding are well documented in a wide range of animal taxa. Hatching success and survival of inbred offspring is reduced in many species and inbred progeny are often less attractive to potential mates. Thus, individuals should avoid mating with close kin. However, experimental evidence for inbreeding avoidance through non-random mating in vertebrates is scarce. Here, we show that gravid female three-spined sticklebacks (Gasterosteus aculeatus) when given the choice between a courting familiar brother and a courting unfamiliar non-sib prefer to mate with the non-sib and thus avoid the disadvantages of incest. We controlled for differences in males' body size and red intensity of nuptial coloration. Thus, females adjust their courting behaviour to the risk of inbreeding.

  8. Three-dimensional distribution of random velocity inhomogeneities at the Nankai trough seismogenic zone

    NASA Astrophysics Data System (ADS)

    Takahashi, T.; Obana, K.; Yamamoto, Y.; Nakanishi, A.; Kaiho, Y.; Kodaira, S.; Kaneda, Y.

    2012-12-01

    The Nankai trough in southwestern Japan is a convergent margin where the Philippine sea plate is subducted beneath the Eurasian plate. There are major faults segments of huge earthquakes that are called Tokai, Tonankai and Nankai earthquakes. According to the earthquake occurrence history over the past hundreds years, we must expect various rupture patters such as simultaneous or nearly continuous ruptures of plural fault segments. Japan Agency for Marine-Earth Science and Technology (JAMSTEC) conducted seismic surveys at Nankai trough in order to clarify mutual relations between seismic structures and fault segments, as a part of "Research concerning Interaction Between the Tokai, Tonankai and Nankai Earthquakes" funded by Ministry of Education, Culture, Sports, Science and Technology, Japan. This study evaluated the spatial distribution of random velocity inhomogeneities from Hyuga-nada to Kii-channel by using velocity seismograms of small and moderate sized earthquakes. Random velocity inhomogeneities are estimated by the peak delay time analysis of S-wave envelopes (e.g., Takahashi et al. 2009). Peak delay time is defined as the time lag from the S-wave onset to its maximal amplitude arrival. This quantity mainly reflects the accumulated multiple forward scattering effect due to random inhomogeneities, and is quite insensitive to the inelastic attenuation. Peak delay times are measured from the rms envelopes of horizontal components at 4-8Hz, 8-16Hz and 16-32Hz. This study used the velocity seismograms that are recorded by 495 ocean bottom seismographs and 378 onshore seismic stations. Onshore stations are composed of the F-net and Hi-net stations that are maintained by National Research Institute for Earth Science and Disaster Prevention (NIED) of Japan. It is assumed that the random inhomogeneities are represented by the von Karman type PSDF. Preliminary result of inversion analysis shows that spectral gradient of PSDF (i.e., scale dependence of

  9. Prediction future asset price which is non-concordant with the historical distribution

    NASA Astrophysics Data System (ADS)

    Seong, Ng Yew; Hin, Pooi Ah

    2015-12-01

    This paper attempts to predict the major characteristics of the future asset price which is non-concordant with the distribution estimated from the price today and the prices on a large number of previous days. The three major characteristics of the i-th non-concordant asset price are the length of the interval between the occurrence time of the previous non-concordant asset price and that of the present non-concordant asset price, the indicator which denotes that the non-concordant price is extremely small or large by its values -1 and 1 respectively, and the degree of non-concordance given by the negative logarithm of the probability of the left tail or right tail of which one of the end points is given by the observed future price. The vector of three major characteristics of the next non-concordant price is modelled to be dependent on the vectors corresponding to the present and l - 1 previous non-concordant prices via a 3-dimensional conditional distribution which is derived from a 3(l + 1)-dimensional power-normal mixture distribution. The marginal distribution for each of the three major characteristics can then be derived from the conditional distribution. The mean of the j-th marginal distribution is an estimate of the value of the j-th characteristics of the next non-concordant price. Meanwhile, the 100(α/2) % and 100(1 - α/2) % points of the j-th marginal distribution can be used to form a prediction interval for the j-th characteristic of the next non-concordant price. The performance measures of the above estimates and prediction intervals indicate that the fitted conditional distribution is satisfactory. Thus the incorporation of the distribution of the characteristics of the next non-concordant price in the model for asset price has a good potential of yielding a more realistic model.

  10. Graphene materials having randomly distributed two-dimensional structural defects

    DOEpatents

    Kung, Harold H; Zhao, Xin; Hayner, Cary M; Kung, Mayfair C

    2013-10-08

    Graphene-based storage materials for high-power battery applications are provided. The storage materials are composed of vertical stacks of graphene sheets and have reduced resistance for Li ion transport. This reduced resistance is achieved by incorporating a random distribution of structural defects into the stacked graphene sheets, whereby the structural defects facilitate the diffusion of Li ions into the interior of the storage materials.

  11. Graphene materials having randomly distributed two-dimensional structural defects

    DOEpatents

    Kung, Harold H.; Zhao, Xin; Hayner, Cary M.; Kung, Mayfair C.

    2016-05-31

    Graphene-based storage materials for high-power battery applications are provided. The storage materials are composed of vertical stacks of graphene sheets and have reduced resistance for Li ion transport. This reduced resistance is achieved by incorporating a random distribution of structural defects into the stacked graphene sheets, whereby the structural defects facilitate the diffusion of Li ions into the interior of the storage materials.

  12. Gravitational lensing by eigenvalue distributions of random matrix models

    NASA Astrophysics Data System (ADS)

    Martínez Alonso, Luis; Medina, Elena

    2018-05-01

    We propose to use eigenvalue densities of unitary random matrix ensembles as mass distributions in gravitational lensing. The corresponding lens equations reduce to algebraic equations in the complex plane which can be treated analytically. We prove that these models can be applied to describe lensing by systems of edge-on galaxies. We illustrate our analysis with the Gaussian and the quartic unitary matrix ensembles.

  13. On the distribution of a product of N Gaussian random variables

    NASA Astrophysics Data System (ADS)

    Stojanac, Željka; Suess, Daniel; Kliesch, Martin

    2017-08-01

    The product of Gaussian random variables appears naturally in many applications in probability theory and statistics. It has been known that the distribution of a product of N such variables can be expressed in terms of a Meijer G-function. Here, we compute a similar representation for the corresponding cumulative distribution function (CDF) and provide a power-log series expansion of the CDF based on the theory of the more general Fox H-functions. Numerical computations show that for small values of the argument the CDF of products of Gaussians is well approximated by the lowest orders of this expansion. Analogous results are also shown for the absolute value as well as the square of such products of N Gaussian random variables. For the latter two settings, we also compute the moment generating functions in terms of Meijer G-functions.
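
    The Meijer-G and Fox-H evaluations are the substance of the paper; a quick Monte Carlo estimate of the same CDF is easy to set up for comparison purposes:

      import numpy as np

      rng = np.random.default_rng(0)
      N = 4                                            # number of Gaussian factors
      prod = rng.standard_normal((1_000_000, N)).prod(axis=1)

      for z in (0.01, 0.1, 1.0):
          print(f"P(product <= {z}) ≈ {(prod <= z).mean():.4f}")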

  14. Stochastic Seismic Response of an Algiers Site with Random Depth to Bedrock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badaoui, M.; Mebarki, A.; Berrah, M. K.

    2010-05-21

    Among the important effects of the Boumerdes earthquake (Algeria, May 21st 2003) was that, within the same zone, the destructions in certain parts were more important than in others. This phenomenon is due to site effects which alter the characteristics of seismic motions and cause concentration of damage during earthquakes. Local site effects such as thickness and mechanical properties of soil layers have important effects on the surface ground motions. This paper deals with the effect of the randomness aspect of the depth to bedrock (soil layers heights) which is assumed to be a random variable with lognormal distribution. This distribution is suitable for strictly non-negative random variables with large values of the coefficient of variation. In this case, Monte Carlo simulations are combined with the stiffness matrix method, used herein as a deterministic method, for evaluating the effect of the depth to bedrock uncertainty on the seismic response of a multilayered soil. This study considers a P and SV wave propagation pattern using input accelerations collected at Keddara station, located at 20 km from the epicenter, as it is located directly on the bedrock. A parametric study is conducted to derive the stochastic behavior of the peak ground acceleration and its response spectrum, the transfer function and the amplification factors. It is found that the soil height heterogeneity causes a widening of the frequency content and an increase in the fundamental frequency of the soil profile, indicating that the resonance phenomenon concerns a larger number of structures.

  15. Stochastic Seismic Response of an Algiers Site with Random Depth to Bedrock

    NASA Astrophysics Data System (ADS)

    Badaoui, M.; Berrah, M. K.; Mébarki, A.

    2010-05-01

    Among the important effects of the Boumerdes earthquake (Algeria, May 21st 2003) was that, within the same zone, the destructions in certain parts were more important than in others. This phenomenon is due to site effects which alter the characteristics of seismic motions and cause concentration of damage during earthquakes. Local site effects such as thickness and mechanical properties of soil layers have important effects on the surface ground motions. This paper deals with the effect of the randomness aspect of the depth to bedrock (soil layers heights) which is assumed to be a random variable with lognormal distribution. This distribution is suitable for strictly non-negative random variables with large values of the coefficient of variation. In this case, Monte Carlo simulations are combined with the stiffness matrix method, used herein as a deterministic method, for evaluating the effect of the depth to bedrock uncertainty on the seismic response of a multilayered soil. This study considers a P and SV wave propagation pattern using input accelerations collected at Keddara station, located at 20 km from the epicenter, as it is located directly on the bedrock. A parametric study is conducted to derive the stochastic behavior of the peak ground acceleration and its response spectrum, the transfer function and the amplification factors. It is found that the soil height heterogeneity causes a widening of the frequency content and an increase in the fundamental frequency of the soil profile, indicating that the resonance phenomenon concerns a larger number of structures.
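
    A minimal sketch of the Monte Carlo step, assuming a single uniform layer so the quarter-wavelength estimate f0 = Vs/(4H) applies; the shear-wave velocity, mean depth and coefficient of variation below are illustrative values, not the site parameters of the study. Because 1/H is convex, a lognormal depth raises the mean fundamental frequency above its deterministic value, consistent with the reported increase.

      import numpy as np

      rng = np.random.default_rng(0)
      vs = 300.0                   # assumed shear-wave velocity of the layer, m/s
      mean_h, cov = 30.0, 0.4      # assumed mean depth to bedrock (m) and CoV

      # lognormal parameters from the mean and coefficient of variation
      sigma = np.sqrt(np.log(1 + cov ** 2))
      mu = np.log(mean_h) - 0.5 * sigma ** 2
      h = rng.lognormal(mu, sigma, size=100_000)

      f0 = vs / (4 * h)            # quarter-wavelength fundamental frequency
      print("deterministic f0:", vs / (4 * mean_h), "Hz")
      print("mean f0 with random depth:", f0.mean(), "Hz")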

  16. Spatial distribution of non volcanic tremors offshore eastern Taiwan

    NASA Astrophysics Data System (ADS)

    Xie, X. S.; Lin, J. Y.; Hsu, S. K.; Lee, C. H.; Liang, C. W.

    2012-04-01

    Non-volcanic tremor (NVT), originally identified in the subduction zone of the southwest Japan, have been well studied in the circum-Pacific subduction zones and the transform plate boundary in California. Most studies related NVT to the release of fluids, while some others associated them with slow-slip events, and can be triggered instantaneously by the surface waves of teleseismic events. Taiwan is located at a complex intersection of the Philippines Sea Plate and the Eurasian Plate. East of Taiwan, the Philippine Sea plate subducts northward beneath the Ryukyu arc. The major part of the island results from the strong convergence between the two plates and the convergent boundary is along the Longitudinal Valley. Moreover, an active strike-slip fault along the Taitung Canyon was reported in the offshore eastern Taiwan. In such complicate tectonic environments, NVT behavior could probably bring us more information about the interaction of all the geological components in the area. In this study, we analyze the seismic signals recorded by the Ocean bottom Seismometer (OBS) deployed offshore eastern Taiwan in September 2009. TAMS (Tremor Active Monitor System) software was used to detect the presence of NVT. 200 tremor-like signals were obtained from the 3 weeks recording period. We use the SSA (Source-Scanning Algorithm) to map the possible distribution of the tremor. In total, 180 tremors were located around the eastern offshore Taiwan. The tremors are mainly distributed in two source areas: one is along the Taitung Canyon, and the other is sub-parallel to the Ryukyu Trench, probably along the plate interface. Many tremors are located at depth shallower than 5 km, which suggests a possible existence of a weak basal detachment along the sea bottom. Other tremors with larger depth may be related to the dehydration of the subducting sea plate as suggested by the former studies. Limited by the short recording period of the OBS experiment, we could not obtain any

  17. The Influence of Emission Location on the Magnitude and Spatial Distribution of Aerosols' Climate Effects

    NASA Astrophysics Data System (ADS)

    Persad, G.; Caldeira, K.

    2017-12-01

    The global distribution of anthropogenic aerosol emissions has evolved continuously since the preindustrial era - from 20th century North American and Western European emissions hotspots to present-day South and East Asian ones. With this comes a relocation of the regional radiative, dynamical, and hydrological impacts of aerosol emissions, which may influence global climate differently depending on where they occur. A lack of understanding of this relationship between aerosol emissions' location and their global climate effects, however, obscures the potential influence that aerosols' evolving geographic distribution may have on global and regional climate change—a gap which we address in this work. Using a novel suite of experiments in the CESM CAM5 atmospheric general circulation model coupled to a slab ocean, we systematically test and analyze mechanisms behind the relative climate impact of identical black carbon and sulfate aerosol emissions located in each of 8 past, present, or projected future major emissions regions. Results indicate that historically high emissions regions, such as North America and Western Europe, produce a stronger cooling effect than current and projected future high emissions regions. Aerosol emissions located in Western Europe produce 3 times the global mean cooling (-0.34 °C) as those located in East Africa or India (-0.11 °C). The aerosols' in-situ radiative effects remain relatively confined near the emissions region, but large distal cooling results from remote feedback processes - such as ice albedo and cloud changes - that are excited more strongly by emissions from certain regions than others. Results suggest that aerosol emissions from different countries should not be considered equal in the context of climate mitigation accounting, and that the evolving geographic distribution of aerosol emissions may have a substantial impact on the magnitude and spatial distribution of global climate change.

  18. Non-Evolutionary Algorithms for Scheduling Dependent Tasks in Distributed Heterogeneous Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayne F. Boyer; Gurdeep S. Hura

    2005-09-01

    The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be an NP-hard problem. In a DHC system, task execution time is dependent on the machine to which it is assigned and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of inter-dependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory and fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
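
    The full RS heuristic, DAG model and communication costs are in the paper; the sketch below only shows the core idea of repeated randomized topological orderings followed by a greedy earliest-finish mapping, on an invented four-task, two-machine example:

      import random

      # toy DAG: successors, predecessors, and per-(task, machine) execution times
      succ = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
      pred = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}
      cost = {"a": [3, 2], "b": [4, 5], "c": [2, 4], "d": [3, 3]}

      def random_topological_order(rng):
          indeg = {t: len(pred[t]) for t in succ}
          ready = [t for t, d in indeg.items() if d == 0]
          order = []
          while ready:
              t = ready.pop(rng.randrange(len(ready)))   # random tie-breaking
              order.append(t)
              for s in succ[t]:
                  indeg[s] -= 1
                  if indeg[s] == 0:
                      ready.append(s)
          return order

      def greedy_makespan(order):
          machine_free, finish = [0.0, 0.0], {}
          for t in order:
              earliest = max((finish[p] for p in pred[t]), default=0.0)
              m = min(range(2), key=lambda i: max(machine_free[i], earliest) + cost[t][i])
              finish[t] = max(machine_free[m], earliest) + cost[t][m]
              machine_free[m] = finish[t]
          return max(finish.values())

      rng = random.Random(0)
      best = min(greedy_makespan(random_topological_order(rng)) for _ in range(200))
      print("best makespan found:", best)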

  19. Characterizing ISI and sub-threshold membrane potential distributions: Ensemble of IF neurons with random squared-noise intensity.

    PubMed

    Kumar, Sanjeev; Karmeshu

    2018-04-01

    A theoretical investigation is presented that characterizes the emerging sub-threshold membrane potential and inter-spike interval (ISI) distributions of an ensemble of IF neurons that group together and fire together. The squared-noise intensity σ² of the ensemble of neurons is treated as a random variable to account for the electrophysiological variations across a population of nearly identical neurons. Employing a superstatistical framework, both the ISI distribution and the sub-threshold membrane potential distribution of the neuronal ensemble are obtained in terms of the generalized K-distribution. The resulting distributions exhibit asymptotic behavior akin to the stretched exponential family. Extensive simulations of the underlying SDE with random σ² are carried out. The results are found to be in excellent agreement with the analytical results. The analysis has been extended to cover the case corresponding to independent random fluctuations in drift in addition to random squared-noise intensity. The novelty of the proposed analytical investigation for the ensemble of IF neurons is that it yields closed-form expressions of the probability distributions in terms of the generalized K-distribution. Based on a record of spiking activity of thousands of neurons, the findings of the proposed model are validated. The squared-noise intensity σ² of identified neurons from the data is found to follow a gamma distribution. The proposed generalized K-distribution is found to be in excellent agreement with the empirically obtained ISI distribution of the neuronal ensemble. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Telemedicine Provides Non-Inferior Research Informed Consent for Remote Study Enrollment: A Randomized Controlled Trial

    PubMed Central

    Bobb, Morgan R.; Van Heukelom, Paul G.; Faine, Brett A.; Ahmed, Azeemuddin; Messerly, Jeffrey T.; Bell, Gregory; Harland, Karisa K.; Simon, Christian; Mohr, Nicholas M.

    2016-01-01

    Objective Telemedicine networks are beginning to provide an avenue for conducting emergency medicine research, but using telemedicine to recruit participants for clinical trials has not been validated. The goal of this consent study is to determine whether patient comprehension of telemedicine-enabled research informed consent is non-inferior to standard face-to-face research informed consent. Methods A prospective, open-label randomized controlled trial was performed in a 60,000-visit Midwestern academic Emergency Department (ED) to test whether telemedicine-enabled research informed consent provided non-inferior comprehension compared with standard consent. This study was conducted as part of a parent clinical trial evaluating the effectiveness of oral chlorhexidine gluconate 0.12% in preventing hospital-acquired pneumonia among adult ED patients with expected hospital admission. Prior to being recruited into the study, potential participants were randomized in a 1:1 allocation ratio to consent by telemedicine versus standard face-to-face consent. Telemedicine connectivity was provided using a commercially available interface (REACH platform, Vidyo Inc., Hackensack, NJ) to an emergency physician located in another part of the ED. Comprehension of research consent (primary outcome) was measured using the modified Quality of Informed Consent (QuIC) instrument, a validated tool for measuring research informed consent comprehension. Parent trial accrual rate and qualitative survey data were secondary outcomes. Results One-hundred thirty-one patients were randomized (n = 64, telemedicine), and 101 QuIC surveys were completed. Comprehension of research informed consent using telemedicine was not inferior to face-to-face consent (QuIC scores 74.4 ± 8.1 vs. 74.4 ± 6.9 on a 100-point scale, p = 0.999). Subjective understanding of consent (p=0.194) and parent trial study accrual rates (56% vs. 69%, p = 0.142) were similar. Conclusion Telemedicine is non-inferior to face

  1. A Randomized Clinical Trial Comparing Methotrexate and Mycophenolate Mofetil for Non-Infectious Uveitis

    PubMed Central

    Rathinam, Sivakumar R; Babu, Manohar; Thundikandy, Radhika; Kanakath, Anuradha; Nardone, Natalie; Esterberg, Elizabeth; Lee, Salena M; Enanoria, Wayne TA; Porco, Travis C; Browne, Erica N; Weinrib, Rachel; Acharya, Nisha R

    2014-01-01

    Objective To compare the relative effectiveness of methotrexate and mycophenolate mofetil for non-infectious intermediate uveitis, posterior uveitis, or panuveitis. Design Multicenter, block-randomized, observer-masked clinical trial Participants Eighty patients with non-infectious intermediate, posterior or panuveitis requiring corticosteroid-sparing therapy at Aravind Eye Hospitals in Madurai and Coimbatore, India. Intervention Patients were randomized to receive 25mg weekly oral methotrexate or 1g twice daily oral mycophenolate mofetil and were monitored monthly for 6 months. Oral prednisone and topical corticosteroids were tapered. Main Outcome Measures Masked examiners assessed the primary outcome of treatment success, defined by achieving the following at 5 and 6 months: (1) ≤0.5+ anterior chamber cells, ≤0.5+ vitreous cells, ≤0.5+ vitreous haze and no active retinal/choroidal lesions in both eyes, (2) ≤ 10 mg of prednisone and ≤ 2 drops of prednisolone acetate 1% a day and (3) no declaration of treatment failure due to intolerability or safety. Additional outcomes included time to sustained corticosteroid-sparing control of inflammation, change in best spectacle-corrected visual acuity, resolution of macular edema, adverse events, subgroup analysis by anatomic location, and medication adherence. Results Forty-one patients were randomized to methotrexate and 39 to mycophenolate mofetil. A total of 67 patients (35 methotrexate, 32 mycophenolate mofetil) contributed to the primary outcome. Sixty-nine percent of patients achieved treatment success with methotrexate and 47% with mycophenolate mofetil (p=0.09). Treatment failure due to adverse events or tolerability was not significantly different by treatment arm (p=0.99). There were no statistically significant differences between treatment groups in time to corticosteroid-sparing control of inflammation (p=0.44), change in best spectacle-corrected visual acuity (p=0.68), and resolution of macular

  2. Schmallenberg virus non-structural protein NSm: Intracellular distribution and role of non-hydrophobic domains.

    PubMed

    Kraatz, Franziska; Wernike, Kerstin; Reiche, Sven; Aebischer, Andrea; Reimann, Ilona; Beer, Martin

    2018-03-01

    Schmallenberg virus (SBV) induces fetal malformation, abortions and stillbirth in ruminants. While the non-structural protein NSs is a major virulence factor, the biological function of NSm, the second non-structural protein which consists of three hydrophobic transmembrane (I, III, V) and two non-hydrophobic regions (II, IV), is still unknown. Here, a series of NSm mutants displaying deletions of nearly the entire NSm or of the non-hydrophobic domains was generated and the intracellular distribution of NSm was assessed. SBV-NSm is dispensable for the generation of infectious virus and mutants lacking domains II - V showed growth properties similar to the wild-type virus. In addition, a comparable intracellular distribution of SBV-NSm was observed in mammalian cells infected with domain II mutants or wild-type virus. In both cases, NSm co-localized with the glycoprotein Gc in the Golgi compartment. However, domain IV-deletion mutants showed an altered distribution pattern and no co-localization of NSm and Gc. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Spatial distribution of acaricide profiles (Boophilus microplus strains susceptible or resistant to acaricides) in southeastern Mexico.

    PubMed

    Rodríguez-Vivas, R I; Rivas, A L; Chowell, G; Fragoso, S H; Rosario, C R; García, Z; Smith, S D; Williams, J J; Schwager, S J

    2007-05-15

    The ability of Boophilus microplus strains to be susceptible (-) or resistant (+) to amidines (Am), synthetic pyrethroids (SP), and/or organo-phosphates (OP) (or acaricide profiles) was investigated in 217 southeastern Mexican cattle ranches (located in the states of Yucatán, Quintana Roo, and Tabasco). Three questions were asked: (1) whether acaricide profiles varied at random and, if not, which one(s) explained more (or less) cases than expected, (2) whether the spatial distribution of acaricide profiles was randomly or non-randomly distributed, and (3) whether acaricide profiles were associated with farm-related covariates (frequency of annual treatments, herd size, and farm size). Three acaricide profiles explained 73.6% of the data, representing at least twice as many cases as expected (P<0.001): (1) Am-SP-, (2) Am+SP+, and (3) (among ranches that dispensed acaricides > or = 6 times/year) Am-OP+SP+. Because ticks collected in Yucatán ranches tended to be susceptible to Am, those of Quintana Roo ranches displayed, predominantly, resistance to OP/SP, and Tabasco ticks tended to be resistant to Am (all with P < or = 0.05), acaricide profiles appeared to be non-randomly disseminated over space. Across states, two farm-related covariates were associated with resistance (P < or = 0.02): (1) high annual frequency of acaricide treatments, and (2) large farm size. Findings supported the hypothesis that spatial acaricide profiles followed neither random nor homogeneous data distributions, being partially explained by agent- and/or farm-specific factors. Some profiles could not be explained by these factors. Further spatially explicit studies (addressing host-related factors) are recommended.

  4. Corrected Mean-Field Model for Random Sequential Adsorption on Random Geometric Graphs

    NASA Astrophysics Data System (ADS)

    Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur

    2018-03-01

    A notorious problem in mathematics and physics is to create a solvable model for random sequential adsorption of non-overlapping congruent spheres in the d-dimensional Euclidean space with d ≥ 2. Spheres arrive sequentially at uniformly chosen locations in space and are accepted only when there is no overlap with previously deposited spheres. Due to spatial correlations, characterizing the fraction of accepted spheres remains largely intractable. We study this fraction by taking a novel approach that compares random sequential adsorption in Euclidean space to the nearest-neighbor blocking on a sequence of clustered random graphs. This random network model can be thought of as a corrected mean-field model for the interaction graph between the attempted spheres. Using functional limit theorems, we characterize the fraction of accepted spheres and its fluctuations.
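
    To make the quantity studied here concrete, below is a minimal Monte Carlo sketch (not the record's corrected mean-field model) that deposits disks sequentially at uniform random locations in a unit square and reports the fraction of attempted disks that are accepted. The radius and number of attempts are arbitrary illustrative choices, and boundary effects are ignored.

      import numpy as np

      rng = np.random.default_rng(0)

      radius = 0.02        # disk radius in a unit square (a d = 2 toy case)
      n_attempts = 5000

      accepted = np.empty((0, 2))
      for _ in range(n_attempts):
          p = rng.random(2)                                   # uniform arrival location
          # deposit the disk only if it overlaps no previously accepted disk
          if accepted.size == 0 or np.min(np.hypot(*(accepted - p).T)) >= 2 * radius:
              accepted = np.vstack([accepted, p])

      print(f"attempted {n_attempts}, accepted {len(accepted)}, "
            f"fraction accepted = {len(accepted) / n_attempts:.3f}")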

  5. Scalable and fault tolerant orthogonalization based on randomized distributed data aggregation

    PubMed Central

    Gansterer, Wilfried N.; Niederbrucker, Gerhard; Straková, Hana; Schulze Grotthoff, Stefan

    2013-01-01

    The construction of distributed algorithms for matrix computations built on top of distributed data aggregation algorithms with randomized communication schedules is investigated. For this purpose, a new aggregation algorithm for summing or averaging distributed values, the push-flow algorithm, is developed, which achieves superior resilience properties with respect to failures compared to existing aggregation methods. It is illustrated that on a hypercube topology it asymptotically requires the same number of iterations as the optimal all-to-all reduction operation and that it scales well with the number of nodes. Orthogonalization is studied as a prototypical matrix computation task. A new fault tolerant distributed orthogonalization method rdmGS, which can produce accurate results even in the presence of node failures, is built on top of distributed data aggregation algorithms. PMID:24748902

  6. A Note on the Assumption of Identical Distributions for Nonparametric Tests of Location

    ERIC Educational Resources Information Center

    Nordstokke, David W.; Colp, S. Mitchell

    2018-01-01

    Often, when testing for shift in location, researchers will utilize nonparametric statistical tests in place of their parametric counterparts when there is evidence or belief that the assumptions of the parametric test are not met (i.e., normally distributed dependent variables). An underlying and often unattended to assumption of nonparametric…

  7. The German Environmental Survey for Children (GerES IV): reference values and distributions for time-location patterns of German children.

    PubMed

    Conrad, André; Seiwert, Margarete; Hünken, Andreas; Quarcoo, David; Schlaud, Martin; Groneberg, David

    2013-01-01

    Children's time-location patterns are important determinants of environmental exposure and other health-relevant factors. Building on data of the German Environmental Survey for Children (GerES IV), our study aimed at deriving reference values and distributions for time-location patterns of 3-14-year-old German children. We also investigated whether GerES IV data are appropriate for evaluating associations with children's health determinants by linking them to data of the National Health Interview and Examination Survey for Children and Adolescents (KiGGS). Parents reported on the time their children usually spend at home, in other indoor environments, and outdoors. This information was characterized by statistical parameters, which were also calculated for different strata concerning socio-demography and the residential environment. Consequently, group differences were evaluated by t-tests and univariate ANOVA. Reference distributions were fitted to the time-location data by a Maximum Likelihood approach to make them also usable in probabilistic exposure modeling. Finally, associations between data on the children's physical activity as well as body weight and their outdoor time were investigated by bivariate correlation analysis and cross tabulation. On average per day, German children spend 15 h and 31 min at home, 4 h and 46 min in other indoor environments, and 3 h and 43 min outdoors. Time spent at home and outdoors decreases with age while time spent in other indoor environments increases. Differences in time-location patterns were also observed for socio-economic status (SES) and immigration status. For example, children with a high SES spend 24 min less outdoors than low SES children. Immigrant children spend, on average, 20 min more per day at home and 15 min less outdoors than non-immigrant children. Outdoor time was associated with parameters of the residential environment like the building development. Children living in 1- or 2-family houses spend more time outdoors than
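
    As an illustration of the Maximum Likelihood fitting step described above, here is a minimal sketch that fits a candidate reference distribution to synthetic outdoor-time data and then samples from it; the gamma family, the parameter values, and the data themselves are hypothetical stand-ins, since the GerES IV data are not reproduced here.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)

      # Hypothetical daily outdoor times in hours (illustrative only).
      outdoor_hours = rng.gamma(shape=4.0, scale=0.9, size=500)

      # Maximum-likelihood fit of a candidate reference distribution.
      shape, loc, scale = stats.gamma.fit(outdoor_hours, floc=0.0)
      print(f"fitted gamma: shape={shape:.2f}, scale={scale:.2f}")
      print("mean of fitted distribution (h):", round(shape * scale, 2))

      # Such a fitted distribution can then be sampled in probabilistic exposure models.
      simulated = stats.gamma.rvs(shape, loc=0.0, scale=scale, size=10, random_state=rng)
      print("simulated outdoor times (h):", np.round(simulated, 2))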

  8. SEM method for direct visual tracking of nanoscale morphological changes of platinum based electrocatalysts on fixed locations upon electrochemical or thermal treatments.

    PubMed

    Zorko, Milena; Jozinović, Barbara; Bele, Marjan; Hodnik, Nejc; Gaberšček, Miran

    2014-05-01

    A general method for tracking morphological surface changes on a nanometer scale with scanning electron microscopy (SEM) is introduced. We exemplify the usefulness of the method by showing consecutive SEM images of an identical location before and after the electrochemical and thermal treatments of platinum-based nanoparticles deposited on a high surface area carbon. Observations provide insight into platinum-based catalyst degradation occurring during potential cycling treatment. The presence of chloride clearly increases the rate of degradation. Under these conditions the dominant degradation mechanism appears to be platinum dissolution, with some subsequent redeposition on top of the catalyst film. By contrast, at a temperature of 60°C under potentiostatic conditions, some carbon corrosion and particle aggregation were observed. Temperature treatment simulating the annealing step of the synthesis reveals sintering of small platinum-based composite aggregates into uniform spherical particles. The method provides direct proof of induced surface phenomena occurring at a chosen location, without the statistical uncertainty of the usual random SEM observations across relatively large surface areas. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Character Journaling through Social Networks: Exemplifying Tenets of the New Literacy Studies

    ERIC Educational Resources Information Center

    White, John Wesley; Hungerford-Kresser, Holly

    2014-01-01

    Countering reactionary attempts to ban social media from schools is a strong research based rationale for bringing social media into the literacy classroom. When used as a medium to explore literature--or more specifically for interactive character journaling--this medium exemplifies how meaning is created by individuals' interactions with…

  10. A random phased-array for MR-guided transcranial ultrasound neuromodulation in non-human primates

    NASA Astrophysics Data System (ADS)

    Chaplin, Vandiver; Phipps, Marshal A.; Caskey, Charles F.

    2018-05-01

    Transcranial focused ultrasound (FUS) is a non-invasive technique for therapy and study of brain neural activation. Here we report on the design and characterization of a new MR-guided FUS transducer for neuromodulation in non-human primates at 650 kHz. The array is randomized with 128 elements 6.6 mm in diameter, radius of curvature 7.2 cm, opening diameter 10.3 cm (focal ratio 0.7), and 46% coverage. Simulations were used to optimize transducer geometry with respect to focus size, grating lobes, and directivity. Focus size and grating lobes during electronic steering were quantified using hydrophone measurements in water and a three-axis stage. A novel combination of optical tracking and acoustic mapping enabled measurement of the 3D pressure distribution in the cortical region of an ex vivo skull to within ~3.5 mm of the surface, and allowed accurate modelling of the experiment via non-homogeneous 3D acoustic simulations. The data demonstrates acoustic focusing beyond the skull bone, with the focus slightly broadened and shifted proximal to the skull. The fabricated design is capable of targeting regions within the S1 sensorimotor cortex of macaques.

  11. A random phased-array for MR-guided transcranial ultrasound neuromodulation in non-human primates.

    PubMed

    Chaplin, Vandiver; Phipps, Marshal A; Caskey, Charles F

    2018-05-17

    Transcranial focused ultrasound (FUS) is a non-invasive technique for therapy and study of brain neural activation. Here we report on the design and characterization of a new MR-guided FUS transducer for neuromodulation in non-human primates at 650 kHz. The array is randomized with 128 elements 6.6 mm in diameter, radius of curvature 7.2 cm, opening diameter 10.3 cm (focal ratio 0.7), and 46% coverage. Simulations were used to optimize transducer geometry with respect to focus size, grating lobes, and directivity. Focus size and grating lobes during electronic steering were quantified using hydrophone measurements in water and a three-axis stage. A novel combination of optical tracking and acoustic mapping enabled measurement of the 3D pressure distribution in the cortical region of an ex vivo skull to within ~3.5 mm of the surface, and allowed accurate modelling of the experiment via non-homogeneous 3D acoustic simulations. The data demonstrates acoustic focusing beyond the skull bone, with the focus slightly broadened and shifted proximal to the skull. The fabricated design is capable of targeting regions within the S1 sensorimotor cortex of macaques.

  12. Three-Phase AC Optimal Power Flow Based Distribution Locational Marginal Price: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Rui; Zhang, Yingchen

    2017-05-17

    Designing market mechanisms for electricity distribution systems has been a hot topic due to the increased presence of smart loads and distributed energy resources (DERs) in distribution systems. The distribution locational marginal pricing (DLMP) methodology is one of the real-time pricing methods to enable such market mechanisms and provide economic incentives to active market participants. Determining the DLMP is challenging due to high power losses, the voltage volatility, and the phase imbalance in distribution systems. Existing DC Optimal Power Flow (OPF) approaches are unable to model power losses and the reactive power, while single-phase AC OPF methods cannot capture the phase imbalance. To address these challenges, in this paper, a three-phase AC OPF based approach is developed to define and calculate DLMP accurately. The DLMP is modeled as the marginal cost to serve an incremental unit of demand at a specific phase at a certain bus, and is calculated using the Lagrange multipliers in the three-phase AC OPF formulation. Extensive case studies have been conducted to understand the impact of system losses and the phase imbalance on DLMPs as well as the potential benefits of flexible resources.

  13. Linear velocity fields in non-Gaussian models for large-scale structure

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1992-01-01

    Linear velocity fields in two types of physically motivated non-Gaussian models are examined for large-scale structure: seed models, in which the density field is a convolution of a density profile with a distribution of points, and local non-Gaussian fields, derived from a local nonlinear transformation on a Gaussian field. The distribution of a single component of the velocity is derived for seed models with randomly distributed seeds, and these results are applied to the seeded hot dark matter model and the global texture model with cold dark matter. An expression for the distribution of a single component of the velocity in arbitrary local non-Gaussian models is given, and these results are applied to such fields with chi-squared and lognormal distributions. It is shown that all seed models with randomly distributed seeds and all local non-Gaussian models have single-component velocity distributions with positive kurtosis.

  14. Probability evolution method for exit location distribution

    NASA Astrophysics Data System (ADS)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes exponentially large time as the noise approaches zero, and the majority of that time is wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise, while it may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.

  15. Accreditation status and geographic location of outpatient vascular testing facilities among Medicare beneficiaries: the VALUE (Vascular Accreditation, Location & Utilization Evaluation) study.

    PubMed

    Rundek, Tatjana; Brown, Scott C; Wang, Kefeng; Dong, Chuanhui; Farrell, Mary Beth; Heller, Gary V; Gornik, Heather L; Hutchisson, Marge; Needleman, Laurence; Benenati, James F; Jaff, Michael R; Meier, George H; Perese, Susana; Bendick, Phillip; Hamburg, Naomi M; Lohr, Joann M; LaPerna, Lucy; Leers, Steven A; Lilly, Michael P; Tegeler, Charles; Alexandrov, Andrei V; Katanick, Sandra L

    2014-10-01

    There is limited information on the accreditation status and geographic distribution of vascular testing facilities in the US. The Centers for Medicare & Medicaid Services (CMS) provide reimbursement to facilities regardless of accreditation status. The aims were to: (1) identify the proportion of Intersocietal Accreditation Commission (IAC) accredited vascular testing facilities in a 5% random national sample of Medicare beneficiaries receiving outpatient vascular testing services; (2) describe the geographic distribution of these facilities. The VALUE (Vascular Accreditation, Location & Utilization Evaluation) Study examines the proportion of IAC accredited facilities providing vascular testing procedures nationally, and the geographic distribution and utilization of these facilities. The data set containing all facilities that billed Medicare for outpatient vascular testing services in 2011 (5% CMS Outpatient Limited Data Set (LDS) file) was examined, and locations of outpatient vascular testing facilities were obtained from the 2011 CMS/Medicare Provider of Services (POS) file. Of 13,462 total vascular testing facilities billing Medicare for vascular testing procedures in a 5% random Outpatient LDS for the US in 2011, 13% (n=1730) of facilities were IAC accredited. The percentage of IAC accredited vascular testing facilities in the LDS file varied significantly by US region, p<0.0001: 26%, 12%, 11%, and 7% for the Northeast, South, Midwest, and Western regions, respectively. Findings suggest that the proportion of outpatient vascular testing facilities that are IAC accredited is low and varies by region. Increasing the number of accredited vascular testing facilities to improve test quality is a hypothesis that should be tested in future research. © The Author(s) 2014.

  16. Non-random species loss in a forest herbaceous layer following nitrogen addition

    Treesearch

    Christopher A. ​Walter; Mary Beth Adams; Frank S. Gilliam; William T. Peterjohn

    2017-01-01

    Nitrogen (N) additions have decreased species richness (S) in hardwood forest herbaceous layers, yet the functional mechanisms for these decreases have not been explicitly evaluated. We tested two hypothesized mechanisms, random species loss (RSL) and non-random species loss (NRSL), in the hardwood forest herbaceous layer of a long-term, plot-scale...

  17. Consistent and powerful non-Euclidean graph-based change-point test with applications to segmenting random interfered video data.

    PubMed

    Shi, Xiaoping; Wu, Yuehua; Rao, Calyampudi Radhakrishna

    2018-06-05

    The change-point detection has been carried out in terms of the Euclidean minimum spanning tree (MST) and shortest Hamiltonian path (SHP), with successful applications in the determination of authorship of a classic novel, the detection of change in a network over time, the detection of cell divisions, etc. However, these Euclidean graph-based tests may fail if a dataset contains random interferences. To solve this problem, we present a powerful non-Euclidean SHP-based test, which is consistent and distribution-free. The simulation shows that the test is more powerful than both Euclidean MST- and SHP-based tests and the non-Euclidean MST-based test. Its applicability in detecting both landing and departure times in video data of bees' flower visits is illustrated.

  18. Location tests for biomarker studies: a comparison using simulations for the two-sample case.

    PubMed

    Scheinhardt, M O; Ziegler, A

    2013-01-01

    Gene, protein, or metabolite expression levels are often non-normally distributed, heavy-tailed and contain outliers. Standard statistical approaches may fail as location tests in this situation. In three Monte-Carlo simulation studies, we aimed at comparing the type I error levels and empirical power of standard location tests and three adaptive tests [O'Gorman, Can J Stat 1997; 25: 269-279; Keselman et al., Brit J Math Stat Psychol 2007; 60: 267-293; Szymczak et al., Stat Med 2013; 32: 524-537] for a wide range of distributions. We simulated two-sample scenarios using the g-and-k-distribution family to systematically vary tail length and skewness with identical and varying variability between groups. All tests kept the type I error level when groups did not vary in their variability. The standard non-parametric U-test performed well in all simulated scenarios. It was outperformed by the two non-parametric adaptive methods in case of heavy tails or large skewness. Most tests did not keep the type I error level for skewed data in the case of heterogeneous variances. The standard U-test was a powerful and robust location test for most of the simulated scenarios except for very heavy-tailed or heavily skewed data, and it is thus to be recommended except for these cases. The non-parametric adaptive tests were powerful for both normal and non-normal distributions under sample variance homogeneity. But when sample variances differed, they did not keep the type I error level. The parametric adaptive test lacks power for skewed and heavy-tailed distributions.
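
    In the same spirit (though much smaller than the three simulation studies described above), the sketch below compares empirical rejection rates of the t-test and the Mann-Whitney U-test for normal and skewed (lognormal) samples; the sample sizes, shift values, and the lognormal choice are illustrative assumptions, not the g-and-k scenarios of the record.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)

      def rejection_rate(sampler, shift, n=30, reps=2000, alpha=0.05):
          """Fraction of simulated two-sample data sets in which each test rejects."""
          t_rej = u_rej = 0
          for _ in range(reps):
              x = sampler(n)
              y = sampler(n) + shift
              t_rej += stats.ttest_ind(x, y).pvalue < alpha
              u_rej += stats.mannwhitneyu(x, y, alternative="two-sided").pvalue < alpha
          return t_rej / reps, u_rej / reps

      # shift = 0 gives the empirical type I error; shift > 0 gives power.
      for name, sampler in [("normal", rng.standard_normal),
                            ("lognormal", lambda n: rng.lognormal(0.0, 1.0, n))]:
          for shift in (0.0, 0.5):
              t_p, u_p = rejection_rate(sampler, shift)
              print(f"{name:9s} shift={shift}: t-test {t_p:.3f}  U-test {u_p:.3f}")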

  19. Variation in size frequency distribution of coral populations under different fishing pressures in two contrasting locations in the Indian Ocean.

    PubMed

    Grimsditch, G; Pisapia, C; Huck, M; Karisa, J; Obura, D; Sweet, M

    2017-10-01

    This study aimed to assess how the size-frequency distributions of coral genera varied between reefs under different fishing pressures in two contrasting Indian Ocean locations (the Maldives and East Africa). Using generalized linear mixed models, we were able to demonstrate that complex interactions occurred between coral genera, coral size class and fishing pressure. In both locations, we found Acropora coral species to be more abundant in non-fished compared to fished sites (a pattern which was consistent for nearly all the assessed size classes). Coral genera classified as 'stress tolerant' showed a contrasting pattern i.e. were higher in abundance in fished compared to non-fished sites. Site specific variations were also observed. For example, Maldivian reefs exhibited a significantly higher abundance in all size classes of 'competitive' corals compared to East Africa. This possibly indicates that East African reefs have already been subjected to higher levels of stress and are therefore less suitable environments for 'competitive' corals. This study also highlights the potential structure and composition of reefs under future degradation scenarios, for example with a loss of Acropora corals and an increase in dominance of 'stress tolerant' and 'generalist' coral genera. Copyright © 2017. Published by Elsevier Ltd.

  20. Cascaded Raman lasing in a PM phosphosilicate fiber with random distributed feedback

    NASA Astrophysics Data System (ADS)

    Lobach, Ivan A.; Kablukov, Sergey I.; Babin, Sergey A.

    2018-02-01

    We report on the first demonstration of a linearly polarized cascaded Raman fiber laser based on a simple half-open cavity with a broadband composite reflector and random distributed feedback in a polarization maintaining phosphosilicate fiber operating beyond the zero dispersion wavelength (~1400 nm). With increasing pump power from a Yb-doped fiber laser at 1080 nm, the random laser subsequently generates 8 W at 1262 nm and 9 W at 1515 nm with a polarization extinction ratio of 27 dB. The generation linewidths amount to about 1 nm and 3 nm, respectively, being almost independent of power, in correspondence with the theory of cascaded random lasing.

  1. Analysis of quantitative data obtained from toxicity studies showing non-normal distribution.

    PubMed

    Kobayashi, Katsumi

    2005-05-01

    The data obtained from toxicity studies are examined for homogeneity of variance but usually not for normal distribution. In this study I examined the measured items of a carcinogenicity/chronic toxicity study in rats for both homogeneity of variance and normal distribution. It was observed that many hematology and biochemistry items showed a non-normal distribution. For testing the normal distribution of data obtained from toxicity studies, the data of the concurrent control group may be examined, and for data that show a non-normal distribution, robust non-parametric tests may be applied.

  2. Root location in random trees: a polarity property of all sampling consistent phylogenetic models except one.

    PubMed

    Steel, Mike

    2012-10-01

    Neutral macroevolutionary models, such as the Yule model, give rise to a probability distribution on the set of discrete rooted binary trees over a given leaf set. Such models can provide a signal as to the approximate location of the root when only the unrooted phylogenetic tree is known, and this signal becomes relatively more significant as the number of leaves grows. In this short note, we show that among models that treat all taxa equally, and are sampling consistent (i.e. the distribution on trees is not affected by taxa yet to be included), all such models, except one (the so-called PDA model), convey some information as to the location of the ancestral root in an unrooted tree. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Factors influencing non-native tree species distribution in urban landscapes

    Treesearch

    Wayne C. Zipperer

    2010-01-01

    Non-native species are presumed to be pervasive across the urban landscape, yet we know very little about their actual distribution. For this study, vegetation plot data from Syracuse, NY and Baltimore, MD were used to examine non-native tree species distribution in urban landscapes. Data were collected from remnant and emergent forest patches on upland sites...

  4. Peculiarities of intracranial arachnoid cysts: location, sidedness, and sex distribution in 126 consecutive patients.

    PubMed

    Wester, K

    1999-10-01

    To study the distribution of intracranial arachnoid cysts in a large and nonbiased patient population. One hundred twenty-six patients with 132 arachnoid cysts were studied. Patients were consecutively referred to our department during a 10-year period from a well-defined geographical area with a stable population. The cysts had a strong predilection for the middle cranial fossa; 86 patients (65.2%) had cysts in this location. Of 106 cysts with clearly unilateral distribution, 64 were located on the left side and 42 on the right side. This significant difference resulted solely from the marked preponderance of middle fossa cysts for the left (left-to-right ratio, 2.1:1). There were significantly more males than females (92 males/34 females). This difference was exclusively due to male preponderance of unilateral middle fossa cysts (66 males/14 females; ratio, 4.7:1). For all other cyst locations, there was no difference between the two sexes (26 males/20 females) or the two sides (10 left, 16 right). The marked left-sidedness for middle fossa cysts was found only in males. Females had an even distribution between the two sides. Arachnoid cysts have a strong predilection for the middle cranial fossa that may be explained by a meningeal maldevelopment theory: the arachnoid coverings of the temporal and frontal lobes fail to merge when the sylvian fissure is formed in early fetal life, thereby creating a noncommunicating fluid compartment entirely surrounded by arachnoid membranes. Why males develop more middle fossa cysts on the left side remains a mystery.

  5. Kanerva's sparse distributed memory with multiple hamming thresholds

    NASA Technical Reports Server (NTRS)

    Pohja, Seppo; Kaski, Kimmo

    1992-01-01

    If the stored input patterns of Kanerva's Sparse Distributed Memory (SDM) are highly correlated, utilization of the storage capacity is very low compared to the case of uniformly distributed random input patterns. We consider a variation of SDM that has better storage capacity utilization for correlated input patterns. This approach uses a separate selection threshold for each physical storage address, or hard location. The selection of the hard locations for reading or writing can be done in parallel, which SDM implementations can benefit from.
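
    A minimal sketch of the thresholding idea (one Hamming threshold per hard location in an SDM-style memory) follows; the address length, number of hard locations, and threshold range are arbitrary illustrative values, and the read/write rules are the standard counter-based SDM, not any specific implementation from the record.

      import numpy as np

      rng = np.random.default_rng(42)

      n_bits     = 256          # address/word length
      n_hard     = 2000         # number of hard locations
      hard_addr  = rng.integers(0, 2, size=(n_hard, n_bits))     # random hard-location addresses
      thresholds = rng.integers(100, 115, size=n_hard)            # per-location Hamming thresholds (hypothetical)
      counters   = np.zeros((n_hard, n_bits), dtype=int)          # storage counters

      def select(addr):
          # hard locations whose Hamming distance to `addr` is within their own threshold
          dist = np.count_nonzero(hard_addr != addr, axis=1)
          return dist <= thresholds

      def write(addr, word):
          counters[select(addr)] += 2 * word - 1                  # bipolar update (+1 for 1, -1 for 0)

      def read(addr):
          return (counters[select(addr)].sum(axis=0) >= 0).astype(int)

      # store a random pattern autoassociatively and recall it from a noisy cue
      pattern = rng.integers(0, 2, size=n_bits)
      write(pattern, pattern)
      noisy = pattern.copy()
      noisy[rng.choice(n_bits, size=10, replace=False)] ^= 1
      print("bits recovered:", np.count_nonzero(read(noisy) == pattern), "/", n_bits)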

  6. A model for distribution centers location-routing problem on a multimodal transportation network with a meta-heuristic solving approach

    NASA Astrophysics Data System (ADS)

    Fazayeli, Saeed; Eydi, Alireza; Kamalabadi, Isa Nakhai

    2017-07-01

    Nowadays, organizations have to compete with different competitors at regional, national and international levels, so they have to improve their competitive capabilities to survive. Undertaking activities on a global scale requires a proper distribution system which can take advantage of different transportation modes. Accordingly, the present paper addresses a location-routing problem on a multimodal transportation network. The introduced problem pursues four objectives simultaneously, which form the main contribution of the paper: determining multimodal routes between the supplier and distribution centers, locating mode-changing facilities, locating distribution centers, and determining product delivery tours from the distribution centers to retailers. An integer linear programming model is presented for the problem, and a genetic algorithm with a new chromosome structure is proposed to solve it. The proposed chromosome structure consists of two different parts for the multimodal transportation and location-routing parts of the model. Based on published data in the literature, two numerical cases of different sizes were generated and solved. Also, different cost scenarios were designed to better analyze model and algorithm performance. Results show that the algorithm can effectively solve large-size problems within a reasonable time, for which GAMS software failed to reach an optimal solution even within much longer times.

  8. Some design issues of strata-matched non-randomized studies with survival outcomes.

    PubMed

    Mazumdar, Madhu; Tu, Donsheng; Zhou, Xi Kathy

    2006-12-15

    Non-randomized studies for the evaluation of a medical intervention are useful for quantitative hypothesis generation before the initiation of a randomized trial and also when randomized clinical trials are difficult to conduct. A strata-matched non-randomized design is often utilized where subjects treated by a test intervention are matched to a fixed number of subjects treated by a standard intervention within covariate based strata. In this paper, we consider the issue of sample size calculation for this design. Based on the asymptotic formula for the power of a stratified log-rank test, we derive a formula to calculate the minimum number of subjects in the test intervention group that is required to detect a given relative risk between the test and standard interventions. When this minimum number of subjects in the test intervention group is available, an equation is also derived to find the multiple that determines the number of subjects in the standard intervention group within each stratum. The methodology developed is applied to two illustrative examples in gastric cancer and sarcoma.

  9. School-located Influenza Vaccinations for Adolescents: A Randomized Controlled Trial.

    PubMed

    Szilagyi, Peter G; Schaffer, Stanley; Rand, Cynthia M; Goldstein, Nicolas P N; Vincelli, Phyllis; Hightower, A Dirk; Younge, Mary; Eagan, Ashley; Blumkin, Aaron; Albertin, Christina S; DiBitetto, Kristine; Yoo, Byung-Kwang; Humiston, Sharon G

    2018-02-01

    We aimed to evaluate the effect of school-located influenza vaccination (SLIV) on adolescents' influenza vaccination rates. In 2015-2016, we performed a cluster-randomized trial of adolescent SLIV in middle/high schools. We selected 10 pairs of schools (identical grades within pairs) and randomly allocated schools within pairs to SLIV or usual care control. At eight suburban SLIV schools, we sent parents e-mail notifications about upcoming SLIV clinics and promoted online immunization consent. At two urban SLIV schools, we sent parents (via student backpack fliers) paper immunization consent forms and information about SLIV. E-mails were unavailable at these schools. Local health department nurses administered nasal or injectable influenza vaccine at dedicated SLIV clinics and billed insurers. We compared influenza vaccination rates at SLIV versus control schools using school directories to identify the student sample in each school. We used the state immunization registry to determine receipt of influenza vaccination. The final sample comprised 17,650 students enrolled in the 20 schools. Adolescents at suburban SLIV schools had higher overall influenza vaccination rates than did adolescents at control schools (51% vs. 46%, p < .001; adjusted odds ratio = 1.27, 95% confidence interval 1.18-1.38, controlling for vaccination during the prior two seasons). No effect of SLIV was noted among urban schools on multivariate analysis. SLIV did not substitute for vaccinations in primary care or other settings; in suburban settings, SLIV was associated with increased vaccinations in primary care or other settings (adjusted odds ratio = 1.10, 95% confidence interval 1.02-1.19). SLIV in this community increased influenza vaccination rates among adolescents attending suburban schools. Copyright © 2018. Published by Elsevier Inc.

  10. Hydraulic head estimation at unobserved locations: Approximating the distribution of the absolute error based on geologic interpretations

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Kaleris, Vassilios; Xeygeni, Vagia; Magkou, Foteini

    2017-04-01

    Assessing the availability of groundwater reserves at a regional level requires accurate and robust hydraulic head estimation at multiple locations of an aquifer. To that end, one needs groundwater observation networks that can provide sufficient information to estimate the hydraulic head at unobserved locations. The density of such networks is largely influenced by the spatial distribution of the hydraulic conductivity in the aquifer, and it is usually determined through trial-and-error, by solving the groundwater flow based on a properly selected set of alternative but physically plausible geologic structures. In this work, we use (a) dimensional analysis and (b) a pulse-based stochastic model for simulation of synthetic aquifer structures to calculate the distribution of the absolute error in hydraulic head estimation as a function of the standardized distance from the nearest measuring locations. The resulting distributions are proved to encompass all possible small-scale structural dependencies, exhibiting characteristics (bounds, multi-modal features etc.) that can be explained using simple geometric arguments. The obtained results are promising, pointing toward establishing design criteria based on large-scale geologic maps.

  11. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
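
    For readers working in Python rather than Fortran, the same kinds of evaluations can be obtained from scipy.stats and numpy. The snippet below is only a rough modern analogue of the report's routines, not the USGS code itself, and the parameter values are arbitrary examples.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1983)

      # CDF / quantile evaluations analogous to the distributions listed in the report.
      print("beta  P(X<=0.4 | a=2, b=5)   :", stats.beta.cdf(0.4, 2, 5))
      print("chi-square 95th pct, 10 dof  :", stats.chi2.ppf(0.95, 10))
      print("gamma P(X<=3 | shape=2)      :", stats.gamma.cdf(3.0, 2))
      print("normal P(X<=1.96)            :", stats.norm.cdf(1.96))
      print("Pearson III P(X<=2 | skew=1) :", stats.pearson3.cdf(2.0, skew=1.0))
      print("Weibull P(X<=1 | c=1.5)      :", stats.weibull_min.cdf(1.0, 1.5))
      print("Student t 97.5th pct, 20 dof :", stats.t.ppf(0.975, 20))
      print("F 95th pct (3, 30 dof)       :", stats.f.ppf(0.95, 3, 30))

      # Uniform and normal random numbers, e.g. to feed other generators.
      print("uniform:", np.round(rng.random(5), 3))
      print("normal :", np.round(rng.standard_normal(5), 3))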

  12. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions and exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  13. Does Mass Azithromycin Distribution Impact Child Growth and Nutrition in Niger? A Cluster-Randomized Trial

    PubMed Central

    Amza, Abdou; Yu, Sun N.; Kadri, Boubacar; Nassirou, Baido; Stoller, Nicole E.; Zhou, Zhaoxia; West, Sheila K.; Bailey, Robin L.; Gaynor, Bruce D.; Keenan, Jeremy D.; Porco, Travis C.; Lietman, Thomas M.

    2014-01-01

    Background Antibiotic use in animals is associated with improved growth regardless of whether there is clinical evidence of infectious disease. Antibiotics used for trachoma control may have the unintended benefit of improving child growth. Methodology In this sub-study of a larger randomized controlled trial, we assess anthropometry of pre-school children in a community-randomized trial of mass oral azithromycin distributions for trachoma in Niger. We measured height, weight, and mid-upper arm circumference (MUAC) in 12 communities randomized to receive annual mass azithromycin treatment of everyone versus 12 communities randomized to receive biannual mass azithromycin treatments for children, 3 years after the initial mass treatment. We collected measurements in 1,034 children aged 6–60 months. Principal Findings We found no difference in the prevalence of wasting among children in the 12 annually treated communities that received three mass azithromycin distributions compared to the 12 biannually treated communities that received six mass azithromycin distributions (odds ratio = 0.88, 95% confidence interval = 0.53 to 1.49). Conclusions/Significance We were unable to demonstrate a statistically significant difference in stunting, underweight, and low MUAC of pre-school children in communities randomized to annual mass azithromycin treatment or biannual mass azithromycin treatment. The role of antibiotics on child growth and nutrition remains unclear, but larger studies and longitudinal trials may help determine any association. PMID:25210836

  14. Optimal random search for a single hidden target.

    PubMed

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
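
    The square-root rule can be checked numerically. The sketch below is a simplification under the stated "must get very close" assumption: it averages the expected number of independent Gaussian search points needed to land within a small tolerance of a Gaussian-distributed target. The tolerance, sample size, and the three candidate search widths are arbitrary illustrative choices, not values from the record.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)

      sigma   = 1.0        # std of the Gaussian target distribution
      eps     = 0.01       # searcher must land within eps of the target to find it
      targets = rng.normal(0.0, sigma, 100_000)

      def mean_trials(search_std):
          """Expected number of independent search points until one lands within eps
          of the hidden target, averaged over random target positions (the trial
          count is geometric, so its conditional mean is 1 / p_hit)."""
          p_hit = (stats.norm.cdf(targets + eps, scale=search_std)
                   - stats.norm.cdf(targets - eps, scale=search_std))
          return (1.0 / p_hit).mean()

      # sqrt of a N(0, sigma^2) density is itself proportional to a N(0, 2 sigma^2)
      # density, so the square-root rule corresponds to a search std of sigma*sqrt(2).
      for label, s in [("slightly wider than target", 1.1 * sigma),
                       ("square-root rule          ", np.sqrt(2) * sigma),
                       ("much wider than target    ", 4.0 * sigma)]:
          print(f"{label}  search std = {s:.2f}  mean trials ~ {mean_trials(s):.0f}")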

  15. A Random Forest Approach to Predict the Spatial Distribution ...

    EPA Pesticide Factsheets

    Modeling the magnitude and distribution of sediment-bound pollutants in estuaries is often limited by incomplete knowledge of the site and inadequate sample density. To address these modeling limitations, a decision-support tool framework was conceived that predicts sediment contamination from the sub-estuary to broader estuary extent. For this study, a Random Forest (RF) model was implemented to predict the distribution of a model contaminant, triclosan (5-chloro-2-(2,4-dichlorophenoxy)phenol) (TCS), in Narragansett Bay, Rhode Island, USA. TCS is an unregulated contaminant used in many personal care products. The RF explanatory variables were associated with TCS transport and fate (proxies) and direct and indirect environmental entry. The continuous RF TCS concentration predictions were discretized into three levels of contamination (low, medium, and high) for three different quantile thresholds. The RF model explained 63% of the variance with a minimum number of variables. Total organic carbon (TOC) (transport and fate proxy) was a strong predictor of TCS contamination causing a mean squared error increase of 59% when compared to permutations of randomized values of TOC. Additionally, combined sewer overflow discharge (environmental entry) and sand (transport and fate proxy) were strong predictors. The discretization models identified a TCS area of greatest concern in the northern reach of Narragansett Bay (Providence River sub-estuary), which was validated wi
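
    A minimal sketch of the modeling pattern described here (Random Forest regression followed by quantile-based discretization into low/medium/high classes) is given below; the synthetic variables toc, cso, and sand merely stand in for the record's explanatory variables, and the response is fabricated for illustration rather than Narragansett Bay data.

      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)

      # Synthetic stand-ins for explanatory variables (values are made up).
      n = 400
      X = pd.DataFrame({
          "toc":  rng.uniform(0.1, 8.0, n),     # total organic carbon, %
          "cso":  rng.uniform(0.0, 1.0, n),     # scaled CSO discharge proxy
          "sand": rng.uniform(0.0, 100.0, n),   # sand fraction, %
      })
      # Hypothetical response: sediment contaminant level loosely tied to TOC and CSO.
      y = 5.0 * X["toc"] + 10.0 * X["cso"] - 0.02 * X["sand"] + rng.normal(0, 1.5, n)

      model = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
      model.fit(X, y)
      print("OOB R^2:", round(model.oob_score_, 2))
      print("importances:", dict(zip(X.columns, model.feature_importances_.round(2))))

      # Discretize continuous predictions into low / medium / high via quantile thresholds.
      pred = model.predict(X)
      cuts = np.quantile(pred, [0.33, 0.66])
      levels = np.digitize(pred, cuts)          # 0 = low, 1 = medium, 2 = high
      print("class counts (low, medium, high):", np.bincount(levels))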

  16. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting.

    PubMed

    Husen, Mohd Nizam; Lee, Sukhan

    2016-11-11

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics here represent the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides a 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown to be 40% finer, with an execution time more than an order of magnitude faster than that of the conventional methods. These results are also backed up by theoretical analysis.
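
    To make the fingerprinting step concrete, here is a minimal sketch in which per-location RSS means and standard deviations play the role of the reference pattern classes and a Gaussian log-likelihood serves as the "support" score; the number of access points, locations, scans, and the noise level are made-up illustrative values, and this is not the paper's actual classifier.

      import numpy as np

      rng = np.random.default_rng(4)

      n_aps, n_locs, n_scans = 6, 4, 200

      # Calibration: per-location mean/std of RSS from each Wi-Fi source
      # (standing in for the "invariant RSS statistics" used as reference classes).
      true_means = rng.uniform(-85, -40, size=(n_locs, n_aps))
      calib = true_means[:, None, :] + rng.normal(0, 3, size=(n_locs, n_scans, n_aps))
      ref_mu = calib.mean(axis=1)
      ref_sd = calib.std(axis=1)

      def fingerprint(rss):
          """Pick the calibration location whose reference statistics best support
          the observed RSS vector (Gaussian log-likelihood as the support score)."""
          ll = -0.5 * (((rss - ref_mu) / ref_sd) ** 2 + 2 * np.log(ref_sd)).sum(axis=1)
          return int(np.argmax(ll))

      # A spontaneous scan taken at location 2 under fresh channel disturbance.
      scan = true_means[2] + rng.normal(0, 3, size=n_aps)
      print("estimated location:", fingerprint(scan))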

  17. On the Response of a Nonlinear Structure to High Kurtosis Non-Gaussian Random Loadings

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam; Turner, Travis L.

    2011-01-01

    This paper is a follow-on to recent work by the authors in which the response and high-cycle fatigue of a nonlinear structure subject to non-Gaussian loadings was found to vary markedly depending on the nature of the loading. There it was found that a non-Gaussian loading having a steady rate of short-duration, high-excursion peaks produced essentially the same response as would have been incurred by a Gaussian loading. In contrast, a non-Gaussian loading having the same kurtosis, but with bursts of high-excursion peaks was found to elicit a much greater response. This work is meant to answer the question of when consideration of a loading probability distribution other than Gaussian is important. The approach entailed nonlinear numerical simulation of a beam structure under Gaussian and non-Gaussian random excitations. Whether the structure responded in a Gaussian or non-Gaussian manner was determined by adherence to, or violations of, the Central Limit Theorem. Over a practical range of damping, it was found that the linear response to a non-Gaussian loading was Gaussian when the period of the system impulse response is much greater than the rate of peaks in the loading. Lower damping reduced the kurtosis, but only when the linear response was non-Gaussian. In the nonlinear regime, the response was found to be non-Gaussian for all loadings. The effect of a spring-hardening type of nonlinearity was found to limit extreme values and thereby lower the kurtosis relative to the linear response regime. In this case, lower damping gave rise to greater nonlinearity, resulting in lower kurtosis than a higher level of damping.

  18. Atypical epigenetic mark in an atypical location: cytosine methylation at asymmetric (CNN) sites within the body of a non-repetitive tomato gene.

    PubMed

    González, Rodrigo M; Ricardi, Martiniano M; Iusem, Norberto D

    2011-05-20

    Eukaryotic DNA methylation is one of the most studied epigenetic processes, as it results in a direct and heritable covalent modification triggered by external stimuli. In contrast to mammals, plant DNA methylation, which is stimulated by external cues exemplified by various abiotic types of stress, is often found not only at CG sites but also at CNG (N denoting A, C or T) and CNN (asymmetric) sites. A genome-wide analysis of DNA methylation in Arabidopsis has shown that CNN methylation is preferentially concentrated in transposon genes and non-coding repetitive elements. We are particularly interested in investigating the epigenetics of plant species with larger and more complex genomes than Arabidopsis, particularly with regards to the associated alterations elicited by abiotic stress. We describe the existence of CNN-methylated epialleles that span Asr1, a non-transposon, protein-coding gene from tomato plants that lacks an orthologous counterpart in Arabidopsis. In addition, to test the hypothesis of a link between epigenetics modifications and the adaptation of crop plants to abiotic stress, we exhaustively explored the cytosine methylation status in leaf Asr1 DNA, a model gene in our system, resulting from water-deficit stress conditions imposed on tomato plants. We found that drought conditions brought about removal of methyl marks at approximately 75 of the 110 asymmetric (CNN) sites analysed, concomitantly with a decrease of the repressive H3K27me3 epigenetic mark and a large induction of expression at the RNA level. When pinpointing those sites, we observed that demethylation occurred mostly in the intronic region. These results demonstrate a novel genomic distribution of CNN methylation, namely in the transcribed region of a protein-coding, non-repetitive gene, and the changes in those epigenetic marks that are caused by water stress. These findings may represent a general mechanism for the acquisition of new epialleles in somatic cells, which are

  19. Quantum tunneling recombination in a system of randomly distributed trapped electrons and positive ions.

    PubMed

    Pagonis, Vasilis; Kulp, Christopher; Chaney, Charity-Grace; Tachiya, M

    2017-09-13

    During the past 10 years, quantum tunneling has been established as one of the dominant mechanisms for recombination in random distributions of electrons and positive ions, and in many dosimetric materials. Specifically quantum tunneling has been shown to be closely associated with two important effects in luminescence materials, namely long term afterglow luminescence and anomalous fading. Two of the common assumptions of quantum tunneling models based on random distributions of electrons and positive ions are: (a) An electron tunnels from a donor to the nearest acceptor, and (b) the concentration of electrons is much lower than that of positive ions at all times during the tunneling process. This paper presents theoretical studies for arbitrary relative concentrations of electrons and positive ions in the solid. Two new differential equations are derived which describe the loss of charge in the solid by tunneling, and they are solved analytically. The analytical solution compares well with the results of Monte Carlo simulations carried out in a random distribution of electrons and positive ions. Possible experimental implications of the model are discussed for tunneling phenomena in long term afterglow signals, and also for anomalous fading studies in feldspars and apatite samples.
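
    Assumption (a) above (tunneling to the nearest acceptor) lends itself to a quick Monte Carlo illustration. The sketch below draws nearest-ion distances for a random (Poisson) distribution of acceptors, assigns each electron a tunneling rate s*exp(-r/a), and prints the surviving fraction versus time; the frequency factor, attenuation length, and ion density are hypothetical values, not parameters from the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      # Illustrative (hypothetical) parameters.
      s   = 3.0e15          # frequency factor, 1/s
      a   = 3.0e-8          # tunneling attenuation length, cm
      rho = 1.0e17          # density of positive ions (acceptors), cm^-3
      n_e = 200_000         # trapped electrons in the sample

      # Nearest-acceptor distances for a Poisson distribution of ions:
      # CDF F(r) = 1 - exp(-(4/3) pi rho r^3), inverted with u ~ U(0,1).
      u = rng.random(n_e)
      r = (-np.log(1.0 - u) * 3.0 / (4.0 * np.pi * rho)) ** (1.0 / 3.0)

      # Each electron tunnels to its nearest ion with rate s * exp(-r/a).
      rates = s * np.exp(-r / a)

      times = np.logspace(-2, 8, 11)                      # seconds
      for t in times:
          remaining = np.exp(-rates * t).mean()           # surviving fraction of electrons
          print(f"t = {t:9.2e} s   remaining fraction = {remaining:.3f}")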

  20. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    ERIC Educational Resources Information Center

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
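
    A minimal version of such a randomization-based ANOVA activity is sketched below: the observed F statistic is compared against F statistics recomputed after shuffling group labels, which also makes the sampling distribution of F visible. The three-group data and group sizes are invented for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)

      # Hypothetical data for three groups (e.g., three teaching methods).
      groups = [rng.normal(10, 2, 15), rng.normal(11, 2, 15), rng.normal(13, 2, 15)]
      labels = np.repeat([0, 1, 2], 15)
      values = np.concatenate(groups)

      obs_F = stats.f_oneway(*groups).statistic

      # Randomization: shuffle the group labels and recompute F many times.
      n_shuffles = 5000
      perm_F = np.empty(n_shuffles)
      for i in range(n_shuffles):
          shuffled = rng.permutation(labels)
          perm_F[i] = stats.f_oneway(*(values[shuffled == g] for g in (0, 1, 2))).statistic

      p_value = (perm_F >= obs_F).mean()
      print(f"observed F = {obs_F:.2f}, randomization p-value = {p_value:.4f}")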

  1. Reconstructing Spatial Distributions from Anonymized Locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.

  2. Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation

    NASA Astrophysics Data System (ADS)

    Li, C.

    2012-07-01

    POS, integrating GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have systematic error, it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. Vanishing points are solved and their error distributions estimated based on an iterative method with variable weights, the co-factor matrix, and error ellipse theory. Thirdly, given the known error ellipses of the two vanishing points (VX, VY) and the triangular geometric relationship of the three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for vanishing point coordinates and their error distributions are shown and analyzed.

  3. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
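
    The report's original FORTRAN routine is not reproduced here, but a standard conditional-decomposition construction for correlated normal pairs can be sketched as follows; it is shown only as an assumed illustration of the kind of generator described, and the function and parameter names are my own.

      import numpy as np

      def bivariate_normal_pairs(n, mu1, mu2, sd1, sd2, rho, rng=None):
          """Generate n (x, y) pairs from a bivariate normal with the given
          means, standard deviations, and correlation coefficient rho."""
          rng = rng or np.random.default_rng()
          z1 = rng.standard_normal(n)
          z2 = rng.standard_normal(n)
          x = mu1 + sd1 * z1
          y = mu2 + sd2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
          return x, y

      x, y = bivariate_normal_pairs(100_000, 1.0, -2.0, 2.0, 0.5, rho=0.7,
                                    rng=np.random.default_rng(5))
      print("sample means:", round(x.mean(), 3), round(y.mean(), 3))
      print("sample correlation:", round(np.corrcoef(x, y)[0, 1], 3))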

  4. Significance of Random Bladder Biopsies in Non-Muscle Invasive Bladder Cancer

    PubMed Central

    Kumano, Masafumi; Miyake, Hideaki; Nakano, Yuzo; Fujisawa, Masato

    2013-01-01

    Background/Aims To evaluate retrospectively the clinical outcome of random bladder biopsies in patients with non-muscle invasive bladder cancer (NMIBC) undergoing transurethral resection (TUR). Patients and Method This study included 234 consecutive patients with NMIBC who underwent random biopsies from normal-appearing urothelium of the bladder, including the anterior wall, posterior wall, right wall, left wall, dome, trigone and/or prostatic urethra, during TUR. Result Thirty-seven patients (15.8%) were diagnosed by random biopsies as having urothelial cancer. Among several factors available prior to TUR, preoperative urinary cytology appeared to be independently related to the detection of urothelial cancer in random biopsies on multivariate analysis. Urinary cytology prior to TUR gave 50.0% sensitivity, 91.7% specificity, 56.8% positive predictive value and 89.3% negative predictive value for predicting the findings of the random biopsies. Conclusion Biopsies of normal-appearing urothelium resulted in the additional detection of urothelial cancer in a definite proportion of NMIBC patients, and it remains difficult to find a reliable alternative to random biopsies. Collectively, these findings suggest that it would be beneficial to perform random biopsies as part of the routine management of NMIBC. PMID:24917759

  5. Topology determines force distributions in one-dimensional random spring networks.

    PubMed

    Heidemann, Knut M; Sageman-Furnas, Andrew O; Sharma, Abhinav; Rehfeldt, Florian; Schmidt, Christoph F; Wardetzky, Max

    2018-02-01

    Networks of elastic fibers are ubiquitous in biological systems and often provide mechanical stability to cells and tissues. Fiber-reinforced materials are also common in technology. An important characteristic of such materials is their resistance to failure under load. Rupture occurs when fibers break under excessive force and when that failure propagates. Therefore, it is crucial to understand force distributions. Force distributions within such networks are typically highly inhomogeneous and are not well understood. Here we construct a simple one-dimensional model system with periodic boundary conditions by randomly placing linear springs on a circle. We consider ensembles of such networks that consist of N nodes and have an average degree of connectivity z but vary in topology. Using a graph-theoretical approach that accounts for the full topology of each network in the ensemble, we show that, surprisingly, the force distributions can be fully characterized in terms of the parameters (N,z). Despite the universal properties of such (N,z) ensembles, our analysis further reveals that a classical mean-field approach fails to capture force distributions correctly. We demonstrate that network topology is a crucial determinant of force distributions in elastic spring networks.

  6. Topology determines force distributions in one-dimensional random spring networks

    NASA Astrophysics Data System (ADS)

    Heidemann, Knut M.; Sageman-Furnas, Andrew O.; Sharma, Abhinav; Rehfeldt, Florian; Schmidt, Christoph F.; Wardetzky, Max

    2018-02-01

    Networks of elastic fibers are ubiquitous in biological systems and often provide mechanical stability to cells and tissues. Fiber-reinforced materials are also common in technology. An important characteristic of such materials is their resistance to failure under load. Rupture occurs when fibers break under excessive force and when that failure propagates. Therefore, it is crucial to understand force distributions. Force distributions within such networks are typically highly inhomogeneous and are not well understood. Here we construct a simple one-dimensional model system with periodic boundary conditions by randomly placing linear springs on a circle. We consider ensembles of such networks that consist of N nodes and have an average degree of connectivity z but vary in topology. Using a graph-theoretical approach that accounts for the full topology of each network in the ensemble, we show that, surprisingly, the force distributions can be fully characterized in terms of the parameters (N, z). Despite the universal properties of such (N, z) ensembles, our analysis further reveals that a classical mean-field approach fails to capture force distributions correctly. We demonstrate that network topology is a crucial determinant of force distributions in elastic spring networks.

  7. High Voltage Distribution System (HVDS) as a better system compared to Low Voltage Distribution System (LVDS) applied at Medan city power network

    NASA Astrophysics Data System (ADS)

    Dinzi, R.; Hamonangan, TS; Fahmi, F.

    2018-02-01

    In the current distribution system, a large-capacity distribution transformer supplies loads at remote locations. The use of the 220/380 V network is nowadays less common compared to the 20 kV network. This results in losses due to a non-optimally placed distribution transformer that neglects load location, a poor consumer voltage profile, and large power losses along the line. This paper discusses how a high voltage distribution system (HVDS) can be a better system for distribution networks than the currently used distribution system (Low Voltage Distribution System, LVDS). The proposed change to the new configuration is made by replacing a large-capacity distribution transformer with several smaller-capacity distribution transformers installed in positions closest to the loads. The use of a high voltage distribution system results in better voltage profiles and lower power losses. From the non-technical side, the annual savings and payback period of the high voltage distribution system are also an advantage.
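
    The main technical argument can be illustrated with a back-of-the-envelope loss comparison (the voltage levels, delivered power and conductor resistance below are assumptions for illustration, not figures from the paper): for a fixed power delivered over the same conductor, the current scales as 1/V, so the I²R line loss drops by roughly (V_LV/V_HV)² when the run is carried at 20 kV instead of 400 V.

```python
# Illustrative I^2*R comparison; all numbers are assumptions, not from the paper.
P = 100e3      # 100 kW delivered
R = 0.5        # ohms of conductor resistance over the run (assumed)

def line_loss(power, voltage, resistance):
    """Approximate I^2*R loss, ignoring power factor and reactance."""
    current = power / voltage
    return current ** 2 * resistance

for V in (400.0, 20e3):
    loss = line_loss(P, V, R)
    print(f"{V/1e3:g} kV: loss ~ {loss/1e3:.4f} kW ({loss/P:.3%} of delivered power)")
# The ratio of the two losses is (20000/400)^2 = 2500, which is why carrying the
# medium-voltage network closer to the loads cuts losses so sharply.
```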

  8. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise

    NASA Astrophysics Data System (ADS)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
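
    For readers unfamiliar with α-stable noise, the snippet below (an illustration only, not the authors' simulation code; the stability index is an arbitrary choice) draws symmetric Lévy-stable increments and contrasts their tail behaviour with Gaussian increments:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

alpha = 1.5   # stability index in (0, 2]; alpha = 2 recovers the Gaussian case
levy_steps = levy_stable.rvs(alpha, beta=0.0, size=100_000, random_state=rng)
gauss_steps = rng.standard_normal(100_000)

# Heavy tails: large excursions are far more frequent for the stable noise.
for name, steps in (("levy", levy_steps), ("gauss", gauss_steps)):
    print(name, "fraction of |step| > 5:", np.mean(np.abs(steps) > 5))
```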

  9. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise.

    PubMed

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.

  10. One program, multiple training sites: does site of family medicine training influence professional practice location?

    PubMed

    Jamieson, Jean L; Kernahan, Jill; Calam, Betty; Sivertz, Kristin S

    2013-01-01

    Numerous strategies have been suggested to increase recruitment of family physicians to rural communities and smaller regional centers. One approach has been to implement distributed postgraduate education programs where trainees spend substantial time in such communities. The purpose of the current study was to compare the eventual practice location of family physicians who undertook their postgraduate training through a single university but who were based in either metropolitan or distributed, non-metropolitan communities. Since 1998, the Department of Family Practice at the University of British Columbia in Canada has conducted an annual survey of its residents at 2, 5, and 10 years after completion of training. The authors received Ethics Board approval to use this anonymized data to identify personal and educational factors that predict future practice location. The overall response rate was 45%. At 2 years (N=222), residents trained in distributed sites were 15 times more likely to enter practice in rural communities, small towns and regional centers than those who trained in metropolitan teaching centers. This was even more predictive for retention in non-urban practice sites. Among the subgroup of physicians who remained in a single practice location for more than a year preceding the survey, those who trained in smaller sites were 36 times more likely to choose a rural or regional practice setting. While the vast majority of those trained in metropolitan sites chose an urban practice location, a subgroup of those with some rural upbringing were more likely to practice in rural or regional settings. Trainees from distributed sites considered themselves more prepared for practice regardless of ultimate practice location. Participation in a distributed postgraduate family medicine training site is an important predictor of a non-urban practice location. This effect persists for 10 years after completion of training and is independent of other predictors of

  11. Sensor Location Problem Optimization for Traffic Network with Different Spatial Distributions of Traffic Information.

    PubMed

    Bao, Xu; Li, Haijian; Qin, Lingqiao; Xu, Dongwei; Ran, Bin; Rong, Jian

    2016-10-27

    To obtain adequate traffic information, the density of traffic sensors should be sufficiently high to cover the entire transportation network. However, deploying sensors densely over the entire network may not be realistic for practical applications due to the budgetary constraints of traffic management agencies. This paper describes several possible spatial distributions of traffic information credibility and proposes corresponding different sensor information credibility functions to describe these spatial distribution properties. A maximum benefit model and its simplified model are proposed to solve the traffic sensor location problem. The relationships between the benefit and the number of sensors are formulated with different sensor information credibility functions. Next, the models are extended and algorithms with analytic results are developed. For each case, the maximum benefit, the optimal number and spacing of sensors are obtained and the analytic formulations of the optimal sensor locations are derived as well. Finally, a numerical example is proposed to verify the validity and availability of the proposed models for solving a network sensor location problem. The results show that the optimal number of sensors of segments with different model parameters in an entire freeway network can be calculated. Besides, it can also be concluded that the optimal sensor spacing is independent of end restrictions but dependent on the values of model parameters that represent the physical conditions of sensors and roads.

  12. Sensor Location Problem Optimization for Traffic Network with Different Spatial Distributions of Traffic Information

    PubMed Central

    Bao, Xu; Li, Haijian; Qin, Lingqiao; Xu, Dongwei; Ran, Bin; Rong, Jian

    2016-01-01

    To obtain adequate traffic information, the density of traffic sensors should be sufficiently high to cover the entire transportation network. However, deploying sensors densely over the entire network may not be realistic for practical applications due to the budgetary constraints of traffic management agencies. This paper describes several possible spatial distributions of traffic information credibility and proposes corresponding different sensor information credibility functions to describe these spatial distribution properties. A maximum benefit model and its simplified model are proposed to solve the traffic sensor location problem. The relationships between the benefit and the number of sensors are formulated with different sensor information credibility functions. Next, the models are extended and algorithms with analytic results are developed. For each case, the maximum benefit, the optimal number and spacing of sensors are obtained and the analytic formulations of the optimal sensor locations are derived as well. Finally, a numerical example is proposed to verify the validity and availability of the proposed models for solving a network sensor location problem. The results show that the optimal number of sensors of segments with different model parameters in an entire freeway network can be calculated. Besides, it can also be concluded that the optimal sensor spacing is independent of end restrictions but dependent on the values of model parameters that represent the physical conditions of sensors and roads. PMID:27801794

  13. Liquid water breakthrough location distances on a gas diffusion layer of polymer electrolyte membrane fuel cells

    NASA Astrophysics Data System (ADS)

    Yu, Junliang; Froning, Dieter; Reimer, Uwe; Lehnert, Werner

    2018-06-01

    The lattice Boltzmann method is adopted to simulate the three-dimensional dynamic process of liquid water breaking through the gas diffusion layer (GDL) in the polymer electrolyte membrane fuel cell. 22 micro-structures of Toray GDL are built based on a stochastic geometry model. It is found that more than one breakthrough location is formed randomly on the GDL surface. Breakthrough location distances (BLD) are analyzed statistically in two ways. The distribution is evaluated statistically by the Lilliefors test. It is concluded that the BLD can be described by the normal distribution with certain statistical characteristics. Information on the shortest-neighbor breakthrough location distance can serve as input for modeling setups in cell-scale simulations in the field of fuel cell simulation.
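
    The normality check mentioned in the record can be reproduced in outline with the Lilliefors test as implemented in statsmodels (a sketch on synthetic data; the actual breakthrough-location distances and their units are not given in the record):

```python
import numpy as np
from statsmodels.stats.diagnostic import lilliefors

rng = np.random.default_rng(1)

# Synthetic stand-in for breakthrough location distances on the GDL surface.
distances = rng.normal(loc=40.0, scale=8.0, size=200)

# Lilliefors test: H0 is that the sample is normal with unspecified mean and variance.
ks_stat, p_value = lilliefors(distances, dist="norm")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
# A large p-value means normality is not rejected, consistent with the record's conclusion.
```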

  14. Distribution of G concurrence of random pure states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cappellini, Valerio; Sommers, Hans-Juergen; Zyczkowski, Karol

    2006-12-15

    The average entanglement of random pure states of an N×N composite system is analyzed. We compute the average value of the determinant D of the reduced state, which forms an entanglement monotone. Calculating higher moments of the determinant, we characterize the probability distribution P(D). Similar results are obtained for the rescaled Nth root of the determinant, called the G concurrence. We show that in the limit N→∞ this quantity becomes concentrated at a single point G* = 1/e. The position of the concentration point changes if one considers an arbitrary N×K bipartite system, in the joint limit N, K→∞, with K/N fixed.
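
    The quoted concentration value can be checked numerically (a sketch, not the authors' derivation; N and the sample count are arbitrary choices): draw random pure states of an N×N system, form the reduced density matrix, and evaluate the rescaled Nth root of its determinant.

```python
import numpy as np

def g_concurrence_sample(N, rng):
    """G concurrence of one random pure state of an N x N bipartite system."""
    # Complex Gaussian amplitudes arranged as an N x N matrix M, normalised so <psi|psi> = 1.
    M = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    M /= np.linalg.norm(M)
    rho = M @ M.conj().T                     # reduced density matrix of one subsystem
    lam = np.linalg.eigvalsh(rho)
    # G = N * (det rho)^(1/N), computed through logs to avoid underflow.
    return N * np.exp(np.mean(np.log(lam)))

rng = np.random.default_rng(2)
values = [g_concurrence_sample(100, rng) for _ in range(50)]
print(np.mean(values), "vs 1/e =", 1 / np.e)   # concentrates near 0.368 as N grows
```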

  15. Distributed measurement of acoustic vibration location with frequency multiplexed phase-OTDR

    NASA Astrophysics Data System (ADS)

    Iida, Daisuke; Toge, Kunihiro; Manabe, Tetsuya

    2017-07-01

    All-fiber distributed vibration sensing is attracting attention in relation to structural health monitoring because it is cost effective, offers high coverage of the monitored area and can detect various structural problems. And in particular the demand for high-speed vibration sensing operating at more than 10 kHz has increased because high frequency vibration indicates high energy and severe trouble in the monitored object. Optical fiber vibration sensing with phase-sensitive optical time domain reflectometry (phase-OTDR) has long been studied because it can be used for distributed vibration sensing in optical fiber. However, pulse reflectometry such as OTDR cannot measure high-frequency vibration whose cycle is shorter than the repetition time of the OTDR. That is, the maximum detectable frequency depends on fiber length. In this paper, we describe a vibration sensing technique with frequency-multiplexed OTDR that can detect the entire distribution of a high-frequency vibration thus allowing us to locate a high-speed vibration point. We can measure the position, frequency and dynamic change of a high-frequency vibration whose cycle is shorter than the repetition time. Both frequency and position are visualized simultaneously for a 5-km fiber with an 80-kHz frequency response and a 20-m spatial resolution.

  16. On Edge Exchangeable Random Graphs

    NASA Astrophysics Data System (ADS)

    Janson, Svante

    2017-06-01

    We study a recent model for edge exchangeable random graphs introduced by Crane and Dempsey; in particular we study asymptotic properties of the random simple graph obtained by merging multiple edges. We study a number of examples, and show that the model can produce dense, sparse and extremely sparse random graphs. One example yields a power-law degree distribution. We give some examples where the random graph is dense and converges a.s. in the sense of graph limit theory, but also an example where a.s. every graph limit is the limit of some subsequence. Another example is sparse and yields convergence to a non-integrable generalized graphon defined on (0,∞).

  17. Random bit generation at tunable rates using a chaotic semiconductor laser under distributed feedback.

    PubMed

    Li, Xiao-Zhou; Li, Song-Sui; Zhuang, Jun-Ping; Chan, Sze-Chun

    2015-09-01

    A semiconductor laser with distributed feedback from a fiber Bragg grating (FBG) is investigated for random bit generation (RBG). The feedback perturbs the laser to emit chaotically with the intensity being sampled periodically. The samples are then converted into random bits by a simple postprocessing of self-differencing and selecting bits. Unlike a conventional mirror that provides localized feedback, the FBG provides distributed feedback which effectively suppresses the information of the round-trip feedback delay time. Randomness is ensured even when the sampling period is commensurate with the feedback delay between the laser and the grating. Consequently, in RBG, the FBG feedback enables continuous tuning of the output bit rate, reduces the minimum sampling period, and increases the number of bits selected per sample. RBG is experimentally investigated at a sampling period continuously tunable from over 16 ns down to 50 ps, while the feedback delay is fixed at 7.7 ns. By selecting 5 least-significant bits per sample, output bit rates from 0.3 to 100 Gbps are achieved with randomness examined by the National Institute of Standards and Technology test suite.
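
    The post-processing described (self-differencing of the sampled intensity, then keeping a few least-significant bits) can be sketched as follows; the 8-bit sample depth and the pseudo-random stand-in for the chaotic waveform are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for periodically sampled chaotic laser intensity, digitised to 8 bits.
samples = rng.integers(0, 256, size=10_000, dtype=np.uint16)

# Self-differencing: subtract a delayed copy of the sample stream, modulo 256.
delay = 1
diff = (samples[delay:] - samples[:-delay]) % 256

# Keep the 5 least-significant bits of each difference as output random bits.
n_bits = 5
bits = ((diff[:, None] >> np.arange(n_bits)) & 1).astype(np.uint8).ravel()
print("first bits:", bits[:32], " mean =", bits.mean())  # mean should be close to 0.5
```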

  18. Urn models for response-adaptive randomized designs: a simulation study based on a non-adaptive randomized trial.

    PubMed

    Ghiglietti, Andrea; Scarale, Maria Giovanna; Miceli, Rosalba; Ieva, Francesca; Mariani, Luigi; Gavazzi, Cecilia; Paganoni, Anna Maria; Edefonti, Valeria

    2018-03-22

    Recently, response-adaptive designs have been proposed in randomized clinical trials to achieve ethical and/or cost advantages by using sequential accrual information collected during the trial to dynamically update the probabilities of treatment assignments. In this context, urn models-where the probability to assign patients to treatments is interpreted as the proportion of balls of different colors available in a virtual urn-have been used as response-adaptive randomization rules. We propose the use of Randomly Reinforced Urn (RRU) models in a simulation study based on a published randomized clinical trial on the efficacy of home enteral nutrition in cancer patients after major gastrointestinal surgery. We compare results with the RRU design with those previously published with the non-adaptive approach. We also provide a code written with the R software to implement the RRU design in practice. In detail, we simulate 10,000 trials based on the RRU model in three set-ups of different total sample sizes. We report information on the number of patients allocated to the inferior treatment and on the empirical power of the t-test for the treatment coefficient in the ANOVA model. We carry out a sensitivity analysis to assess the effect of different urn compositions. For each sample size, in approximately 75% of the simulation runs, the number of patients allocated to the inferior treatment by the RRU design is lower, as compared to the non-adaptive design. The empirical power of the t-test for the treatment effect is similar in the two designs.
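
    A minimal sketch of a two-colour randomly reinforced urn allocation (a simplified illustration of the general idea, with made-up response distributions; it is not the authors' R implementation): each patient is assigned to an arm with probability proportional to the current ball counts, and the drawn colour is reinforced in proportion to the observed response, so allocation drifts toward the better-performing arm.

```python
import numpy as np

def rru_trial(n_patients, mean_response, reinforcement=1.0, u0=(1.0, 1.0), seed=None):
    """Simulate a two-arm randomly reinforced urn (RRU) allocation.

    mean_response[k]: mean of the (positive) response on arm k; the urn colour
    drawn for each patient is reinforced in proportion to that patient's response.
    """
    rng = np.random.default_rng(seed)
    urn = np.array(u0, dtype=float)
    assignments = np.empty(n_patients, dtype=int)
    for i in range(n_patients):
        arm = rng.choice(2, p=urn / urn.sum())            # draw a ball
        response = rng.exponential(mean_response[arm])    # observe a response
        urn[arm] += reinforcement * response              # reinforce the drawn colour
        assignments[i] = arm
    return assignments

assign = rru_trial(200, mean_response=(1.0, 2.0), seed=4)
print("fraction allocated to the better arm:", (assign == 1).mean())
```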

  19. Experimental demonstration of an active phase randomization and monitor module for quantum key distribution

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Liang, Lin-Mei

    2012-08-01

    Phase randomization is a very important assumption in the BB84 quantum key distribution (QKD) system with a weak coherent source; otherwise, an eavesdropper may spy on the final key. In this Letter, a stable and monitored active phase randomization scheme for one-way and two-way QKD systems is proposed and demonstrated in experiments. Furthermore, our scheme gives Alice an easy way to monitor the degree of randomization in experiments. Therefore, we expect our scheme to become a standard part of future QKD systems due to its security significance and feasibility.

  20. Pallet use in grocery distribution affects forest resource consumption location: a spatial model of grocery pallet use

    Treesearch

    R. Bruce Anderson; R. Bruce Anderson

    1991-01-01

    To assess the impact of grocery pallet production on future hardwood resources, better information is needed on the current use of reusable pallets by the grocery and related products industry. A spatial model of pallet use in the grocery distribution system that identifies the locational aspects of grocery pallet production and distribution, determines how these...

  1. A random urine test can identify patients at risk of mesalamine non-adherence: a prospective study.

    PubMed

    Gifford, Anne E; Berg, Anders H; Lahiff, Conor; Cheifetz, Adam S; Horowitz, Gary; Moss, Alan C

    2013-02-01

    Mesalamine non-adherence is common among patients with ulcerative colitis (UC), and can be difficult to identify in practice. We sought to determine whether a random urine test for salicylates could be used as a marker of 5-aminosalicylic acid (5-ASA) ingestion and identify patients at risk of non-adherence. Our aim is to determine whether measurement of salicylates in a random urine sample correlates with 5-ASA levels, and predicts an individual's risk of mesalamine non-adherence. Prospective observational study. Urinary salicylates (by colorimetry) and 5-ASA (by liquid chromatography and tandem-mass spectrometry) were measured in a random urine sample at baseline in patients and controls. Mesalamine adherence was quantified by patient self-reports at enrollment and pharmacy refills of mesalamine over 6 months. A total of 93 patients with UC taking mesalamine maintenance therapy were prospectively enrolled from the clinic. Random urine salicylate levels (by colorimetry) were highly correlated with urine 5-ASA metabolite levels (by mass spectrometry; R2=0.9). A random urine salicylate level above 15 mg/dl distinguished patients who had recently taken mesalamine from controls (area under the curve value 0.9, sensitivity 95%, specificity 77%). A significant proportion of patients (27%) who self-identified as "high adherers" by an adherence questionnaire (Morisky Medication Adherence Scale-8) had random levels of urine salicylate below this threshold. These patients were at higher risk of objectively measured non-adherence to mesalamine over the subsequent 6 months (RR: 2.7, 95% CI: 1.1-7.0). A random urine salicylate level measured in the clinic can identify patients who have not recently taken mesalamine, and who are at higher risk of longitudinal non-adherence. This test could be used to screen patients who may warrant interventions to improve adherence and prevent disease relapse.

  2. On the cause of the non-Gaussian distribution of residuals in geomagnetism

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Khokhlov, A.

    2017-12-01

    To describe errors in the data, Gaussian distributions naturally come to mind. In many practical instances, indeed, Gaussian distributions are appropriate. In the broad field of geomagnetism, however, it has repeatedly been noted that residuals between data and models often display much sharper distributions, sometimes better described by a Laplace distribution. In the present study, we make the case that such non-Gaussian behaviors are very likely the result of what is known as mixture of distributions in the statistical literature. Mixtures arise as soon as the data do not follow a common distribution or are not properly normalized, the resulting global distribution being a mix of the various distributions followed by subsets of the data, or even individual datum. We provide examples of the way such mixtures can lead to distributions that are much sharper than Gaussian distributions and discuss the reasons why such mixtures are likely the cause of the non-Gaussian distributions observed in geomagnetism. We also show that when properly selecting sub-datasets based on geophysical criteria, statistical mixture can sometimes be avoided and much more Gaussian behaviors recovered. We conclude with some general recommendations and point out that although statistical mixture always tends to sharpen the resulting distribution, it does not necessarily lead to a Laplacian distribution. This needs to be taken into account when dealing with such non-Gaussian distributions.
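
    The mechanism can be illustrated with a toy mixture (arbitrary parameters, not the study's data): pooling zero-mean Gaussian residuals whose standard deviations differ across subsets yields a distribution that is more sharply peaked and heavier-tailed than a single Gaussian, visible as positive excess kurtosis.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(5)

# Two subsets of residuals with different (un-normalised) error scales.
subset_a = rng.normal(0.0, 1.0, size=50_000)
subset_b = rng.normal(0.0, 4.0, size=50_000)
mixture = np.concatenate([subset_a, subset_b])

# A single Gaussian has excess kurtosis 0; the pooled mixture is leptokurtic (> 0).
print("excess kurtosis of mixture:", kurtosis(mixture))
print("excess kurtosis of one Gaussian:", kurtosis(rng.normal(size=100_000)))
```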

  3. Co-location and Self-Similar Topologies of Urban Infrastructure Networks

    NASA Astrophysics Data System (ADS)

    Klinkhamer, Christopher; Zhan, Xianyuan; Ukkusuri, Satish; Elisabeth, Krueger; Paik, Kyungrock; Rao, Suresh

    2016-04-01

    The co-location of urban infrastructure is too obvious to be easily ignored. For reasons of practicality, reliability, and eminent domain, the spatial locations of many urban infrastructure networks, including drainage, sanitary sewers, and road networks, are well correlated. However, important questions dealing with correlations in the network topologies of differing infrastructure types remain unanswered. Here, we have extracted randomly distributed, nested subnets from the urban drainage, sanitary sewer, and road networks in two distinctly different cities: Amman, Jordan; and Indianapolis, USA. Network analyses were performed for each randomly chosen subnet (location and size), using a dual-mapping approach (Hierarchical Intersection Continuity Negotiation). Topological metrics for each infrastructure type were calculated and compared for all subnets in a given city. Despite large differences in the climate, governance, and populace of the two cities, and functional properties of the different infrastructure types, these infrastructure networks are shown to be highly spatially homogenous. Furthermore, strong correlations are found between topological metrics of differing types of surface and subsurface infrastructure networks. Also, the network topologies of each infrastructure type for both cities are shown to exhibit self-similar characteristics (i.e., power-law node-degree distributions, p(k) = a·k^(−γ)). These findings can be used to assist city planners and engineers either expanding or retrofitting existing infrastructure, or in the case of developing countries, building new cities from the ground up. In addition, the self-similar nature of these infrastructure networks holds significant implications for the vulnerability of these critical infrastructure networks to external hazards and ways in which network resilience can be improved.
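
    For reference, a power-law exponent of the kind quoted can be estimated from a sample by maximum likelihood; the sketch below uses the continuous-variable estimator on synthetic data as an illustration (node degrees are discrete, so this is an approximation, and none of it comes from the paper):

```python
import numpy as np

def powerlaw_gamma_mle(x, x_min):
    """Maximum-likelihood estimate of gamma for a continuous power law
    p(x) ~ x^(-gamma) defined for x >= x_min."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + x.size / np.sum(np.log(x / x_min))

# Synthetic sample from a power law with gamma = 2.5, via inverse-CDF sampling.
rng = np.random.default_rng(6)
gamma_true, x_min = 2.5, 1.0
sample = x_min * (1.0 - rng.random(20_000)) ** (-1.0 / (gamma_true - 1.0))

print("estimated gamma:", powerlaw_gamma_mle(sample, x_min))  # close to 2.5
```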

  4. Random density matrices versus random evolution of open system

    NASA Astrophysics Data System (ADS)

    Pineda, Carlos; Seligman, Thomas H.

    2015-10-01

    We present and compare two families of ensembles of random density matrices. The first, static ensemble, is obtained by foliating an unbiased ensemble of density matrices. As a criterion we use fixed purity as the simplest example of a useful convex function. The second, dynamic ensemble, is inspired by random matrix models for decoherence, where one evolves a separable pure state with a random Hamiltonian until a given value of purity in the central system is achieved. Several families of Hamiltonians, adequate for different physical situations, are studied. We focus on a two-qubit central system, and obtain exact expressions for the static case. The ensemble displays a peak around Werner-like states, modulated by nodes on the degeneracies of the density matrices. For moderate and strong interactions, good agreement between the static and the dynamic ensembles is found. Even in a model where one qubit does not interact with the environment, excellent agreement is found, but only if there is maximal entanglement with the interacting one. The discussion is started by recalling similar considerations for scattering theory. At the end, we comment on the reach of the results for other convex functions of the density matrix, and exemplify the situation with the von Neumann entropy.

  5. Field signatures of non-Fickian transport processes: transit time distributions, spatial correlations, reversibility and hydrogeophysical imaging

    NASA Astrophysics Data System (ADS)

    Le Borgne, T.; Kang, P. K.; Guihéneuf, N.; Shakas, A.; Bour, O.; Linde, N.; Dentz, M.

    2015-12-01

    Non-Fickian transport phenomena are observed in a wide range of scales across hydrological systems. They are generally manifested by a broad range of transit time distributions, as measured for instance in tracer breakthrough curves. However, similar transit time distributions may be caused by different origins, including broad velocity distributions, flow channeling or diffusive mass transfer [1,2]. The identification of these processes is critical for defining relevant transport models. How can we distinguish the different origins of non-Fickian transport in the field? In this presentation, we will review recent experimental developments to decipher the different causes of anomalous transport, based on tracer tests performed at different scales in cross borehole and push pull conditions, and time lapse hydrogeophysical imaging of tracer motion [3,4]. References: [1] de Anna, P., T. Le Borgne, M. Dentz, A. M. Tartakovsky, D. Bolster, P. Davy (2013) Flow Intermittency, Dispersion and Correlated Continuous Time Random Walks in Porous Media, Phys. Rev. Lett., 110, 184502 [2] Le Borgne T., Dentz M., and Carrera J. (2008) Lagrangian Statistical Model for Transport in Highly Heterogeneous Velocity Fields, Phys. Rev. Lett., 101, 090601 [3] Kang, P. K., T. Le Borgne, M. Dentz, O. Bour, and R. Juanes (2015), Impact of velocity correlation and distribution on transport in fractured media: Field evidence and theoretical model, Water Resour. Res., 51, 940-959 [4] Dorn C., Linde N., Le Borgne T., O. Bour and L. Baron (2011) Single-hole GPR reflection imaging of solute transport in a granitic aquifer, Geophys. Res. Lett., 38, L08401

  6. Random walks exhibiting anomalous diffusion: elephants, urns and the limits of normality

    NASA Astrophysics Data System (ADS)

    Kearney, Michael J.; Martin, Richard J.

    2018-01-01

    A random walk model is presented which exhibits a transition from standard to anomalous diffusion as a parameter is varied. The model is a variant on the elephant random walk and differs in respect of the treatment of the initial state, which in the present work consists of a given number N of fixed steps. This also links the elephant random walk to other types of history dependent random walk. As well as being amenable to direct analysis, the model is shown to be asymptotically equivalent to a non-linear urn process. This provides fresh insights into the limiting form of the distribution of the walker’s position at large times. Although the distribution is intrinsically non-Gaussian in the anomalous diffusion regime, it gradually reverts to normal form when N is large under quite general conditions.

  7. Non-homogeneous Behaviour of the Spatial Distribution of Macrospicules

    NASA Astrophysics Data System (ADS)

    Gyenge, N.; Bennett, S.; Erdélyi, R.

    2015-03-01

    In this paper the longitudinal and latitudinal spatial distribution of macrospicules is examined. We found a statistical relationship between the active longitude (determined by sunspot groups) and the longitudinal distribution of macrospicules. This distribution of macrospicules shows an inhomogeneity and non-axisymmetrical behaviour in the time interval between June 2010 and December 2012, covered by observations of the Solar Dynamic Observatory (SDO) satellite. The enhanced positions of the activity and its time variation have been calculated. The migration of the longitudinal distribution of macrospicules shows a similar behaviour to that of the sunspot groups.

  8. Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting

    PubMed Central

    Husen, Mohd Nizam; Lee, Sukhan

    2016-01-01

    A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics represent here the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown finer by 40% with the execution time more than an order of magnitude faster than the conventional methods. These results are also backed up by theoretical analysis. PMID:27845711
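
    The general fingerprinting idea described can be sketched as follows (a simplified illustration with Gaussian scoring and toy numbers; the paper's invariant-statistics construction and its parameters are not reproduced here): store reference RSS statistics per calibration location, then assign a new reading to the location whose statistics support it best.

```python
import numpy as np

def fit_fingerprints(calibration):
    """calibration: {location: array of shape (n_samples, n_aps)} of RSS readings.
    Returns per-location mean/std statistics used as reference pattern classes."""
    return {loc: (rss.mean(axis=0), rss.std(axis=0) + 1e-6) for loc, rss in calibration.items()}

def locate(fingerprints, rss_reading):
    """Pick the calibration location whose statistics maximally support the reading
    (Gaussian log-likelihood over the Wi-Fi sources, up to an additive constant)."""
    def score(stats):
        mu, sigma = stats
        return -np.sum(((rss_reading - mu) / sigma) ** 2 + 2.0 * np.log(sigma))
    return max(fingerprints, key=lambda loc: score(fingerprints[loc]))

# Toy data: two calibration locations, three access points, RSS in dBm.
rng = np.random.default_rng(7)
calib = {
    "room_A": rng.normal([-40, -70, -60], 2.0, size=(50, 3)),
    "room_B": rng.normal([-65, -45, -55], 2.0, size=(50, 3)),
}
fp = fit_fingerprints(calib)
print(locate(fp, np.array([-41, -69, -61])))   # expected: room_A
```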

  9. Optimum aggregation of geographically distributed flexible resources in strategic smart-grid/microgrid locations

    DOE PAGES

    Bhattarai, Bishnu P.; Myers, Kurt S.; Bak-Jensen, Brigitte; ...

    2017-05-17

    This paper determines optimum aggregation areas for a given distribution network considering the spatial distribution of loads and the costs of aggregation. An elitist genetic algorithm combined with a hierarchical clustering and a Thevenin network reduction is implemented to compute strategic locations and aggregate demand within each area. The aggregation reduces large distribution networks having thousands of nodes to an equivalent network with a few aggregated loads, thereby significantly reducing the computational burden. Furthermore, it not only helps distribution system operators make faster operational decisions by understanding during which time of the day they will need flexibility, from which specific area, and in which amount, but also enables the flexibilities stemming from small distributed resources to be traded in various power/energy markets. A combination of central and local aggregation schemes, where a central aggregator enables market participation while local aggregators materialize the accepted bids, is implemented to realize this concept. The effectiveness of the proposed method is evaluated by comparing network performances with and without aggregation. Finally, for a given network configuration, the steady-state performance of the aggregated network is highly accurate (≈ ±1.5% error) compared to the very high errors associated with forecasts of individual consumer demand.

  10. Optimum aggregation of geographically distributed flexible resources in strategic smart-grid/microgrid locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattarai, Bishnu P.; Myers, Kurt S.; Bak-Jensen, Brigitte

    This paper determines optimum aggregation areas for a given distribution network considering the spatial distribution of loads and the costs of aggregation. An elitist genetic algorithm combined with a hierarchical clustering and a Thevenin network reduction is implemented to compute strategic locations and aggregate demand within each area. The aggregation reduces large distribution networks having thousands of nodes to an equivalent network with a few aggregated loads, thereby significantly reducing the computational burden. Furthermore, it not only helps distribution system operators make faster operational decisions by understanding during which time of the day they will need flexibility, from which specific area, and in which amount, but also enables the flexibilities stemming from small distributed resources to be traded in various power/energy markets. A combination of central and local aggregation schemes, where a central aggregator enables market participation while local aggregators materialize the accepted bids, is implemented to realize this concept. The effectiveness of the proposed method is evaluated by comparing network performances with and without aggregation. Finally, for a given network configuration, the steady-state performance of the aggregated network is highly accurate (≈ ±1.5% error) compared to the very high errors associated with forecasts of individual consumer demand.

  11. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random fields (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.

  12. Non-Gaussian Velocity Distributions in Solar Flares from Extreme Ultraviolet Lines: A Possible Diagnostic of Ion Acceleration

    NASA Astrophysics Data System (ADS)

    Jeffrey, Natasha L. S.; Fletcher, Lyndsay; Labrosse, Nicolas

    2017-02-01

    In a solar flare, a large fraction of the magnetic energy released is converted rapidly to the kinetic energy of non-thermal particles and bulk plasma motion. This will likely result in non-equilibrium particle distributions and turbulent plasma conditions. We investigate this by analyzing the profiles of high temperature extreme ultraviolet emission lines from a major flare (SOL2014-03-29T17:44) observed by the EUV Imaging Spectrometer (EIS) on Hinode. We find that in many locations the line profiles are non-Gaussian, consistent with a kappa distribution of emitting ions with properties that vary in space and time. At the flare footpoints, close to sites of hard X-ray emission from non-thermal electrons, the κ index for the Fe xvi 262.976 Å line at 3 MK takes values of 3-5. In the corona, close to a low-energy HXR source, the Fe xxiii 263.760 Å line at 15 MK shows κ values of typically 4-7. The observed trends in the κ parameter show that we are most likely detecting the properties of the ion population rather than any instrumental effects. We calculate that a non-thermal ion population could exist if locally accelerated on timescales ≤0.1 s. However, observations of net redshifts in the lines also imply the presence of plasma downflows, which could lead to bulk turbulence, with increased non-Gaussianity in cooler regions. Both interpretations have important implications for theories of solar flare particle acceleration.

  13. Non-Gaussian Velocity Distributions in Solar Flares from Extreme Ultraviolet Lines: A Possible Diagnostic of Ion Acceleration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey, Natasha L. S.; Fletcher, Lyndsay; Labrosse, Nicolas

    2017-02-10

    In a solar flare, a large fraction of the magnetic energy released is converted rapidly to the kinetic energy of non-thermal particles and bulk plasma motion. This will likely result in non-equilibrium particle distributions and turbulent plasma conditions. We investigate this by analyzing the profiles of high temperature extreme ultraviolet emission lines from a major flare (SOL2014-03-29T17:44) observed by the EUV Imaging Spectrometer (EIS) on Hinode. We find that in many locations the line profiles are non-Gaussian, consistent with a kappa distribution of emitting ions with properties that vary in space and time. At the flare footpoints, close to sites of hard X-ray emission from non-thermal electrons, the κ index for the Fe xvi 262.976 Å line at 3 MK takes values of 3–5. In the corona, close to a low-energy HXR source, the Fe xxiii 263.760 Å line at 15 MK shows κ values of typically 4–7. The observed trends in the κ parameter show that we are most likely detecting the properties of the ion population rather than any instrumental effects. We calculate that a non-thermal ion population could exist if locally accelerated on timescales ≤0.1 s. However, observations of net redshifts in the lines also imply the presence of plasma downflows, which could lead to bulk turbulence, with increased non-Gaussianity in cooler regions. Both interpretations have important implications for theories of solar flare particle acceleration.

  14. Radar prediction of absolute rain fade distributions for earth-satellite paths and general methods for extrapolation of fade statistics to other locations

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1982-01-01

    The first absolute rain fade distribution method described establishes absolute fade statistics at a given site by means of a sampled radar data base. The second method extrapolates absolute fade statistics from one location to another, given simultaneously measured fade and rain rate statistics at the former. Both methods employ similar conditional fade statistic concepts and long term rain rate distributions. Probability deviations in the 2-19% range, with an 11% average, were obtained upon comparison of measured and predicted levels at given attenuations. The extrapolation of fade distributions to other locations at 28 GHz showed very good agreement with measured data at three sites located in the continental temperate region.

  15. Boundary-layer receptivity due to distributed surface imperfections of a deterministic or random nature

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan

    1992-01-01

    Acoustic receptivity of a Blasius boundary layer in the presence of distributed surface irregularities is investigated analytically. It is shown that, out of the entire spatial spectrum of the surface irregularities, only a small band of Fourier components can lead to an efficient conversion of the acoustic input at any given frequency to an unstable eigenmode of the boundary layer flow. The location and width of this most receptive band of wavenumbers correspond to a relative detuning of O(R_lb^(-3/8)) with respect to the lower-neutral instability wavenumber at the frequency under consideration, R_lb being the Reynolds number based on a typical boundary-layer thickness at the lower branch of the neutral stability curve. Surface imperfections in the form of discrete mode waviness in this range of wavenumbers lead to initial instability amplitudes which are O(R_lb^(3/8)) larger than those caused by a single, isolated roughness element. In contrast, irregularities with a continuous spatial spectrum produce much smaller instability amplitudes, even compared to the isolated case, since the increase due to the resonant nature of the response is more than compensated for by the asymptotically small bandwidth of the receptivity process. Analytical expressions for the maximum possible instability amplitudes, as well as their expectation for an ensemble of statistically irregular surfaces with random phase distributions, are also presented.

  16. Quantized vortices in the ideal bose gas: a physical realization of random polynomials.

    PubMed

    Castin, Yvan; Hadzibabic, Zoran; Stock, Sabine; Dalibard, Jean; Stringari, Sandro

    2006-02-03

    We propose a physical system allowing one to experimentally observe the distribution of the complex zeros of a random polynomial. We consider a degenerate, rotating, quasi-ideal atomic Bose gas prepared in the lowest Landau level. Thermal fluctuations provide the randomness of the bosonic field and of the locations of the vortex cores. These vortices can be mapped to zeros of random polynomials, and observed in the density profile of the gas.

  17. Checklists of Methodological Issues for Review Authors to Consider When Including Non-Randomized Studies in Systematic Reviews

    ERIC Educational Resources Information Center

    Wells, George A.; Shea, Beverley; Higgins, Julian P. T.; Sterne, Jonathan; Tugwell, Peter; Reeves, Barnaby C.

    2013-01-01

    Background: There is increasing interest from review authors about including non-randomized studies (NRS) in their systematic reviews of health care interventions. This series from the Ottawa Non-Randomized Studies Workshop consists of six papers identifying methodological issues when doing this. Aim: To format the guidance from the preceding…

  18. A Non-linear Geodetic Data Inversion Using ABIC for Slip Distribution on a Fault With an Unknown dip Angle

    NASA Astrophysics Data System (ADS)

    Fukahata, Y.; Wright, T. J.

    2006-12-01

    We developed a method of geodetic data inversion for slip distribution on a fault with an unknown dip angle. When fault geometry is unknown, the problem of geodetic data inversion is non-linear. A common strategy for obtaining slip distribution is to first determine the fault geometry by minimizing the square misfit under the assumption of a uniform slip on a rectangular fault, and then apply the usual linear inversion technique to estimate a slip distribution on the determined fault. It is not guaranteed, however, that the fault determined under the assumption of a uniform slip gives the best fault geometry for a spatially variable slip distribution. In addition, in obtaining a uniform slip fault model, we have to simultaneously determine the values of the nine mutually dependent parameters, which is a highly non-linear, complicated process. Although the inverse problem is non-linear for cases with unknown fault geometries, the non-linearity of the problems is actually weak, when we can assume the fault surface to be flat. In particular, when a clear fault trace is observed on the Earth's surface after an earthquake, we can precisely estimate the strike and the location of the fault. In this case only the dip angle has large ambiguity. In geodetic data inversion we usually need to introduce smoothness constraints in order to compromise reciprocal requirements for model resolution and estimation errors in a natural way. Strictly speaking, the inverse problem with smoothness constraints is also non-linear, even if the fault geometry is known. The non-linearity has been dissolved by introducing Akaike's Bayesian Information Criterion (ABIC), with which the optimal value of the relative weight of observed data to smoothness constraints is objectively determined. In this study, using ABIC in determining the optimal dip angle, we dissolved the non-linearity of the inverse problem. We applied the method to the InSAR data of the 1995 Dinar, Turkey earthquake and obtained

  19. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.
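
    The kind of check discussed, whether event waiting times are consistent with a homogeneous Poisson process (i.e. exponentially distributed), can be sketched as below on synthetic data; the waiting-time samples and scales are assumptions, not SEPE data, and fitting the scale from the sample makes the nominal p-value only approximate.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(8)

# Synthetic waiting times between events (days). A homogeneous Poisson process gives
# exponential waiting times; clustered events give an excess of short and long waits.
poisson_like = rng.exponential(scale=10.0, size=500)
clustered = np.concatenate([rng.exponential(2.0, 400), rng.exponential(60.0, 100)])

for name, waits in (("poisson-like", poisson_like), ("clustered", clustered)):
    # KS test against an exponential whose scale is the sample mean (approximate).
    stat, p = kstest(waits, "expon", args=(0.0, waits.mean()))
    print(f"{name}: KS = {stat:.3f}, p = {p:.3g}")
```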

  20. Effects of cluster location and cluster distribution on performance on the traveling salesman problem.

    PubMed

    MacGregor, James N

    2015-10-01

    Research on human performance in solving traveling salesman problems typically uses point sets as stimuli, and most models have proposed a processing stage at which stimulus dots are clustered. However, few empirical studies have investigated the effects of clustering on performance. In one recent study, researchers compared the effects of clustered, random, and regular stimuli, and concluded that clustering facilitates performance (Dry, Preiss, & Wagemans, 2012). Another study suggested that these results may have been influenced by the location rather than the degree of clustering (MacGregor, 2013). Two experiments are reported that mark an attempt to disentangle these factors. The first experiment tested several combinations of degree of clustering and cluster location, and revealed mixed evidence that clustering influences performance. In a second experiment, both factors were varied independently, showing that they interact. The results are discussed in terms of the importance of clustering effects, in particular, and perceptual factors, in general, during performance of the traveling salesman problem.

  1. Vocal activities reflect the temporal distribution of bottlenose dolphin social and non-social activity in a zoological park.

    PubMed

    Lima, Alice; Lemasson, Alban; Boye, Martin; Hausberger, Martine

    2017-12-01

    Under natural conditions bottlenose dolphins (Tursiops truncatus) spend their time mostly feeding and then travelling, socializing, or resting. These activities are not randomly distributed, with feeding being higher in early morning and late afternoon. Social activities and vocal behavior seem to be very important in dolphin daily activity. This study aimed to describe the activity time-budget and its relation to vocal behavior for dolphins in a zoological park. We recorded behaviors and vocalizations of six dolphins over 2 months. All subjects performed more non-agonistic social interactions and play in the morning than in the afternoon. The different categories of vocalizations were distributed non-randomly throughout the day, with more chirps in the afternoon, when the animals were "less social." The most striking result was the strong correlation between activities and the categories of vocalizations produced. The results confirm the association between burst pulses and whistles with social activities, but also reveal that both are also associated with solitary play. More chirps were produced when dolphins were engaged in socio-sexual behaviors, emphasizing the need for further questioning about the function of this vocal category. This study reveals that: (i) in a group kept in zoological management, social activities are mostly present in the morning; and (ii) the acoustic signals produced by dolphins may give a reliable representation of their current activities. While more studies on the context of signal production are needed, our findings provide a useful tool for understanding free ranging dolphin behavior when they are not visible. © 2017 Wiley Periodicals, Inc.

  2. Scattering by a slab containing randomly located cylinders: comparison between radiative transfer and electromagnetic simulation.

    PubMed

    Roux, L; Mareschal, P; Vukadinovic, N; Thibaud, J B; Greffet, J J

    2001-02-01

    This study is devoted to the examination of scattering of waves by a slab containing randomly located cylinders. For the first time to our knowledge, the complete transmission problem has been solved numerically. We have compared the radiative transfer theory with a numerical solution of the wave equation. We discuss the coherent effects, such as forward-scattering dip and backscattering enhancement. It is seen that the radiative transfer equation can be used with great accuracy even for optically thin systems whose geometric thickness is comparable with the wavelength. We have also shown the presence of dependent scattering.

  3. Non-random association between alleles detected at D4S95 and D4S98 and the Huntington's disease gene.

    PubMed Central

    Theilmann, J; Kanani, S; Shiang, R; Robbins, C; Quarrell, O; Huggins, M; Hedrick, A; Weber, B; Collins, C; Wasmuth, J J

    1989-01-01

    Analysis of many families with linked DNA markers has provided support for the Huntington's disease (HD) gene being close to the telomere on the short arm of chromosome 4. However, analysis of recombination events in particular families has provided conflicting results about the precise location of the HD gene relative to these closely linked DNA markers. Here we report an investigation of linkage disequilibrium between six DNA markers and the HD gene in 75 separate families of varied ancestry. We show significant non-random association between alleles detected at D4S95 and D4S98 and the mutant gene. These data suggest that it may be possible to construct high and low risk haplotypes, which may be helpful in DNA analysis and genetic counselling for HD, and represent independent evidence that the gene for HD is centromeric to more distally located DNA markers such as D4S90. This information may be helpful in defining a strategy to clone the gene for HD based on its location in the human genome. PMID:2531224

  4. Alignment in Teacher Education and Distribution of Leadership: An Example Concerning Learning Study

    ERIC Educational Resources Information Center

    Nilsson, Ingrid

    2008-01-01

    The critical aspects of distribution of professional leadership, alignment in learning, and research close to practice were highlighted in order to exemplify a research project with learning study as an approach to alignment between teacher education and practice, and consequently as an instrument for distribution of power. The results showed…

  5. Reducing bias in survival under non-random temporary emigration

    USGS Publications Warehouse

    Peñaloza, Claudia L.; Kendall, William L.; Langtimm, Catherine Ann

    2014-01-01

    Despite intensive monitoring, temporary emigration from the sampling area can induce bias severe enough for managers to discard life-history parameter estimates toward the terminus of the time series (terminal bias). Under random temporary emigration, unbiased parameters can be estimated with CJS models. However, unmodeled Markovian temporary emigration causes bias in parameter estimates and an unobservable state is required to model this type of emigration. The robust design is most flexible when modeling temporary emigration, and partial solutions to mitigate bias have been identified; nonetheless, there are conditions where terminal bias prevails. Long-lived species with high adult survival and highly variable non-random temporary emigration present terminal bias in survival estimates, despite being modeled with the robust design and suggested constraints. Because this bias is due to uncertainty about the fate of individuals that are undetected toward the end of the time series, solutions should involve using additional information on survival status or location of these individuals at that time. Using simulation, we evaluated the performance of models that jointly analyze robust design data and an additional source of ancillary data (predictive covariate on temporary emigration, telemetry, dead recovery, or auxiliary resightings) in reducing terminal bias in survival estimates. The auxiliary resighting and predictive covariate models reduced terminal bias the most. Additional telemetry data was effective at reducing terminal bias only when individuals were tracked for a minimum of two years. High adult survival of long-lived species made the joint model with recovery data ineffective at reducing terminal bias because of small-sample bias. The naïve constraint model (last and penultimate temporary emigration parameters made equal) was the least efficient, though still able to reduce terminal bias when compared to an unconstrained model. Joint analysis of several

  6. Locations of Sampling Stations for Water Quality Monitoring in Water Distribution Networks.

    PubMed

    Rathi, Shweta; Gupta, Rajesh

    2014-04-01

    Water quality must be monitored in water distribution networks (WDNs) at salient locations to assure the safe quality of water supplied to consumers. Such monitoring stations (MSs) provide warning against any accidental contamination. Various objectives, such as demand coverage, time to detection, volume of water contaminated before detection, extent of contamination, expected population affected prior to detection, and detection likelihood, have been considered independently or jointly in determining the optimal number and location of MSs in WDNs. "Demand coverage", defined as the percentage of network demand monitored by a particular monitoring station, is a simple measure for locating MSs. Several methods based on the formulation of a coverage matrix using a pre-specified coverage criterion and optimization have been suggested. The coverage criterion is defined as the minimum percentage of the total flow received at a monitoring station that must have passed through an upstream node for that node to be counted as covered by the station. The number of monitoring stations increases as the value of the coverage criterion increases, so the design of monitoring stations becomes subjective. A simple methodology is proposed herein which iteratively selects MSs in priority order to achieve a targeted demand coverage. The proposed methodology provided the same number and locations of MSs for an illustrative network as an optimization method did. Further, the proposed method is simple and avoids the subjectivity that could arise from the choice of coverage criterion. The application of the methodology is also shown on the WDN of the Dharampeth zone (Nagpur city WDN in Maharashtra, India), which has 285 nodes and 367 pipes.
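
    A minimal sketch of the kind of priority-wise greedy selection described above, assuming a precomputed coverage relation (which nodes each candidate station can monitor) and known nodal demands; the function name, data structures, and toy network are illustrative assumptions, not taken from the paper.

        # Hypothetical greedy selection of monitoring stations (MSs): keep adding
        # the station that covers the most not-yet-covered demand until a target
        # fraction of the total network demand is monitored.
        def select_stations(coverage, demand, target_fraction=0.9):
            """coverage: dict station -> set of covered node ids;
            demand: dict node id -> nodal demand."""
            total = sum(demand.values())
            covered, selected = set(), []
            while sum(demand[n] for n in covered) < target_fraction * total:
                best = max(coverage, key=lambda s: sum(demand[n] for n in coverage[s] - covered))
                if not coverage[best] - covered:      # no station can improve coverage further
                    break
                selected.append(best)
                covered |= coverage[best]
            return selected

        # toy 5-node network with three candidate station locations
        demand = {1: 10.0, 2: 5.0, 3: 20.0, 4: 8.0, 5: 12.0}
        coverage = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5}}
        print(select_stations(coverage, demand, target_fraction=0.8))   # ['A', 'C']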

  7. Formulation of the Multi-Hit Model With a Non-Poisson Distribution of Hits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassiliev, Oleg N., E-mail: Oleg.Vassiliev@albertahealthservices.ca

    2012-07-15

    Purpose: We proposed a formulation of the multi-hit single-target model in which the Poisson distribution of hits was replaced by a combination of two distributions: one for the number of particles entering the target and one for the number of hits a particle entering the target produces. Such an approach reflects the fact that radiation damage is a result of two different random processes: particle emission by a radiation source and interaction of particles with matter inside the target. Methods and Materials: A Poisson distribution is well justified for the first of the two processes. The second distribution depends on how a hit is defined. To test our approach, we assumed that the second distribution was also a Poisson distribution. The two distributions combined resulted in a non-Poisson distribution. We tested the proposed model by comparing it with previously reported data for DNA single- and double-strand breaks induced by protons and electrons, for survival of a range of cell lines, and for variation of the initial slopes of survival curves with radiation quality for heavy-ion beams. Results: Analysis of cell survival equations for this new model showed that they had realistic properties overall, such as the initial and high-dose slopes of survival curves, the shoulder, and relative biological effectiveness (RBE). In most cases tested, a better fit of survival curves was achieved with the new model than with the linear-quadratic model. The results also suggested that the proposed approach may extend the multi-hit model beyond its traditional role in analysis of survival curves to predicting effects of radiation quality and analysis of DNA strand breaks. Conclusions: Our model, although conceptually simple, performed well in all tests. The model was able to consistently fit data for both cell survival and DNA single- and double-strand breaks. It correctly predicted the dependence of radiation effects on parameters of radiation quality.
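
    A hedged numerical sketch of the two-stage idea described above: if the number of particles entering the target is Poisson distributed and each entering particle produces an independent Poisson number of hits, the total hit count follows a compound (non-Poisson) distribution. The parameter values below are illustrative only and are not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_hits(mean_particles, mean_hits_per_particle, size=100_000):
            """Total hits per target: Poisson particle count, each particle
            producing an independent Poisson number of hits."""
            n_particles = rng.poisson(mean_particles, size)
            # the sum of n independent Poisson(m) draws is Poisson(n * m)
            return rng.poisson(mean_hits_per_particle * n_particles)

        hits = sample_hits(mean_particles=2.0, mean_hits_per_particle=1.5)
        # over-dispersion (variance > mean) marks the departure from a pure Poisson
        print(hits.mean(), hits.var())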

  8. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences on summary results, heterogeneity measures and prediction intervals from the normal random effects model. The…
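
    A minimal sketch of the transformation step only (not the paper's full Bayesian model): shift the observed effect estimates so they are strictly positive, let scipy choose a Box-Cox lambda by maximum likelihood, and report the median and interquartile range back on the original scale. The data, the shift, and the summary choices are illustrative assumptions.

        import numpy as np
        from scipy import stats, special

        # illustrative skewed treatment-effect estimates from k studies
        y = np.array([0.10, 0.15, 0.22, 0.30, 0.45, 0.70, 1.20, 2.10])

        # Box-Cox requires strictly positive values; shift if any estimate is <= 0
        shift = max(0.0, 1e-6 - y.min())
        y_t, lam = stats.boxcox(y + shift)        # lambda chosen by maximum likelihood

        # summarise on the transformed scale, then back-transform the quantiles
        back = lambda z: special.inv_boxcox(z, lam) - shift
        q1, med, q3 = np.percentile(y_t, [25, 50, 75])
        print("lambda:", lam)
        print("overall median:", back(med))
        print("interquartile range:", (back(q1), back(q3)))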

  9. Dataset on spatial distribution and location of universities in Nigeria.

    PubMed

    Adeyemi, G A; Edeki, S O

    2018-06-01

    Access to a quality educational system and the location of educational institutions are of great importance for the future prospects of youth in any nation. These, in turn, have great effects on the economic growth and development of any country. Thus, the dataset contained in this article examines and explains the spatial distribution of universities in the Nigerian system of education. Data from the university commission, Nigeria, as at December 2017 are used. These include all the 40 federal universities, 44 state universities, and 69 private universities, making a total of 153 universities in the Nigerian system of education. The data analysis is performed with Geographic Information System (GIS) software. The dataset contained in this article will be of immense assistance to national educational policy makers, parents, and potential students as regards smart and reliable academic decision making.

  10. Full-wave simulations of ICRF heating regimes in toroidal plasmas with non-Maxwellian distribution functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertelli, N.; Valeo, E.J.; Green, D.L.

    At the power levels required for significant heating and current drive in magnetically-confined toroidal plasma, modification of the particle distribution function from a Maxwellian shape is likely [T. H. Stix, Nucl. Fusion, 15 737 (1975)], with consequent changes in wave propagation and in the location and amount of absorption. In order to study these effects computationally, both the finite-Larmor-radius and the high-harmonic fast wave (HHFW) versions of the full-wave, hot-plasma toroidal simulation code TORIC [M. Brambilla, Plasma Phys. Control. Fusion 41, 1 (1999) and M. Brambilla, Plasma Phys. Control. Fusion 44, 2423 (2002)] have been extended to allow the prescription of arbitrary velocity distributions of the form f(v∥, v⊥, ψ, θ). For hydrogen (H) minority heating of a deuterium (D) plasma with anisotropic Maxwellian H distributions, the fractional H absorption varies significantly with changes in parallel temperature but is essentially independent of perpendicular temperature. On the other hand, for the HHFW regime with an anisotropic Maxwellian fast ion distribution, the fractional beam ion absorption varies mainly with changes in the perpendicular temperature. The evaluation of the wave field and power absorption, through the full wave solver, with the ion distribution function provided by either a Monte-Carlo particle code or a Fokker-Planck code is also examined for Alcator C-Mod and NSTX plasmas. Non-Maxwellian effects generally tend to increase the absorption with respect to the equivalent Maxwellian distribution.

  11. Full-wave simulations of ICRF heating regimes in toroidal plasma with non-Maxwellian distribution functions

    NASA Astrophysics Data System (ADS)

    Bertelli, N.; Valeo, E. J.; Green, D. L.; Gorelenkova, M.; Phillips, C. K.; Podestà, M.; Lee, J. P.; Wright, J. C.; Jaeger, E. F.

    2017-05-01

    At the power levels required for significant heating and current drive in magnetically-confined toroidal plasma, modification of the particle distribution function from a Maxwellian shape is likely (Stix 1975 Nucl. Fusion 15 737), with consequent changes in wave propagation and in the location and amount of absorption. In order to study these effects computationally, both the finite-Larmor-radius and the high-harmonic fast wave (HHFW) versions of the full-wave, hot-plasma toroidal simulation code TORIC (Brambilla 1999 Plasma Phys. Control. Fusion 41 1 and Brambilla 2002 Plasma Phys. Control. Fusion 44 2423), have been extended to allow the prescription of arbitrary velocity distributions of the form f(v∥, v⊥, ψ, θ). For hydrogen (H) minority heating of a deuterium (D) plasma with anisotropic Maxwellian H distributions, the fractional H absorption varies significantly with changes in parallel temperature but is essentially independent of perpendicular temperature. On the other hand, for the HHFW regime with an anisotropic Maxwellian fast ion distribution, the fractional beam ion absorption varies mainly with changes in the perpendicular temperature. The evaluation of the wave-field and power absorption, through the full wave solver, with the ion distribution function provided by either a Monte-Carlo particle code or a Fokker-Planck code is also examined for Alcator C-Mod and NSTX plasmas. Non-Maxwellian effects generally tend to increase the absorption with respect to the equivalent Maxwellian distribution.

  12. Conceptualizing and Exemplifying Science Teachers' Assessment Expertise

    NASA Astrophysics Data System (ADS)

    Geaney Lyon, Edward

    2013-05-01

    Although research in science education has led to new assessment forms and functions, the reality is that little work has been done to unpack and capture what it means for a teacher to develop expertise at assessing science. The purpose of this paper is two-fold. First, I suggest a conceptualization of assessment expertise that is organized around three dimensions: (a) designing aligned and theoretically cohesive assessment (Design), (b) using assessment to support students' science learning (Use), and (c) equitably assessing language minorities (Equity). The second purpose is to suggest and exemplify various levels of teaching expertise across the three conceptual dimensions using written assessment plans gathered from a study on secondary science pre-service teachers' assessment growth. The contribution of this paper lies in its further conceptual development of assessment expertise, instantiated in a rubric, which can spark discussion about how to capture the range of assessment practices that might be found in science classrooms as well as move toward a potential learning progression of assessment expertise.

  13. Selective attention to sound location or pitch studied with event-related brain potentials and magnetic fields.

    PubMed

    Degerman, Alexander; Rinne, Teemu; Särkkä, Anna-Kaisa; Salmi, Juha; Alho, Kimmo

    2008-06-01

    Event-related brain potentials (ERPs) and magnetic fields (ERFs) were used to compare brain activity associated with selective attention to sound location or pitch in humans. Sixteen healthy adults participated in the ERP experiment, and 11 adults in the ERF experiment. In different conditions, the participants focused their attention on a designated sound location or pitch, or pictures presented on a screen, in order to detect target sounds or pictures among the attended stimuli. In the Attend Location condition, the location of sounds varied randomly (left or right), while their pitch (high or low) was kept constant. In the Attend Pitch condition, sounds of varying pitch (high or low) were presented at a constant location (left or right). Consistent with previous ERP results, selective attention to either sound feature produced a negative difference (Nd) between ERPs to attended and unattended sounds. In addition, ERPs showed a more posterior scalp distribution for the location-related Nd than for the pitch-related Nd, suggesting partially different generators for these Nds. The ERF source analyses found no source distribution differences between the pitch-related Ndm (the magnetic counterpart of the Nd) and location-related Ndm in the superior temporal cortex (STC), where the main sources of the Ndm effects are thought to be located. Thus, the ERP scalp distribution differences between the location-related and pitch-related Nd effects may have been caused by activity of areas outside the STC, perhaps in the inferior parietal regions.

  14. Mental Health and Drivers of Need in Emergent and Non-Emergent Emergency Department (ED) Use: Do Living Location and Non-Emergent Care Sources Matter?

    PubMed

    McManus, Moira C; Cramer, Robert J; Boshier, Maureen; Akpinar-Elci, Muge; Van Lunen, Bonnie

    2018-01-13

    Emergency department (ED) utilization has increased due to factors such as admissions for mental health conditions, including suicide and self-harm. We investigate direct and moderating influences on non-emergent ED utilization through the Behavioral Model of Health Services Use. Through logistic regression, we examined correlates of ED use via 2014 New York State Department of Health Statewide Planning and Research Cooperative System outpatient data. Consistent with the primary hypothesis, mental health admissions were associated with emergent use across models, with only a slight decrease in effect size in rural living locations. Concerning moderating effects, Spanish/Hispanic origin was associated with increased likelihood for emergent ED use in the rural living location model, and non-emergent ED use for the no non-emergent source model. 'Other' ethnic origin increased the likelihood of emergent ED use for rural living location and no non-emergent source models. The findings reveal 'need', including mental health admissions, as the largest driver for ED use. This may be due to mental healthcare access, or patients with mental health emergencies being transported via first responders to the ED, as in the case of suicide, self-harm, manic episodes or psychotic episodes. Further educating ED staff on this patient population through gatekeeper training may ensure patients receive the best treatment and aid in driving access to mental healthcare delivery changes.

  15. The Non-Gaussian Nature of Bibliometric and Scientometric Distributions: A New Approach to Interpretation.

    ERIC Educational Resources Information Center

    Ivancheva, Ludmila E.

    2001-01-01

    Discusses the concept of the hyperbolic or skew distribution as a universal statistical law in information science and socioeconomic studies. Topics include Zipf's law; Stankov's universal law; non-Gaussian distributions; and why most bibliometric and scientometric laws reveal characters of non-Gaussian distribution. (Author/LRW)

  16. Disparities in the Population Distribution of African American and Non-Hispanic White Smokers Along the Quitting Continuum.

    PubMed

    Trinidad, Dennis R; Xie, Bin; Fagan, Pebbles; Pulvers, Kim; Romero, Devan R; Blanco, Lyzette; Sakuma, Kari-Lyn K

    2015-12-01

    To examine disparities and changes over time in the population-level distribution of smokers along a cigarette quitting continuum among African American smokers compared with non-Hispanic Whites. Secondary data analyses of the 1999, 2002, 2005, and 2008 California Tobacco Surveys (CTS). The CTS are large, random-digit-dialed, population-based surveys designed to assess changes in tobacco use in California. The number of survey respondents ranged from n = 6,744 to n = 12,876 across CTS years. Current smoking behavior (daily or nondaily smoking), number of cigarettes smoked per day, intention to quit in the next 6 months, length of most recent quit attempt among current smokers, and total length of time quit among former smokers were assessed and used to recreate the quitting continuum model. While current smoking rates were significantly higher among African Americans compared with non-Hispanic Whites across all years, cigarette consumption rates were lower among African Americans in all years. There were significant increases in the proportion of former smokers who had been quit for at least 12 months from 1999 (African Americans, 26.8% ± 5.5%; non-Hispanic Whites, 36.8% ± 1.6%) to 2008 (African Americans, 43.6% ± 4.1%; non-Hispanic Whites, 57.4% ± 2.9%). The proportion of African American former smokers in each CTS year was significantly lower than that of non-Hispanic Whites. Despite positive progression along the quitting continuum for both African American and non-Hispanic White smokers, the overall distribution was less favorable for African Americans. The lower smoking consumption levels among African Americans, combined with the lower rates of successful smoking cessation, suggest that cigarette addiction and the quitting process may be different for African American smokers. © 2015 Society for Public Health Education.

  17. Dynamic design of ecological monitoring networks for non-Gaussian spatio-temporal data

    USGS Publications Warehouse

    Wikle, C.K.; Royle, J. Andrew

    2005-01-01

    Many ecological processes exhibit spatial structure that changes over time in a coherent, dynamical fashion. This dynamical component is often ignored in the design of spatial monitoring networks. Furthermore, ecological variables related to processes such as habitat are often non-Gaussian (e.g. Poisson or log-normal). We demonstrate that a simulation-based design approach can be used in settings where the data distribution is from a spatio-temporal exponential family. The key random component in the conditional mean function from this distribution is then a spatio-temporal dynamic process. Given the computational burden of estimating the expected utility of various designs in this setting, we utilize an extended Kalman filter approximation to facilitate implementation. The approach is motivated by, and demonstrated on, the problem of selecting sampling locations to estimate July brood counts in the prairie pothole region of the U.S.

  18. An International Perspective on Value Learning in the Kindergarten--Exemplified by the Value Forgiveness

    ERIC Educational Resources Information Center

    Gunnestad, Arve; Mørreaunet, Sissel; Onyango, Silas

    2015-01-01

    This article highlights value learning in kindergartens exemplified by the value of forgiveness. Values are basic ideas on human behaviour and they function as a compass that helps children to make choices and priorities in their lives, to choose between good or bad, right or wrong. Value learning is an important part of the educational work in a…

  19. Evaluation of physical activity interventions in children via the reach, efficacy/effectiveness, adoption, implementation, and maintenance (RE-AIM) framework: A systematic review of randomized and non-randomized trials.

    PubMed

    McGoey, Tara; Root, Zach; Bruner, Mark W; Law, Barbi

    2016-01-01

    Existing reviews of physical activity (PA) interventions designed to increase PA behavior exclusively in children (ages 5 to 11 years) focus primarily on the efficacy (e.g., internal validity) of the interventions without addressing the applicability of the results in terms of generalizability and translatability (e.g., external validity). This review used the RE-AIM (Reach, Efficacy/Effectiveness, Adoption, Implementation, Maintenance) framework to measure the degree to which randomized and non-randomized PA interventions in children report on internal and external validity factors. A systematic search for controlled interventions conducted within the past 12 years identified 78 studies that met the inclusion criteria. Based on the RE-AIM criteria, most of the studies focused on elements of internal validity (e.g., sample size, intervention location and efficacy/effectiveness) with minimal reporting of external validity indicators (e.g., representativeness of participants, start-up costs, protocol fidelity and sustainability). Results of this RE-AIM review emphasize the need for future PA interventions in children to report on real-world challenges and limitations, and to highlight considerations for translating evidence-based results into health promotion practice. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. New Quantum Key Distribution Scheme Based on Random Hybrid Quantum Channel with EPR Pairs and GHZ States

    NASA Astrophysics Data System (ADS)

    Yan, Xing-Yu; Gong, Li-Hua; Chen, Hua-Ying; Zhou, Nan-Run

    2018-05-01

    A theoretical quantum key distribution scheme based on a random hybrid quantum channel with EPR pairs and GHZ states is devised. In this scheme, EPR pairs and tripartite GHZ states are exploited to set up the random hybrid quantum channel. Only one photon in each entangled state needs to travel back and forth in the channel. The security of the quantum key distribution scheme is guaranteed by more than one round of eavesdropping check procedures. It is of high capacity since one particle could carry more than two bits of information via quantum dense coding.

  1. Modelling population distribution using remote sensing imagery and location-based data

    NASA Astrophysics Data System (ADS)

    Song, J.; Prishchepov, A. V.

    2017-12-01

    Detailed spatial distribution of population density is essential for city studies such as urban planning, environmental pollution and emergency response, and even for estimating pressure on the environment and human exposure and risks to health. However, most studies have used census data, as detailed dynamic population distributions are difficult to acquire, especially in microscale research. This research describes a method using remote sensing imagery and location-based data to model population distribution at the functional zone level. Firstly, urban functional zones within a city were mapped from high-resolution remote sensing images and POIs. The workflow of functional zone extraction includes five parts: (1) urban land use classification; (2) segmenting images in the built-up area; (3) identification of functional segments by POIs; (4) identification of functional blocks by functional segmentation and weight coefficients; (5) assessing accuracy with validation points. The result is shown in Fig. 1. Secondly, we applied ordinary least squares (OLS) and geographically weighted regression (GWR) to assess the spatially nonstationary relationship between light digital number (DN) and population density at sampling points. The two methods were employed to predict the population distribution over the research area. The R² values of the GWR model were on the order of 0.7 and typically captured significant variations over the region better than the traditional OLS model; the result is shown in Fig. 2. Validation with sampling points of population density demonstrated that the result predicted by the GWR model correlated well with the light values (Fig. 3). Results showed: (1) population density is not linearly correlated with light brightness in a global model; (2) VIIRS night-time light data can estimate population density when integrated with functional zones at the city level; (3) GWR is a robust model for mapping population distribution, and the adjusted R² of the corresponding GWR models was higher than that of the optimal OLS models.
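
    A hedged sketch of the OLS-versus-GWR comparison described above, written directly with numpy (Gaussian kernel weights plus weighted least squares at each location) rather than with any particular GIS library; the synthetic night-light and population values, the bandwidth, and the function names are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        # synthetic sample points: coordinates, night-light DN, population density
        n = 200
        coords = rng.uniform(0, 10, size=(n, 2))
        light = rng.uniform(1, 60, size=n)
        slope = 5 + 2 * np.sin(coords[:, 0] / 3.0)      # spatially varying relationship
        pop = slope * light + rng.normal(0, 20, size=n)

        X = np.column_stack([np.ones(n), light])
        beta_ols, *_ = np.linalg.lstsq(X, pop, rcond=None)   # single global fit

        def gwr_at(point, bandwidth=2.0):
            """Locally weighted least squares around one location (Gaussian kernel)."""
            d2 = np.sum((coords - point) ** 2, axis=1)
            w = np.exp(-d2 / (2 * bandwidth ** 2))
            W = np.diag(w)
            return np.linalg.solve(X.T @ W @ X, X.T @ W @ pop)

        # local slopes vary across the study area; the single global OLS slope cannot
        print("OLS slope:", beta_ols[1])
        print("GWR slope near x=1:", gwr_at(np.array([1.0, 5.0]))[1])
        print("GWR slope near x=9:", gwr_at(np.array([9.0, 5.0]))[1])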

  2. Analysis of in vitro evolution reveals the underlying distribution of catalytic activity among random sequences.

    PubMed

    Pressman, Abe; Moretti, Janina E; Campbell, Gregory W; Müller, Ulrich F; Chen, Irene A

    2017-08-21

    The emergence of catalytic RNA is believed to have been a key event during the origin of life. Understanding how catalytic activity is distributed across random sequences is fundamental to estimating the probability that catalytic sequences would emerge. Here, we analyze the in vitro evolution of triphosphorylating ribozymes and translate their fitnesses into absolute estimates of catalytic activity for hundreds of ribozyme families. The analysis efficiently identified highly active ribozymes and estimated catalytic activity with good accuracy. The evolutionary dynamics follow Fisher's Fundamental Theorem of Natural Selection and a corollary, permitting retrospective inference of the distribution of fitness and activity in the random sequence pool for the first time. The frequency distribution of rate constants appears to be log-normal, with a surprisingly steep dropoff at higher activity, consistent with a mechanism for the emergence of activity as the product of many independent contributions. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
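
    For illustration of the kind of frequency distribution reported above, a short sketch fitting a log-normal to a set of rate constants with scipy; the simulated values are synthetic stand-ins, not the paper's measured ribozyme rate constants.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        # synthetic catalytic rate constants spanning several orders of magnitude
        k_obs = rng.lognormal(mean=-4.0, sigma=1.5, size=1000)

        # fit a log-normal with the location parameter fixed at zero
        shape, loc, scale = stats.lognorm.fit(k_obs, floc=0)
        print("sigma of log k:", shape)        # spread of the underlying normal
        print("median rate constant:", scale)  # exp(mu) for a log-normal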

  3. Determination of the Optimal Chromosomal Location(s) for a DNA Element in Escherichia coli Using a Novel Transposon-mediated Approach.

    PubMed

    Frimodt-Møller, Jakob; Charbon, Godefroid; Krogfelt, Karen A; Løbner-Olesen, Anders

    2017-09-11

    The optimal chromosomal position(s) of a given DNA element was/were determined by transposon-mediated random insertion followed by fitness selection. In bacteria, the impact of the genetic context on the function of a genetic element can be difficult to assess. Several mechanisms, including topological effects, transcriptional interference from neighboring genes, and/or replication-associated gene dosage, may affect the function of a given genetic element. Here, we describe a method that permits the random integration of a DNA element into the chromosome of Escherichia coli and select the most favorable locations using a simple growth competition experiment. The method takes advantage of a well-described transposon-based system of random insertion, coupled with a selection of the fittest clone(s) by growth advantage, a procedure that is easily adjustable to experimental needs. The nature of the fittest clone(s) can be determined by whole-genome sequencing on a complex multi-clonal population or by easy gene walking for the rapid identification of selected clones. Here, the non-coding DNA region DARS2, which controls the initiation of chromosome replication in E. coli, was used as an example. The function of DARS2 is known to be affected by replication-associated gene dosage; the closer DARS2 gets to the origin of DNA replication, the more active it becomes. DARS2 was randomly inserted into the chromosome of a DARS2-deleted strain. The resultant clones containing individual insertions were pooled and competed against one another for hundreds of generations. Finally, the fittest clones were characterized and found to contain DARS2 inserted in close proximity to the original DARS2 location.

  4. Large-area imaging reveals biologically driven non-random spatial patterns of corals at a remote reef

    NASA Astrophysics Data System (ADS)

    Edwards, Clinton B.; Eynaud, Yoan; Williams, Gareth J.; Pedersen, Nicole E.; Zgliczynski, Brian J.; Gleason, Arthur C. R.; Smith, Jennifer E.; Sandin, Stuart A.

    2017-12-01

    For sessile organisms such as reef-building corals, differences in the degree of dispersion of individuals across a landscape may result from important differences in life-history strategies or may reflect patterns of habitat availability. Descriptions of spatial patterns can thus be useful not only for the identification of key biological and physical mechanisms structuring an ecosystem, but also by providing the data necessary to generate and test ecological theory. Here, we used an in situ imaging technique to create large-area photomosaics of 16 plots at Palmyra Atoll, central Pacific, each covering 100 m2 of benthic habitat. We mapped the location of 44,008 coral colonies and identified each to the lowest taxonomic level possible. Using metrics of spatial dispersion, we tested for departures from spatial randomness. We also used targeted model fitting to explore candidate processes leading to differences in spatial patterns among taxa. Most taxa were clustered and the degree of clustering varied by taxon. A small number of taxa did not significantly depart from randomness and none revealed evidence of spatial uniformity. Importantly, taxa that readily fragment or tolerate stress through partial mortality were more clustered. With little exception, clustering patterns were consistent with models of fragmentation and dispersal limitation. In some taxa, dispersion was linearly related to abundance, suggesting density dependence of spatial patterning. The spatial patterns of stony corals are non-random and reflect fundamental life-history characteristics of the taxa, suggesting that the reef landscape may, in many cases, have important elements of spatial predictability.
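
    For illustration, a minimal nearest-neighbour (Clark-Evans) check for departure from spatial randomness in mapped colony locations; this is only one of many dispersion metrics and not necessarily the statistic used in the study, edge corrections are ignored, and the toy point pattern is invented.

        import numpy as np
        from scipy.spatial import cKDTree

        def clark_evans_ratio(points, area):
            """Observed mean nearest-neighbour distance divided by the value
            expected under complete spatial randomness; R < 1 suggests clustering,
            R > 1 suggests regularity (no edge correction in this sketch)."""
            dists, _ = cKDTree(points).query(points, k=2)   # k=2: skip self-match
            observed = dists[:, 1].mean()
            expected = 0.5 / np.sqrt(len(points) / area)
            return observed / expected

        rng = np.random.default_rng(7)
        # clustered toy pattern in a 10 m x 10 m plot: offspring around 5 parents
        parents = rng.uniform(0, 10, size=(5, 2))
        pts = np.vstack([p + rng.normal(0, 0.3, size=(40, 2)) for p in parents])
        print(clark_evans_ratio(pts, area=100.0))   # well below 1 -> clustered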

  5. Extracting DNA words based on the sequence features: non-uniform distribution and integrity.

    PubMed

    Li, Zhi; Cao, Hongyan; Cui, Yuehua; Zhang, Yanbo

    2016-01-25

    DNA sequence can be viewed as an unknown language with words as its functional units. Given that most sequence alignment algorithms, such as motif discovery algorithms, depend on the quality of background information about sequences, it is necessary to develop an ab initio algorithm for extracting the "words" based only on the DNA sequences. We considered non-uniform distribution and integrity to be two important features of a word, and on this basis developed an ab initio algorithm to extract "DNA words" that have potential functional meaning. A Kolmogorov-Smirnov test was used to test for uniformity of the distribution along the DNA sequences, and integrity was judged by sequence and position alignment. Two random base sequences were adopted as negative controls, and an English book was used as a positive control to verify our algorithm. We applied our algorithm to the genomes of Saccharomyces cerevisiae and 10 strains of Escherichia coli to show the utility of the method. The results provide strong evidence that the algorithm is a promising tool for building a DNA dictionary ab initio. Our method provides a fast way to screen important DNA elements at large scale and offers potential insights into the understanding of a genome.
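
    A small sketch of the non-uniformity check described above: test whether the occurrence positions of a candidate word along a sequence are consistent with a uniform distribution using a Kolmogorov-Smirnov test (scipy assumed); the word and the toy sequence are invented for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        def word_positions(seq, word):
            """Start positions of every (possibly overlapping) occurrence of word."""
            return np.array([i for i in range(len(seq) - len(word) + 1)
                             if seq[i:i + len(word)] == word])

        # toy sequence: random background with 'GATTACA' planted in one region
        background = "".join(rng.choice(list("ACGT"), size=20000))
        seq = background[:5000] + "GATTACA" * 30 + background[5000:]

        # under the null, the scaled positions are uniform on [0, 1);
        # a small p-value flags a non-uniformly distributed candidate word
        pos = word_positions(seq, "GATTACA")
        stat, p = stats.kstest(pos / len(seq), "uniform")
        print(len(pos), stat, p)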

  6. Role of origin and release location in pre-spawning distribution and movements of anadromous alewife

    USGS Publications Warehouse

    Frank, Holly J.; Mather, M. E.; Smith, Joseph M.; Muth, Robert M.; Finn, John T.

    2011-01-01

    Capturing adult anadromous fish that are ready to spawn from a self-sustaining population and transferring them into a depleted system is a common fisheries enhancement tool. The behaviour of these transplanted fish, however, has not been fully evaluated. The movements of stocked and native anadromous alewife, Alosa pseudoharengus (Wilson), were monitored in the Ipswich River, Massachusetts, USA, to provide a scientific basis for this management tool. Radiotelemetry was used to examine the effect of origin (native or stocked) and release location (upstream or downstream) on distribution and movement during the spawning migration. Native fish remained in the river longer than stocked fish regardless of release location. Release location and origin influenced where fish spent time and how they moved. The spatial mosaic of available habitats and the entire trajectory of freshwater movements should be considered to effectively restore spawners that traverse tens of kilometres within coastal rivers.

  7. Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.

    PubMed

    Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A

    2017-02-06

    We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile realized in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model that considers different subcomponents within each Stokes component.

  8. The systematic position of the enigmatic thyreophoran dinosaur Paranthodon africanus, and the use of basal exemplifiers in phylogenetic analysis.

    PubMed

    Raven, Thomas J; Maidment, Susannah C R

    2018-01-01

    The first African dinosaur to be discovered, Paranthodon africanus was found in 1845 in the Lower Cretaceous of South Africa. Taxonomically assigned to numerous groups since discovery, in 1981 it was described as a stegosaur, a group of armoured ornithischian dinosaurs characterised by bizarre plates and spines extending from the neck to the tail. This assignment has been subsequently accepted. The type material consists of a premaxilla, maxilla, a nasal, and a vertebra, and contains no synapomorphies of Stegosauria. Several features of the maxilla and dentition are reminiscent of Ankylosauria, the sister-taxon to Stegosauria, and the premaxilla appears superficially similar to that of some ornithopods. The vertebral material has never been described, and since the last description of the specimen, there have been numerous discoveries of thyreophoran material potentially pertinent to establishing the taxonomic assignment of the specimen. An investigation of the taxonomic and systematic position of Paranthodon is therefore warranted. This study provides a detailed re-description, including the first description of the vertebra. Numerous phylogenetic analyses demonstrate that the systematic position of Paranthodon is highly labile and subject to change depending on which exemplifier for the clade Stegosauria is used. The results indicate that the use of a basal exemplifier may not result in the correct phylogenetic position of a taxon being recovered if the taxon displays character states more derived than those of the basal exemplifier, and we recommend the use, minimally, of one basal and one derived exemplifier per clade. Paranthodon is most robustly recovered as a stegosaur in our analyses, meaning it is one of the youngest and southernmost stegosaurs.

  9. Distribution of ureteral stones and factors affecting their location and expulsion in patients with renal colic.

    PubMed

    Moon, Young Joon; Kim, Hong-Wook; Kim, Jin Bum; Kim, Hyung Joon; Chang, Young-Seop

    2015-10-01

    To evaluate the distribution of ureteral stones and to determine their characteristics and expulsion rate based on their location. We retrospectively reviewed computed tomography (CT) findings of 246 patients who visited our Emergency Department (ED) for renal colic caused by unilateral ureteral stones between January 2013 and April 2014. Histograms were constructed to plot the distribution of stones based on initial CT findings. Data from 144 of the 246 patients who underwent medical expulsive therapy (MET) for 2 weeks were analyzed to evaluate the factors responsible for the stone distribution and expulsion. The upper ureter and ureterovesical junction (UVJ) were 2 peak locations at which stones initially lodged. Stones lodged at the upper ureter and ureteropelvic junction (group A) had a larger longitudinal diameter (4.21 mm vs. 3.56 mm, p=0.004) compared to those lodged at the lower ureter and UVJ (group B). The expulsion rate was 75.6% and 94.9% in groups A and B, respectively. There was no significant difference in the time interval from initiation of renal colic to arrival at the ED between groups A and B (p=0.422). Stone diameter was a significant predictor of MET failure (odds ratio [OR], 1.795; p=0.005) but the initial stone location was not (OR, 0.299; p=0.082). The upper ureter and UVJ are 2 peak sites at which stones lodge. For stone size 10 mm or less, initial stone lodge site is not a significant predictor of MET failure in patients who have no previous history of active stone treatment in the ureter.

  10. Shape of growth-rate distribution determines the type of Non-Gibrat’s Property

    NASA Astrophysics Data System (ADS)

    Ishikawa, Atushi; Fujimoto, Shouji; Mizuno, Takayuki

    2011-11-01

    In this study, the authors examine exhaustive business data on Japanese firms, which cover nearly all companies in the mid- and large-scale ranges in terms of firm size, to reach several key findings on profits/sales distribution and business growth trends. Here, profits denote net profits. First, detailed balance is observed not only in profits data but also in sales data. Furthermore, the growth-rate distribution of sales has wider tails than the linear growth-rate distribution of profits in log-log scale. On the one hand, in the mid-scale range of profits, the probability of positive growth decreases and the probability of negative growth increases symmetrically as the initial value increases. This is called Non-Gibrat’s First Property. On the other hand, in the mid-scale range of sales, the probability of positive growth decreases as the initial value increases, while the probability of negative growth hardly changes. This is called Non-Gibrat’s Second Property. Under detailed balance, Non-Gibrat’s First and Second Properties are analytically derived from the linear and quadratic growth-rate distributions in log-log scale, respectively. In both cases, the log-normal distribution is inferred from Non-Gibrat’s Properties and detailed balance. These analytic results are verified by empirical data. Consequently, this clarifies the notion that the difference in shapes between growth-rate distributions of sales and profits is closely related to the difference between the two Non-Gibrat’s Properties in the mid-scale range.

  11. Abundance and Relative Distribution of Frankia Host Infection Groups Under Actinorhizal Alnus glutinosa and Non-actinorhizal Betula nigra Trees.

    PubMed

    Samant, Suvidha; Huo, Tian; Dawson, Jeffrey O; Hahn, Dittmar

    2016-02-01

    Quantitative polymerase chain reaction (qPCR) was used to assess the abundance and relative distribution of host infection groups of the root-nodule forming, nitrogen-fixing actinomycete Frankia in four soils with similar physicochemical characteristics, two of which were vegetated with a host plant, Alnus glutinosa, and two with a non-host plant, Betula nigra. Analyses of DAPI-stained cells at three locations, i.e., at a distance of less than 1 m (near stem), 2.5 m (middle crown), and 3-5 m (crown edge) from the stems of both tree species revealed no statistically significant differences in abundance. Frankiae generally accounted for 0.01 to 0.04% of these cells, with values between 4 and 36 × 10^5 cells (g soil)^-1. In three out of four soils, abundance of frankiae was significantly higher at locations "near stem" and/or "middle crown" compared to "crown edge," while numbers at these locations were not different in the fourth soil. Frankiae of the Alnus host infection group were dominant in all samples, accounting for about 75% or more of the cells, with no obvious differences with distance to stem. In three of the soils, all of these cells were represented by strain Ag45/Mut15. In the fourth soil, which was vegetated with older A. glutinosa trees, about half of these cells belonged to a different subgroup represented by strain ArI3. In all soils, the remaining cells belonged to the Elaeagnus host infection group represented by strain EAN1pec. Casuarina-infective frankiae were not found. Abundance and relative distribution of Frankia host infection groups were similar in soils under the host plant A. glutinosa and the non-host plant B. nigra. The results thus did not reveal any specific effects of plant species on soil Frankia populations.

  12. Comparing methods for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Turkaya, Semih; Bodin, Thomas; Sylvander, Matthieu; Parroucau, Pierre; Manchuel, Kevin

    2017-04-01

    There are plenty of methods available for locating small-magnitude point-source earthquakes. However, it is known that these different approaches produce different results. For each approach, results also depend on a number of parameters, which can be separated into two main branches: (1) parameters related to the observations (their number and distribution, for example) and (2) parameters related to the inversion process (velocity model, weighting parameters, initial location, etc.). Currently, the results obtained from most of the location methods do not systematically include quantitative uncertainties. The effect of the selected parameters on location uncertainties is also poorly known. Understanding the importance of these different parameters and their effect on uncertainties is clearly required to better constrain knowledge of fault geometry and seismotectonic processes and, ultimately, to improve seismic hazard assessment. In this work, realized in the frame of the SINAPS@ research program (http://www.institut-seism.fr/projets/sinaps/), we analyse the effect of different parameters on earthquake location (e.g. type of phase, maximum hypocentral separation, etc.). We compare several available codes (Hypo71, HypoDD, NonLinLoc, etc.) and determine their strengths and weaknesses in different cases by means of synthetic tests. The work, performed for the moment on synthetic data, is planned to be applied, in a second step, to data collected by the Midi-Pyrénées Observatory (OMP).
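
    To make this kind of comparison concrete, here is a toy grid-search locator under a homogeneous velocity model: it scans candidate epicentres, predicts P-wave travel times to each station, and keeps the location with the smallest RMS residual. This is a didactic sketch, not the algorithm implemented in Hypo71, HypoDD, or NonLinLoc; the stations, velocity, and source are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        # toy setting: 2-D epicentre search with a homogeneous P velocity (km/s)
        vp = 6.0
        stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [35.0, 30.0]])
        true_src, true_t0 = np.array([18.0, 22.0]), 4.0

        # synthetic observed arrival times with small picking errors
        t_obs = true_t0 + np.linalg.norm(stations - true_src, axis=1) / vp
        t_obs += rng.normal(0, 0.05, size=len(stations))

        def locate(t_obs, stations, grid_step=0.5):
            """Grid search minimising the RMS travel-time residual; the origin
            time is removed by demeaning, so only the epicentre is scanned."""
            best_rms, best_xy = np.inf, None
            for x in np.arange(0.0, 50.0, grid_step):
                for y in np.arange(0.0, 50.0, grid_step):
                    tt = np.linalg.norm(stations - np.array([x, y]), axis=1) / vp
                    res = (t_obs - tt) - np.mean(t_obs - tt)
                    rms = np.sqrt(np.mean(res ** 2))
                    if rms < best_rms:
                        best_rms, best_xy = rms, (x, y)
            return best_xy, best_rms

        print(locate(t_obs, stations))   # should land close to (18, 22)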

  13. Generation mechanism of nonlinear ultrasonic Lamb waves in thin plates with randomly distributed micro-cracks.

    PubMed

    Zhao, Youxuan; Li, Feilong; Cao, Peng; Liu, Yaolu; Zhang, Jianyu; Fu, Shaoyun; Zhang, Jun; Hu, Ning

    2017-08-01

    Since the identification of micro-cracks in engineering materials is very valuable in understanding the initial and slight changes in mechanical properties of materials under complex working environments, numerical simulations of the propagation of the low-frequency S0 Lamb wave in thin plates with randomly distributed micro-cracks were performed to study the behavior of nonlinear Lamb waves. The results showed that while the influence of the randomly distributed micro-cracks on the phase velocity of the low-frequency S0 fundamental waves could be neglected, significant ultrasonic nonlinear effects caused by the randomly distributed micro-cracks were discovered, presenting mainly as second harmonic generation. By using a Monte Carlo simulation method, we found that the acoustic nonlinear parameter increased linearly with the micro-crack density and the size of the micro-crack zone, and that it was also related to the excitation frequency and the friction coefficient of the micro-crack surfaces. In addition, it was found that the nonlinear effect of waves reflected by the micro-cracks was more noticeable than that of the transmitted waves. This study theoretically reveals that the low-frequency S0 mode of Lamb waves can be used as the fundamental wave to quantitatively identify micro-cracks in thin plates. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Full-wave simulations of ICRF heating regimes in toroidal plasma with non-Maxwellian distribution functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertelli, N.; Valeo, E. J.; Green, D. L.

    At the power levels required for significant heating and current drive in magnetically-confined toroidal plasma, modification of the particle distribution function from a Maxwellian shape is likely (Stix 1975 Nucl. Fusion 15 737), with consequent changes in wave propagation and in the location and amount of absorption. In order to study these effects computationally, both the finite-Larmor-radius and the high-harmonic fast wave (HHFW) versions of the full-wave, hot-plasma toroidal simulation code TORIC (Brambilla 1999 Plasma Phys. Control. Fusion 41 1 and Brambilla 2002 Plasma Phys. Control. Fusion 44 2423), have been extended to allow the prescription of arbitrary velocity distributions of the form f(v∥, v⊥, ψ, θ). For hydrogen (H) minority heating of a deuterium (D) plasma with anisotropic Maxwellian H distributions, the fractional H absorption varies significantly with changes in parallel temperature but is essentially independent of perpendicular temperature. On the other hand, for the HHFW regime with an anisotropic Maxwellian fast ion distribution, the fractional beam ion absorption varies mainly with changes in the perpendicular temperature. The evaluation of the wave-field and power absorption, through the full wave solver, with the ion distribution function provided by either a Monte-Carlo particle code or a Fokker-Planck code is also examined for Alcator C-Mod and NSTX plasmas. Non-Maxwellian effects generally tend to increase the absorption with respect to the equivalent Maxwellian distribution.

  15. Full-wave simulations of ICRF heating regimes in toroidal plasma with non-Maxwellian distribution functions

    DOE PAGES

    Bertelli, N.; Valeo, E. J.; Green, D. L.; ...

    2017-04-03

    At the power levels required for significant heating and current drive in magnetically-confined toroidal plasma, modification of the particle distribution function from a Maxwellian shape is likely (Stix 1975 Nucl. Fusion 15 737), with consequent changes in wave propagation and in the location and amount of absorption. In order to study these effects computationally, both the finite-Larmor-radius and the high-harmonic fast wave (HHFW) versions of the full-wave, hot-plasma toroidal simulation code TORIC (Brambilla 1999 Plasma Phys. Control. Fusion 41 1 and Brambilla 2002 Plasma Phys. Control. Fusion 44 2423), have been extended to allow the prescription of arbitrary velocity distributions of the form f(v∥, v⊥, ψ, θ). For hydrogen (H) minority heating of a deuterium (D) plasma with anisotropic Maxwellian H distributions, the fractional H absorption varies significantly with changes in parallel temperature but is essentially independent of perpendicular temperature. On the other hand, for the HHFW regime with an anisotropic Maxwellian fast ion distribution, the fractional beam ion absorption varies mainly with changes in the perpendicular temperature. The evaluation of the wave-field and power absorption, through the full wave solver, with the ion distribution function provided by either a Monte-Carlo particle code or a Fokker-Planck code is also examined for Alcator C-Mod and NSTX plasmas. Non-Maxwellian effects generally tend to increase the absorption with respect to the equivalent Maxwellian distribution.

  16. Random walks with random velocities.

    PubMed

    Zaburdaev, Vasily; Schmiedeberg, Michael; Stark, Holger

    2008-07-01

    We consider a random walk model that takes into account the velocity distribution of random walkers. Random motion with alternating velocities is inherent to various physical and biological systems. Moreover, the velocity distribution is often the first characteristic that is experimentally accessible. Here, we derive transport equations describing the dispersal process in the model and solve them analytically. The asymptotic properties of solutions are presented in the form of a phase diagram that shows all possible scaling regimes, including superdiffusive, ballistic, and superballistic motion. The theoretical results of this work are in excellent agreement with accompanying numerical simulations.
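
    A small illustrative simulation in the spirit of such models (not the paper's transport equations): on each flight the walker draws a velocity from a chosen distribution and moves with it for an exponentially distributed flight time, and the mean squared displacement is then estimated numerically. All distributions and parameters here are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_msd(n_walkers=2000, n_flights=200, mean_flight_time=1.0):
            """1-D random walk with alternating velocities: every flight has a random
            duration and a velocity drawn anew from a Gaussian distribution."""
            x = np.zeros(n_walkers)
            t = np.zeros(n_walkers)
            for _ in range(n_flights):
                tau = rng.exponential(mean_flight_time, n_walkers)   # flight durations
                v = rng.normal(0.0, 1.0, n_walkers)                  # random velocities
                x += v * tau
                t += tau
            return np.mean(x ** 2), np.mean(t)

        msd, mean_t = simulate_msd()
        print("mean squared displacement:", msd)
        print("MSD / t (constant for diffusive scaling):", msd / mean_t)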

  17. Residual Defect Density in Random Disks Deposits.

    PubMed

    Topic, Nikola; Pöschel, Thorsten; Gallas, Jason A C

    2015-08-03

    We investigate the residual distribution of structural defects in very tall packings of disks deposited randomly in large channels. By performing simulations involving the sedimentation of up to 50 × 10^9 particles we find all deposits to consistently show a non-zero residual density of defects obeying a characteristic power law as a function of the channel width. This remarkable finding corrects the widespread belief that the density of defects should vanish algebraically with growing height. A non-zero residual density of defects implies a type of long-range spatial order in the packing, as opposed to only local ordering. In addition, we find deposits of particles to involve considerably less randomness than generally presumed.

  18. Multiwavelength generation in a random distributed feedback fiber laser using an all fiber Lyot filter.

    PubMed

    Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V

    2014-02-10

    Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.

  19. Spatiotemporal Distribution of Location and Object Effects in Primary Motor Cortex Neurons during Reach-to-Grasp

    PubMed Central

    Rouse, Adam G.

    2016-01-01

    Reaching and grasping typically are considered to be spatially separate processes that proceed concurrently in the arm and the hand, respectively. The proximal representation in the primary motor cortex (M1) controls the arm for reaching, while the distal representation controls the hand for grasping. Many studies of M1 activity therefore have focused either on reaching to various locations without grasping different objects, or else on grasping different objects all at the same location. Here, we recorded M1 neurons in the anterior bank and lip of the central sulcus as monkeys performed more naturalistic movements, reaching toward, grasping, and manipulating four different objects in up to eight different locations. We quantified the extent to which variation in firing rates depended on location, on object, and on their interaction—all as a function of time. Activity proceeded largely in two sequential phases: the first related predominantly to the location to which the upper extremity reached, and the second related to the object about to be grasped. Both phases involved activity distributed widely throughout the sampled territory, spanning both the proximal and the distal upper extremity representation in caudal M1. Our findings indicate that naturalistic reaching and grasping, rather than being spatially segregated processes that proceed concurrently, each are spatially distributed processes controlled by caudal M1 in large part sequentially. Rather than neuromuscular processes separated in space but not time, reaching and grasping are separated more in time than in space. SIGNIFICANCE STATEMENT Reaching and grasping typically are viewed as processes that proceed concurrently in the arm and hand, respectively. The arm region in the primary motor cortex (M1) is assumed to control reaching, while the hand region controls grasping. During naturalistic reach–grasp–manipulate movements, we found, however, that neuron activity proceeds largely in two sequential

  20. Differential Location and Distribution of Hepatic Immune Cells

    PubMed Central

    Freitas-Lopes, Maria Alice; Mafra, Kassiana; David, Bruna A.; Carvalho-Gontijo, Raquel; Menezes, Gustavo B.

    2017-01-01

    The liver is one of the main organs in the body, performing several metabolic and immunological functions that are indispensable to the organism. The liver is strategically positioned in the abdominal cavity between the intestine and the systemic circulation. Due to its location, the liver is continually exposed to nutritional insults, microbiota products from the intestinal tract, and to toxic substances. Hepatocytes are the major functional constituents of the hepatic lobes, and perform most of the liver’s secretory and synthesizing functions, although another important cell population sustains the vitality of the organ: the hepatic immune cells. Liver immune cells play a fundamental role in host immune responses and exquisite mechanisms are necessary to govern the density and the location of the different hepatic leukocytes. Here we discuss the location of these pivotal cells within the different liver compartments, and how their frequency and tissular location can dictate the fate of liver immune responses. PMID:29215603

  1. Distribution of diameters for Erdős-Rényi random graphs.

    PubMed

    Hartmann, A K; Mézard, M

    2018-03-01

    We study the distribution of diameters d of Erdős-Rényi random graphs with average connectivity c. The diameter d is the maximum among all the shortest distances between pairs of nodes in a graph and an important quantity for all dynamic processes taking place on graphs. Here we study the distribution P(d) numerically for various values of c, in the nonpercolating and percolating regimes. Using large-deviation techniques, we are able to reach small probabilities like 10^{-100} which allow us to obtain the distribution over basically the full range of the support, for graphs up to N=1000 nodes. For values c<1, our results are in good agreement with analytical results, proving the reliability of our numerical approach. For c>1 the distribution is more complex and no complete analytical results are available. For this parameter range, P(d) exhibits an inflection point, which we found to be related to a structural change of the graphs. For all values of c, we determined the finite-size rate function Φ(d/N) and were able to extrapolate numerically to N→∞, indicating that the large-deviation principle holds.
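
    A simple-sampling sketch of the quantity studied above (it cannot, of course, reach the 10^{-100} probabilities accessible to large-deviation techniques): generate Erdős-Rényi graphs with average connectivity c and histogram the diameter. networkx is assumed, and taking the diameter of the largest connected component is one convention for disconnected graphs, not necessarily the paper's.

        import collections
        import networkx as nx

        def diameter_distribution(n=200, c=2.0, samples=500, seed=0):
            """Empirical distribution of the diameter of the largest component
            of G(n, p) graphs with average connectivity c = p * (n - 1)."""
            p = c / (n - 1)
            counts = collections.Counter()
            for i in range(samples):
                g = nx.erdos_renyi_graph(n, p, seed=seed + i)
                giant = g.subgraph(max(nx.connected_components(g), key=len))
                counts[nx.diameter(giant)] += 1
            total = sum(counts.values())
            return {d: k / total for d, k in sorted(counts.items())}

        print(diameter_distribution())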

  2. Distribution of diameters for Erdős-Rényi random graphs

    NASA Astrophysics Data System (ADS)

    Hartmann, A. K.; Mézard, M.

    2018-03-01

    We study the distribution of diameters d of Erdős-Rényi random graphs with average connectivity c. The diameter d is the maximum among all the shortest distances between pairs of nodes in a graph and an important quantity for all dynamic processes taking place on graphs. Here we study the distribution P(d) numerically for various values of c, in the nonpercolating and percolating regimes. Using large-deviation techniques, we are able to reach small probabilities like 10^{-100} which allow us to obtain the distribution over basically the full range of the support, for graphs up to N = 1000 nodes. For values c < 1, our results are in good agreement with analytical results, proving the reliability of our numerical approach. For c > 1 the distribution is more complex and no complete analytical results are available. For this parameter range, P(d) exhibits an inflection point, which we found to be related to a structural change of the graphs. For all values of c, we determined the finite-size rate function Φ(d/N) and were able to extrapolate numerically to N → ∞, indicating that the large-deviation principle holds.

  3. For love of the profession. Award recipients exemplify the best in our field.

    PubMed

    Bachrach, J David; Kious, A Gus; Milburn, B Jeffrey; Barlow, D Scott

    2007-01-01

    Award recipients exemplify the best in our field. Read about the extraordinary achievements that earned three individuals and one organization coveted ACMPE-MGMA awards for 2007. MGMA Connexion spoke with the recipients of the Harry J. Harwick Lifetime Achievement Award, the Medical Practice Executive of the Year Award, the Fred Graham Award for Innovation in Improving Community Health and the Edward B. Stevens Article of the Year Award to learn what motivates and inspires them.

  4. Comparison of the distribution of non-AIDS Kaposi's sarcoma and non-Hodgkin's lymphoma in Europe

    PubMed Central

    Maso, L Dal; Franceschi, S; Re, A Lo; Vecchia, C La

    1999-01-01

    To evaluate whether some form of mild immunosuppression may influence the geographical distribution of non-AIDS Kaposi's sarcoma (KS), we correlated incidence rates of KS and non-Hodgkin's lymphoma in individuals aged 60 or more in 18 European countries and Israel. Significant positive correlations emerged but, within the highest risk countries (i.e. Italy and Israel), internal correlations were inconsistent. © 1999 Cancer Research Campaign PMID:10408708

  5. Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields

    NASA Astrophysics Data System (ADS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-07-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.

  6. Clustering, randomness, and regularity in cloud fields. 4: Stratocumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-01-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (more than 900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.

  7. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) is widely used to estimate the heterogeneous spatial distribution of hydraulic properties of an aquifer by inverting field-site pumping tests, and such inversions have shown that most field-site aquifers have heterogeneous hydrogeological parameter fields. Among the proposed HT approaches, Huang et al. [2011] used a non-redundant verification analysis, in which the pumping wells change location while the observation wells remain fixed for both the inverse and the forward analyses, to demonstrate the feasibility of estimating the heterogeneous distribution of hydraulic properties of a field-site aquifer with a steady-state model. The existing literature, however, covers only steady-state, non-redundant verification with changed pumping-well locations and fixed observation wells for the inverse and forward analyses; the various combinations of fixed or changed pumping-well locations with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification), and their influence on the HT method, have not yet been explored. In this study, we carry out both the redundant and the non-redundant verification of the forward analysis to examine their influence on the HT method under transient conditions, and we apply the approach to an actual case at the NYUST campus site to demonstrate the effectiveness of HT and to confirm the feasibility of the inverse and forward analyses from the results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  8. Current-wave spectra coupling project. Volume III. Cumulative distribution of forces on structures subjected to the combined action of currents and random waves for potential OTEC sites: (A) Keahole Point, Hawaii, 100 year hurricane; (B) Punta Tuna, Puerto Rico, 100 year hurricane; (C) New Orleans, Louisiana, 100 year hurricane; (D) West Coast of Florida, 100 year hurricane. [CUFOR code]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venezian, G.; Bretschneider, C.L.

    1980-08-01

    This volume details a new methodology to analyze statistically the forces experienced by a structure at sea. Conventionally a wave climate is defined using a spectral function. The wave climate is described using a joint distribution of wave heights and periods (wave lengths), characterizing actual sea conditions through some measured or estimated parameters like the significant wave height, maximum spectral density, etc. Random wave heights and periods satisfying the joint distribution are then generated. Wave kinetics are obtained using linear or non-linear theory. In the case of currents a linear wave-current interaction theory of Venezian (1979) is used. The peak force experienced by the structure for each individual wave is identified. Finally, the probability of exceedance of any given peak force on the structure may be obtained. A three-parameter Longuet-Higgins type joint distribution of wave heights and periods is discussed in detail. This joint distribution was used to model sea conditions at four potential OTEC locations. A uniform cylindrical pipe of 3 m diameter, extending to a depth of 550 m was used as a sample structure. Wave-current interactions were included and forces computed using Morison's equation. The drag and virtual mass coefficients were interpolated from published data. A Fortran program CUFOR was written to execute the above procedure. Tabulated and graphic results of peak forces experienced by the structure, for each location, are presented. A listing of CUFOR is included. Considerable flexibility of structural definition has been incorporated. The program can easily be modified in the case of an alternative joint distribution or for inclusion of effects like non-linearity of waves, transverse forces and diffraction.
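
    The per-wave force calculation described above can be sketched as follows. This is an illustrative computation of the peak inline Morison force on a vertical cylinder under linear deep-water wave kinematics, not the CUFOR code itself; the coefficient values and function names are assumptions.

      # Illustrative sketch (not the CUFOR code): peak inline Morison force on a
      # vertical cylinder for a single linear deep-water wave, integrated over depth.
      # F(t) = integral over z of [0.5*rho*Cd*D*u|u| + rho*Cm*(pi*D^2/4)*du/dt] dz
      import numpy as np

      def peak_morison_force(H, T, depth=550.0, D=3.0, rho=1025.0, Cd=1.0, Cm=2.0):
          g = 9.81
          omega = 2.0 * np.pi / T
          k = omega ** 2 / g                       # deep-water dispersion (assumption)
          z = np.linspace(-depth, 0.0, 400)        # vertical stations along the pipe
          forces = []
          for t in np.linspace(0.0, T, 200):       # scan one wave period
              u = 0.5 * H * omega * np.exp(k * z) * np.cos(omega * t)        # velocity
              a = -0.5 * H * omega ** 2 * np.exp(k * z) * np.sin(omega * t)  # acceleration
              f = 0.5 * rho * Cd * D * u * np.abs(u) + rho * Cm * np.pi * D ** 2 / 4.0 * a
              forces.append(np.trapz(f, z))        # total inline force at this instant
          return max(np.abs(forces))

      print(peak_morison_force(H=10.0, T=12.0))    # peak force in newtons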

  9. Continuous-variable quantum key distribution in non-Markovian channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasile, Ruggero; Olivares, Stefano; CNISM, Unita di Ricerca di Milano Universita, I-20133 Milano

    2011-04-15

    We address continuous-variable quantum key distribution (QKD) in non-Markovian lossy channels and show how the non-Markovian features may be exploited to enhance security and/or to detect the presence and the position of an eavesdropper along the transmission line. In particular, we suggest a coherent-state QKD protocol which is secure against Gaussian individual attacks based on optimal 1{yields}2 asymmetric cloning machines for arbitrarily low values of the overall transmission line. The scheme relies on specific non-Markovian properties, and cannot be implemented in ordinary Markovian channels characterized by uniform losses. Our results give a clear indication of the potential impact of non-Markovian effects in QKD.

  10. Non-uniform dose distributions in cranial radiation therapy

    NASA Astrophysics Data System (ADS)

    Bender, Edward T.

    Radiation treatments are often delivered to patients with brain metastases. For those patients who receive radiation to the entire brain, there is a risk of long-term neuro-cognitive side effects, which may be due to damage to the hippocampus. In clinical MRI and CT scans it can be difficult to identify the hippocampus, but once identified it can be partially spared from radiation dose. Using deformable image registration we demonstrate a semi-automatic technique for obtaining an estimated location of this structure in a clinical MRI or CT scan. Deformable image registration is a useful tool in other areas such as adaptive radiotherapy, where the radiation oncology team monitors patients during the course of treatment and adjusts the radiation treatments if necessary when the patient anatomy changes. Deformable image registration is used in this setting, but there is a considerable level of uncertainty. This work represents one of many possible approaches at investigating the nature of these uncertainties utilizing consistency metrics. We will show that metrics such as the inverse consistency error correlate with actual registration uncertainties. Specifically relating to brain metastases, this work investigates where in the brain metastases are likely to form, and how the primary cancer site is related. We will show that the cerebellum is at high risk for metastases and that non-uniform dose distributions may be advantageous when delivering prophylactic cranial irradiation for patients with small cell lung cancer in complete remission.

  11. Three-dimensional direct laser written graphitic electrical contacts to randomly distributed components

    NASA Astrophysics Data System (ADS)

    Dorin, Bryce; Parkinson, Patrick; Scully, Patricia

    2018-04-01

    The development of cost-effective electrical packaging for randomly distributed micro/nano-scale devices is a widely recognized challenge for fabrication technologies. Three-dimensional direct laser writing (DLW) has been proposed as a solution to this challenge, and has enabled the creation of rapid and low resistance graphitic wires within commercial polyimide substrates. In this work, we utilize the DLW technique to electrically contact three fully encapsulated and randomly positioned light-emitting diodes (LEDs) in a one-step process. The resolution of the contacts is on the order of 20 μm, with an average circuit resistance of 29 ± 18 kΩ per LED contacted. The speed and simplicity of this technique are promising for meeting the needs of future microelectronics and device packaging.

  12. Measuring Symmetry, Asymmetry and Randomness in Neural Network Connectivity

    PubMed Central

    Esposito, Umberto; Giugliano, Michele; van Rossum, Mark; Vasilaki, Eleni

    2014-01-01

    Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. within a pair of neurons only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution and that introducing a random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community, by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists that investigate symmetry of network connectivity. PMID:25006663

  13. Measuring symmetry, asymmetry and randomness in neural network connectivity.

    PubMed

    Esposito, Umberto; Giugliano, Michele; van Rossum, Mark; Vasilaki, Eleni

    2014-01-01

    Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. within a pair of neurons only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution and that introducing a random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community, by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists that investigate symmetry of network connectivity.
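
    The following sketch illustrates the general idea of a thresholding-free symmetry measure for a weight matrix, comparing the norms of its symmetric and antisymmetric parts and adding a shuffled-weight chance baseline; it is a hypothetical stand-in for, not a reproduction of, the authors' exact statistic.

      # Hypothetical sketch (not the authors' exact statistic): a thresholding-free
      # symmetry index based on the symmetric and antisymmetric parts of W, plus a
      # shuffled-weight baseline that plays the role of a chance-level reference.
      import numpy as np

      def symmetry_index(W):
          W = np.array(W, dtype=float)
          np.fill_diagonal(W, 0.0)                 # ignore self-connections
          S = 0.5 * (W + W.T)                      # symmetric (bidirectional) part
          A = 0.5 * (W - W.T)                      # antisymmetric (unidirectional) part
          return (np.linalg.norm(S) - np.linalg.norm(A)) / (np.linalg.norm(S) + np.linalg.norm(A))

      def chance_baseline(W, n_shuffles=500, seed=0):
          rng = np.random.default_rng(seed)
          W = np.array(W, dtype=float)
          off = ~np.eye(W.shape[0], dtype=bool)
          vals = []
          for _ in range(n_shuffles):
              P = W.copy()
              P[off] = rng.permutation(P[off])     # shuffle off-diagonal weights
              vals.append(symmetry_index(P))
          return float(np.mean(vals)), float(np.std(vals))

      W = np.random.default_rng(1).normal(size=(50, 50))
      print(symmetry_index(W), chance_baseline(W))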

  14. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# offers two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers and submits serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster shows how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1,200 hours for all 10 million). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (from 1,200 hours to 60 hours).
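
    The "one computer with multiple cores" profile described above can be sketched as an embarrassingly parallel evaluation of forward-model runs; the forward model below is a placeholder for the HYDRUS coupling, and all names and parameter ranges are illustrative assumptions.

      # Minimal sketch of the "one computer, multiple cores" profile: evaluate many
      # forward-model runs in parallel. The forward model is a placeholder for the
      # HYDRUS coupling; all names and parameter ranges are illustrative.
      import multiprocessing as mp
      import numpy as np

      def forward_model(params):
          # Placeholder for a HYDRUS-like simulation returning a synthetic observation.
          Ks, theta_r, alpha, n = params
          return Ks * (1.0 + theta_r) / (alpha * n)

      def run_batch(parameter_sets, processes=8):
          with mp.Pool(processes) as pool:
              return pool.map(forward_model, parameter_sets)

      if __name__ == "__main__":
          rng = np.random.default_rng(42)
          samples = [tuple(rng.uniform(0.1, 1.0, size=4)) for _ in range(10000)]
          results = run_batch(samples, processes=4)
          print(len(results), results[0])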

  15. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
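
    The optimization target referred to above, Otsu's between-class variance for multiple thresholds, can be sketched as follows; for brevity the search uses plain random sampling of threshold sets rather than the paper's flower pollination algorithm, and the histogram is synthetic.

      # Sketch of the optimization target only: Otsu's between-class variance for a
      # set of thresholds, maximized here by plain random search instead of the
      # paper's flower pollination algorithm. The histogram is synthetic.
      import numpy as np

      def otsu_objective(hist, thresholds):
          p = hist / hist.sum()
          levels = np.arange(len(hist))
          mu_total = (p * levels).sum()
          edges = [0] + sorted(int(t) for t in thresholds) + [len(hist)]
          score = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              w = p[lo:hi].sum()
              if w > 0:
                  mu = (p[lo:hi] * levels[lo:hi]).sum() / w
                  score += w * (mu - mu_total) ** 2          # between-class variance term
          return score

      def random_search(hist, k=3, iters=5000, seed=1):
          rng = np.random.default_rng(seed)
          best, best_score = None, -np.inf
          for _ in range(iters):
              t = rng.choice(np.arange(1, len(hist)), size=k, replace=False)
              s = otsu_objective(hist, t)
              if s > best_score:
                  best, best_score = sorted(int(x) for x in t), s
          return best, best_score

      hist, _ = np.histogram(np.random.default_rng(0).integers(0, 256, 10000), bins=256, range=(0, 256))
      print(random_search(hist, k=3, iters=2000))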

  16. Specialty resident perceptions of the impact of a distributed education model on practice location intentions.

    PubMed

    Myhre, Douglas L; Adamiak, Paul J; Pedersen, Jeanette S

    2015-01-01

    There is an increased focus internationally on the social mandate of postgraduate training programs. This study explores specialty residents' perceptions of the impact of the University of Calgary's (UC) distributed education rotations on their self-perceived likelihood of practice location, and if this effect is influenced by resident specialty or stage of program. Residents participating in the UC Distributed Royal College Initiative (DistRCI) between July 2010 and June 2013 completed an online survey following their rotation. Descriptive statistics and student's t-test were employed to analyze quantitative survey data, and a constant comparative approach was used to analyze free text qualitative responses. Residents indicated they were satisfied with the program (92%), and that the distributed rotations significantly increased their self-reported likelihood of practicing in smaller centers (p < 0.05). The findings suggest that the shift in attitude is independent of discipline, program year, and logistical experiences of living at the distributed sites, and is consistent across multiple cohorts over several academic years. The findings highlight the value of a distributed education program in contributing to future practice and career development, and its relevance in the social accountability of postgraduate programs.

  17. Kinetic corrections from analytic non-Maxwellian distribution functions in magnetized plasmas

    NASA Astrophysics Data System (ADS)

    Izacard, Olivier

    2016-08-01

    In magnetized plasma physics, almost all developed analytic theories assume a Maxwellian distribution function (MDF) and in some cases small deviations are described using the perturbation theory. The deviations with respect to the Maxwellian equilibrium, called kinetic effects, are required to be taken into account especially for fusion reactor plasmas. Generally, because the perturbation theory is not consistent with observed steady-state non-Maxwellians, these kinetic effects are numerically evaluated by very central processing unit (CPU)-expensive codes, avoiding the analytic complexity of velocity phase space integrals. We develop here a new method based on analytic non-Maxwellian distribution functions constructed from non-orthogonal basis sets in order to (i) use as few parameters as possible, (ii) increase the efficiency to model numerical and experimental non-Maxwellians, (iii) help to understand unsolved problems such as diagnostics discrepancies from the physical interpretation of the parameters, and (iv) obtain analytic corrections due to kinetic effects given by a small number of terms and removing the numerical error of the evaluation of velocity phase space integrals. This work does not attempt to derive new physical effects even if it could be possible to discover one from the better understandings of some unsolved problems, but here we focus on the analytic prediction of kinetic corrections from analytic non-Maxwellians. As applications, examples of analytic kinetic corrections are shown for the secondary electron emission, the Langmuir probe characteristic curve, and the entropy. This is done by using three analytic representations of the distribution function: the Kappa distribution function, the bi-modal or a new interpreted non-Maxwellian distribution function (INMDF). The existence of INMDFs is proved by new understandings of the experimental discrepancy of the measured electron temperature between two diagnostics in JET. As main results, it

  18. Kinetic corrections from analytic non-Maxwellian distribution functions in magnetized plasmas

    DOE PAGES

    Izacard, Olivier

    2016-08-02

    In magnetized plasma physics, almost all developed analytic theories assume a Maxwellian distribution function (MDF) and in some cases small deviations are described using the perturbation theory. The deviations with respect to the Maxwellian equilibrium, called kinetic effects, are required to be taken into account especially for fusion reactor plasmas. Generally, because the perturbation theory is not consistent with observed steady-state non-Maxwellians, these kinetic effects are numerically evaluated by very central processing unit (CPU)-expensive codes, avoiding the analytic complexity of velocity phase space integrals. We develop here a new method based on analytic non-Maxwellian distribution functions constructed from non-orthogonal basis sets in order to (i) use as few parameters as possible, (ii) increase the efficiency to model numerical and experimental non-Maxwellians, (iii) help to understand unsolved problems such as diagnostics discrepancies from the physical interpretation of the parameters, and (iv) obtain analytic corrections due to kinetic effects given by a small number of terms and removing the numerical error of the evaluation of velocity phase space integrals. This work does not attempt to derive new physical effects even if it could be possible to discover one from the better understandings of some unsolved problems, but here we focus on the analytic prediction of kinetic corrections from analytic non-Maxwellians. As applications, examples of analytic kinetic corrections are shown for the secondary electron emission, the Langmuir probe characteristic curve, and the entropy. This is done by using three analytic representations of the distribution function: the Kappa distribution function, the bi-modal or a new interpreted non-Maxwellian distribution function (INMDF). The existence of INMDFs is proved by new understandings of the experimental discrepancy of the measured electron temperature between two diagnostics in JET. As main

  19. Kinetic corrections from analytic non-Maxwellian distribution functions in magnetized plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izacard, Olivier, E-mail: izacard@llnl.gov

    In magnetized plasma physics, almost all developed analytic theories assume a Maxwellian distribution function (MDF) and in some cases small deviations are described using the perturbation theory. The deviations with respect to the Maxwellian equilibrium, called kinetic effects, are required to be taken into account especially for fusion reactor plasmas. Generally, because the perturbation theory is not consistent with observed steady-state non-Maxwellians, these kinetic effects are numerically evaluated by very central processing unit (CPU)-expensive codes, avoiding the analytic complexity of velocity phase space integrals. We develop here a new method based on analytic non-Maxwellian distribution functions constructed from non-orthogonal basis sets in order to (i) use as few parameters as possible, (ii) increase the efficiency to model numerical and experimental non-Maxwellians, (iii) help to understand unsolved problems such as diagnostics discrepancies from the physical interpretation of the parameters, and (iv) obtain analytic corrections due to kinetic effects given by a small number of terms and removing the numerical error of the evaluation of velocity phase space integrals. This work does not attempt to derive new physical effects even if it could be possible to discover one from the better understandings of some unsolved problems, but here we focus on the analytic prediction of kinetic corrections from analytic non-Maxwellians. As applications, examples of analytic kinetic corrections are shown for the secondary electron emission, the Langmuir probe characteristic curve, and the entropy. This is done by using three analytic representations of the distribution function: the Kappa distribution function, the bi-modal or a new interpreted non-Maxwellian distribution function (INMDF). The existence of INMDFs is proved by new understandings of the experimental discrepancy of the measured electron temperature between two diagnostics in JET. As main

  20. [Non-randomized evaluation studies (TREND)].

    PubMed

    Vallvé, Carles; Artés, Maite; Cobo, Erik

    2005-12-01

    Nonrandomized intervention trials are needed when randomized clinical trials cannot be performed. To report the results from nonrandomized intervention studies transparently, the TREND (Transparent Reporting of Evaluations with Nonrandomized Designs) checklist should be used. This implies that nonrandomized studies should follow the remaining methodological tools usually employed in randomized trials and that the uncertainty introduced by the allocation mechanism should be explicitly reported and, if possible, quantified.

  1. Locating illicit connections in storm water sewers using fiber-optic distributed temperature sensing.

    PubMed

    Hoes, O A C; Schilperoort, R P S; Luxemburg, W M J; Clemens, F H L R; van de Giesen, N C

    2009-12-01

    A new technique using distributed temperature sensing (DTS) has been developed to find illicit household sewage connections to storm water systems in the Netherlands. DTS allows for the accurate measurement of temperature along a fiber-optic cable, with high spatial (2 m) and temporal (30 s) resolution. We inserted a fiber-optic cable of 1300 m in two storm water drains. At certain locations, significant temperature differences with an intermittent character were measured, indicating inflow of water that was not storm water. In all cases, we found that foul water from households or companies entered the storm water system through an illicit sewage connection. The method of using temperature differences for illicit connection detection in storm water networks is discussed. The technique of using fiber-optic cables for distributed temperature sensing is explained in detail. The DTS method is a reliable, inexpensive and practically feasible method to detect illicit connections to storm water systems, which does not require access to private property.

  2. Non-specific filtering of beta-distributed data.

    PubMed

    Wang, Xinhui; Laird, Peter W; Hinoue, Toshinori; Groshen, Susan; Siegmund, Kimberly D

    2014-06-19

    Non-specific feature selection is a dimension reduction procedure performed prior to cluster analysis of high dimensional molecular data. Not all measured features are expected to show biological variation, so only the most varying are selected for analysis. In DNA methylation studies, DNA methylation is measured as a proportion, bounded between 0 and 1, with variance a function of the mean. Filtering on standard deviation biases the selection of probes to those with mean values near 0.5. We explore the effect this has on clustering, and develop alternate filter methods that utilize a variance stabilizing transformation for Beta distributed data and do not share this bias. We compared results for 11 different non-specific filters on eight Infinium HumanMethylation data sets, selected to span a variety of biological conditions. We found that for data sets having a small fraction of samples showing abnormal methylation of a subset of normally unmethylated CpGs, a characteristic of the CpG island methylator phenotype in cancer, a novel filter statistic that utilized a variance-stabilizing transformation for Beta distributed data outperformed the common filter of using standard deviation of the DNA methylation proportion, or its log-transformed M-value, in its ability to detect the cancer subtype in a cluster analysis. However, the standard deviation filter always performed among the best for distinguishing subgroups of normal tissue. The novel filter and standard deviation filter tended to favour features in different genome contexts; for the same data set, the novel filter always selected more features from CpG island promoters and the standard deviation filter always selected more features from non-CpG island intergenic regions. Interestingly, despite selecting largely non-overlapping sets of features, the two filters did find sample subsets that overlapped for some real data sets. We found two different filter statistics that tended to prioritize features with
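
    A minimal sketch of the kind of non-specific filter discussed above, assuming numpy: it ranks features by standard deviation either on the raw beta scale or after an arcsine-square-root variance-stabilizing transform for proportions. The paper's exact filter statistic may differ; this is illustrative only.

      # Illustrative sketch (not the paper's exact statistic): rank features by
      # standard deviation on the raw beta scale, or after an arcsine-square-root
      # variance-stabilizing transform appropriate for proportions in [0, 1].
      import numpy as np

      def top_features_by_sd(beta, k=1000, stabilize=True):
          # beta: features x samples matrix of methylation proportions
          x = np.arcsin(np.sqrt(beta)) if stabilize else beta
          sd = x.std(axis=1)
          return np.argsort(sd)[::-1][:k]          # indices of the k most variable features

      rng = np.random.default_rng(0)
      beta = rng.beta(a=0.5, b=0.5, size=(20000, 50))
      print(top_features_by_sd(beta, k=5, stabilize=True))
      print(top_features_by_sd(beta, k=5, stabilize=False))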

  3. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.

  4. The First Order Correction to the Exit Distribution for Some Random Walks

    NASA Astrophysics Data System (ADS)

    Kennedy, Tom

    2016-07-01

    We study three different random walk models on several two-dimensional lattices by Monte Carlo simulations. One is the usual nearest neighbor random walk. Another is the nearest neighbor random walk which is not allowed to backtrack. The final model is the smart kinetic walk. For all three of these models the distribution of the point where the walk exits a simply connected domain D in the plane converges weakly to harmonic measure on ∂D as the lattice spacing δ → 0. Let ω(0,·;D) be harmonic measure for D, and let ω_δ(0,·;D) be the discrete harmonic measure for one of the random walk models. Our definition of the random walk models is unusual in that we average over the orientation of the lattice with respect to the domain. We are interested in the limit of (ω_δ(0,·;D) − ω(0,·;D))/δ. Our Monte Carlo simulations of the three models lead to the conjecture that this limit equals c_{M,L} ρ_D(z) times Lebesgue measure with respect to arc length along the boundary, where the function ρ_D(z) depends on the domain, but not on the model or lattice, and the constant c_{M,L} depends on the model and on the lattice, but not on the domain. So there is a form of universality for this first order correction. We also give an explicit formula for the conjectured density ρ_D.
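
    A minimal Monte Carlo sketch of the baseline object in this study, the exit distribution of the ordinary nearest neighbor walk from a disk on a square lattice: it only checks weak convergence to harmonic measure (uniform on the circle for walks started at the origin) and does not resolve the first order correction; parameter values are chosen for speed, not accuracy.

      # Minimal Monte Carlo sketch: exit distribution of the nearest neighbor walk
      # from the unit disk on a square lattice with spacing delta. As delta -> 0 this
      # converges weakly to harmonic measure (uniform on the circle for walks started
      # at the origin); the first order correction itself is not resolved here.
      import numpy as np

      def exit_angles(delta=0.05, walks=5000, seed=0):
          rng = np.random.default_rng(seed)
          steps = delta * np.array([(1, 0), (-1, 0), (0, 1), (0, -1)], dtype=float)
          angles = np.empty(walks)
          for i in range(walks):
              pos = np.zeros(2)                    # start at the origin
              while pos @ pos < 1.0:               # still inside the unit disk
                  pos = pos + steps[rng.integers(4)]
              angles[i] = np.arctan2(pos[1], pos[0])
          return angles

      hist, _ = np.histogram(exit_angles(), bins=36, range=(-np.pi, np.pi), density=True)
      print(hist)                                  # roughly flat for the disk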

  5. Non-thermal electron distribution functions through 3D magnetic reconnection instabilities in the solar wind

    NASA Astrophysics Data System (ADS)

    Alejandro Munoz Sepulveda, Patricio; Buechner, Joerg

    2017-04-01

    The effects of kinetic instabilities on the solar wind electron velocity distribution functions (eVDFs) are reasonably well understood under locally homogeneous and stationary conditions. But the solar wind also contains current sheets, which affect the local properties of instabilities and turbulence and thus the observed non-Maxwellian features in the eVDFs. Those processes are largely unexplored. We therefore aim to investigate the influence of self-consistently generated turbulence, driven by electron-scale instabilities in reconnecting current sheets, on the formation of suprathermal features in the eVDFs. To this end, we carry out 3D fully kinetic Particle-in-Cell numerical simulations of force-free current sheets with a guide magnetic field. We find extended tails, anisotropic plateaus and non-gyrotropic features in the eVDFs, correlated with the locations and times where micro-turbulence is enhanced in the current sheet due to current-aligned streaming instabilities. We also discuss the influence of plasma parameters, such as the ion-to-electron temperature ratio, on the excitation of current sheet instabilities and their effect on the properties of the eVDFs.

  6. Predictions for an invaded world: A strategy to predict the distribution of native and non-indigenous species at multiple scales

    USGS Publications Warehouse

    Reusser, D.A.; Lee, H.

    2008-01-01

    Habitat models can be used to predict the distributions of marine and estuarine non-indigenous species (NIS) over several spatial scales. At an estuary scale, our goal is to predict the estuaries most likely to be invaded, but at a habitat scale, the goal is to predict the specific locations within an estuary that are most vulnerable to invasion. As an initial step in evaluating several habitat models, model performance for a suite of benthic species with reasonably well-known distributions on the Pacific coast of the US needs to be compared. We discuss the utility of non-parametric multiplicative regression (NPMR) for predicting habitat- and estuary-scale distributions of native and NIS. NPMR incorporates interactions among variables, allows qualitative and categorical variables, and utilizes data on absence as well as presence. Preliminary results indicate that NPMR generally performs well at both spatial scales and that distributions of NIS are predicted as well as those of native species. For most species, latitude was the single best predictor, although similar model performance could be obtained at both spatial scales with combinations of other habitat variables. Errors of commission were more frequent at a habitat scale, with omission and commission errors approximately equal at an estuary scale. © 2008 International Council for the Exploration of the Sea. Published by Oxford Journals. All rights reserved.

  7. Location Management in Distributed Mobile Environments

    DTIC Science & Technology

    1994-09-01

    Simulations were carried out to evaluate the performance of different static location management strategies for various call (communication) and mobility patterns. Simulation results indicate ...

  8. Non-random mate choice in humans: insights from a genome scan.

    PubMed

    Laurent, R; Toupance, B; Chaix, R

    2012-02-01

    Little is known about the genetic factors influencing mate choice in humans. Still, there is evidence for non-random mate choice with respect to physical traits. In addition, some studies suggest that the Major Histocompatibility Complex may affect pair formation. Nowadays, the availability of high density genomic data sets gives the opportunity to scan the genome for signatures of non-random mate choice without prior assumptions on which genes may be involved, while taking into account socio-demographic factors. Here, we performed a genome scan to detect extreme patterns of similarity or dissimilarity among spouses throughout the genome in three populations of African, European American, and Mexican origins from the HapMap 3 database. Our analyses identified genes and biological functions that may affect pair formation in humans, including genes involved in skin appearance, morphogenesis, immunity and behaviour. We found little overlap between the three populations, suggesting that the biological functions potentially influencing mate choice are population specific, in other words are culturally driven. Moreover, whenever the same functional category of genes showed a significant signal in two populations, different genes were actually involved, which suggests the possibility of evolutionary convergences. © 2011 Blackwell Publishing Ltd.

  9. Locating non-volcanic tremor along the San Andreas Fault using a multiple array source imaging technique

    USGS Publications Warehouse

    Ryberg, T.; Haberland, C.H.; Fuis, G.S.; Ellsworth, W.L.; Shelly, D.R.

    2010-01-01

    Non-volcanic tremor (NVT) has been observed at several subduction zones and at the San Andreas Fault (SAF). Tremor locations are commonly derived by cross-correlating envelope-transformed seismic traces in combination with source-scanning techniques. Recently, they have also been located by using relative relocations with master events, that is low-frequency earthquakes that are part of the tremor; locations are derived by conventional traveltime-based methods. Here we present a method to locate the sources of NVT using an imaging approach for multiple array data. The performance of the method is checked with synthetic tests and the relocation of earthquakes. We also applied the method to tremor occurring near Cholame, California. A set of small-aperture arrays (i.e. an array consisting of arrays) installed around Cholame provided the data set for this study. We observed several tremor episodes and located tremor sources in the vicinity of SAF. During individual tremor episodes, we observed a systematic change of source location, indicating rapid migration of the tremor source along SAF. © 2010 The Authors, Geophysical Journal International © 2010 RAS.

  10. Degree of target utilization influences the location of movement endpoint distributions.

    PubMed

    Slifkin, Andrew B; Eder, Jeffrey R

    2017-03-01

    According to dominant theories of motor control, speed and accuracy are optimized when, on the average, movement endpoints are located at the target center and when the variability of the movement endpoint distributions is matched to the width of the target (viz., Meyer, Abrams, Kornblum, Wright, & Smith, 1988). The current study tested those predictions. According to the speed-accuracy trade-off, expanding the range of variability to the amount permitted by the limits of the target boundaries allows for maximization of movement speed while centering the distribution on the target center prevents movement errors that would have occurred had the distribution been off center. Here, participants (N=20) were required to generate 100 consecutive targeted hand movements under each of 15 unique conditions: There were three movement amplitude requirements (80, 160, 320mm) and within each there were five target widths (5, 10, 20, 40, 80mm). According to the results, it was only at the smaller target widths (5, 10mm) that movement endpoint distributions were centered on the target center and the range of movement endpoint variability matched the range specified by the target boundaries. As target width increased (20, 40, 80mm), participants increasingly undershot the target center and the range of movement endpoint variability increasingly underestimated the variability permitted by the target region. The degree of target center undershooting was strongly predicted by the difference between the size of the target and the amount of movement endpoint variability, i.e., the amount of unused space in the target. The results suggest that participants have precise knowledge of their variability relative to that permitted by the target, and they use that knowledge to systematically reduce the travel distance to targets. The reduction in travel distance across the larger target widths might have resulted in greater cost savings than those associated with increases in speed

  11. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    USGS Publications Warehouse

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U. S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. Each modeling tool itself had a low statistical significance, while weed species alone accounted for 69.1 and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting potential distribution for a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.

  12. Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heasler, Patrick G.

    This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.

  13. Anomalous Anticipatory Responses in Networked Random Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Roger D.; Bancel, Peter A.

    2006-10-16

    We examine an 8-year archive of synchronized, parallel time series of random data from a world spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.

  14. Smooth conditional distribution function and quantiles under random censorship.

    PubMed

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
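
    A short sketch of the unsmoothed baseline mentioned above, the Beran (generalized Kaplan-Meier) estimator of the conditional distribution function with Gaussian kernel weights in the covariate; the paper's proposals additionally smooth in the response direction, which this sketch omits. The data and bandwidth below are illustrative assumptions.

      # Sketch of the unsmoothed baseline: the Beran (generalized Kaplan-Meier)
      # estimator of the conditional distribution function under right censoring,
      # with Gaussian kernel weights in the covariate. Data and bandwidth are
      # illustrative assumptions.
      import numpy as np

      def beran_cdf(x0, y_grid, X, Y, delta, h=0.2):
          # X: covariate, Y: observed time, delta: 1 = event observed, 0 = censored
          w = np.exp(-0.5 * ((X - x0) / h) ** 2)
          w = w / w.sum()
          order = np.argsort(Y)
          Y, delta, w = Y[order], delta[order], w[order]
          at_risk = np.cumsum(w[::-1])[::-1]       # weight of observations with Y_j >= Y_i
          factors = np.where(delta == 1, 1.0 - w / at_risk, 1.0)
          surv = np.cumprod(factors)               # survival just after each ordered Y_i
          idx = np.searchsorted(Y, y_grid, side="right") - 1
          return np.where(idx >= 0, 1.0 - surv[np.clip(idx, 0, None)], 0.0)

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, 300)
      T = rng.exponential(scale=1.0 + X)           # event times depend on the covariate
      C = rng.exponential(scale=2.0, size=300)     # independent censoring times
      Y, delta = np.minimum(T, C), (T <= C).astype(int)
      print(beran_cdf(0.5, np.array([0.5, 1.0, 2.0]), X, Y, delta))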

  15. Case-based or non-case-based questions for teaching postgraduate physicians: a randomized crossover trial.

    PubMed

    Cook, David A; Thompson, Warren G; Thomas, Kris G

    2009-10-01

    The comparative efficacy of case-based (CB) and non-CB self-assessment questions in Web-based instruction is unknown. The authors sought to compare CB and non-CB questions. The authors conducted a randomized crossover trial in the continuity clinics of two academic residency programs. Four Web-based modules on ambulatory medicine were developed in both CB (periodic questions based on patient scenarios) and non-CB (questions matched for content but lacking patient scenarios) formats. Participants completed two modules in each format (sequence randomly assigned). Participants also completed a pretest of applied knowledge for two modules (randomly assigned). For the 130 participating internal medicine and family medicine residents, knowledge scores improved significantly (P < .0001) from pretest (mean: 53.5; SE: 1.1) to posttest (75.1; SE: 0.7). Posttest knowledge scores were similar in CB (75.0; SE: 0.1) and non-CB formats (74.7; SE: 1.1); the 95% CI was -1.6, 2.2 (P = .76). A nearly significant (P = .062) interaction between format and the presence or absence of pretest suggested a differential effect of question format, depending on pretest. Overall, those taking pretests had higher posttest knowledge scores (76.7; SE: 1.1) than did those not taking pretests (73.0; SE: 1.1; 95% CI: 1.7, 5.6; P = .0003). Learners preferred the CB format. Time required was similar (CB: 42.5; SE: 1.8 minutes, non-CB: 40.9; SE: 1.8 minutes; P = .22). Our findings suggest that, among postgraduate physicians, CB and non-CB questions have similar effects on knowledge scores, but learners prefer CB questions. Pretests influence posttest scores.

  16. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks.

    PubMed

    Ding, Xiangyan; Li, Feilong; Zhao, Youxuan; Xu, Yongmei; Hu, Ning; Cao, Peng; Deng, Mingxi

    2018-04-23

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, which is mainly represented by a second harmonic with even more distinct third/quadruple harmonics. Based on statistical analysis from the numerous results of random micro-crack models, it is clearly found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of micro-crack zone, and the excitation frequency. This study theoretically reveals that nonlinear Rayleigh surface waves are feasible for use in quantitatively identifying the physical characteristics of surface micro-cracks in structures.
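
    A typical post-processing step for quantifying the nonlinear effect described above is to extract the fundamental and second-harmonic amplitudes from a received waveform and form a relative acoustic nonlinearity parameter proportional to A2/A1^2; the sketch below does only that, on a synthetic signal, and is not the paper's finite-element micro-crack simulation.

      # Sketch of a generic post-processing step: extract the fundamental (A1) and
      # second-harmonic (A2) amplitudes from a received waveform by FFT and form a
      # relative acoustic nonlinearity parameter beta' ~ A2 / A1**2. The signal,
      # frequencies and sampling rate below are assumptions, not the paper's model.
      import numpy as np

      def relative_nonlinearity(signal, fs, f0):
          n = len(signal)
          spectrum = np.abs(np.fft.rfft(signal * np.hanning(n))) / n
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          A1 = spectrum[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
          A2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
          return A2 / A1 ** 2

      fs, f0 = 50e6, 1e6                           # 50 MHz sampling, 1 MHz excitation
      t = np.arange(0.0, 200e-6, 1.0 / fs)
      received = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 2 * f0 * t)
      print(relative_nonlinearity(received, fs, f0))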

  17. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks

    PubMed Central

    Ding, Xiangyan; Li, Feilong; Xu, Yongmei; Cao, Peng; Deng, Mingxi

    2018-01-01

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, which is mainly represented by a second harmonic with even more distinct third/quadruple harmonics. Based on statistical analysis from the numerous results of random micro-crack models, it is clearly found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of micro-crack zone, and the excitation frequency. This study theoretically reveals that nonlinear Rayleigh surface waves are feasible for use in quantitatively identifying the physical characteristics of surface micro-cracks in structures. PMID:29690580

  18. A simple approximation of moments of the quasi-equilibrium distribution of an extended stochastic theta-logistic model with non-integer powers.

    PubMed

    Bhowmick, Amiya Ranjan; Bandyopadhyay, Subhadip; Rana, Sourav; Bhattacharya, Sabyasachi

    2016-01-01

    The stochastic versions of the logistic and extended logistic growth models are applied successfully to explain many real-life population dynamics and share a central body of literature in stochastic modeling of ecological systems. To understand the randomness in the population dynamics of the underlying processes completely, it is important to have a clear idea about the quasi-equilibrium distribution and its moments. Bartlett et al. (1960) took a pioneering attempt for estimating the moments of the quasi-equilibrium distribution of the stochastic logistic model. Matis and Kiffe (1996) obtain a set of more accurate and elegant approximations for the mean, variance and skewness of the quasi-equilibrium distribution of the same model using cumulant truncation method. The method is extended for stochastic power law logistic family by the same and several other authors (Nasell, 2003; Singh and Hespanha, 2007). Cumulant truncation and some alternative methods e.g. saddle point approximation, derivative matching approach can be applied if the powers involved in the extended logistic set up are integers, although plenty of evidence is available for non-integer powers in many practical situations (Sibly et al., 2005). In this paper, we develop a set of new approximations for mean, variance and skewness of the quasi-equilibrium distribution under more general family of growth curves, which is applicable for both integer and non-integer powers. The deterministic counterpart of this family of models captures both monotonic and non-monotonic behavior of the per capita growth rate, of which theta-logistic is a special case. The approximations accurately estimate the first three order moments of the quasi-equilibrium distribution. The proposed method is illustrated with simulated data and real data from global population dynamics database. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Spatial Distribution of Cyanobacteria in Modern Stromatolites

    NASA Technical Reports Server (NTRS)

    Prufert-Bebout, Lee; Dacles-Mariani, Jennifer; Herbert, Alice; DeVincenzi, Donald (Technical Monitor)

    2001-01-01

    Living stromatolites consist of complex microbial communities with distinct distribution patterns for different microbial groups. The cyanobacterial populations of Highborne Cay Bahamas exemplify this phenomenon. Field observations reveal distinct distribution patterns for several of these cyanobacterial species. To date 10 different cyanobacterial cultures, including both filamentous and endolithic species, have been isolated from these stromatolites. We will present data on the growth and motility characteristics as well as on the nutritional requirements of these isolates. These data will then be correlated with the field observed distributions for these species. Lastly laboratory simulations of stromatolites grown under various conditions of irradiance, flow and cyanobacterial community composition will be presented. These experiments allow us to evaluate our predictions regarding controls on cyanobacterial distribution.

  20. Non-functional Avionics Requirements

    NASA Astrophysics Data System (ADS)

    Paulitsch, Michael; Ruess, Harald; Sorea, Maria

    Embedded systems in aerospace are becoming more and more integrated in order to reduce the weight, volume/size, and power of hardware for greater fuel efficiency. Such integration tendencies change the architectural approaches of system architectures, which subsequently change the non-functional requirements for platforms. This paper provides some insight into the state of the practice of non-functional requirements for developing ultra-critical embedded systems in the aerospace industry, including recent changes and trends. In particular, formal requirement capture and formal analysis of non-functional requirements of avionic systems - including hard real-time, fault-tolerance, reliability, and performance - are exemplified by means of recent developments in SAL and HiLiTE.

  1. From moonlight to movement and synchronized randomness: Fourier and wavelet analyses of animal location time series data

    PubMed Central

    Polansky, Leo; Wittemyer, George; Cross, Paul C.; Tambling, Craig J.; Getz, Wayne M.

    2011-01-01

    High-resolution animal location data are increasingly available, requiring analytical approaches and statistical tools that can accommodate the temporal structure and transient dynamics (non-stationarity) inherent in natural systems. Traditional analyses often assume uncorrelated or weakly correlated temporal structure in the velocity (net displacement) time series constructed using sequential location data. We propose that frequency and time–frequency domain methods, embodied by Fourier and wavelet transforms, can serve as useful probes in early investigations of animal movement data, stimulating new ecological insight and questions. We introduce a novel movement model with time-varying parameters to study these methods in an animal movement context. Simulation studies show that the spectral signature given by these methods provides a useful approach for statistically detecting and characterizing temporal dependency in animal movement data. In addition, our simulations provide a connection between the spectral signatures observed in empirical data with null hypotheses about expected animal activity. Our analyses also show that there is not a specific one-to-one relationship between the spectral signatures and behavior type and that departures from the anticipated signatures are also informative. Box plots of net displacement arranged by time of day and conditioned on common spectral properties can help interpret the spectral signatures of empirical data. The first case study is based on the movement trajectory of a lion (Panthera leo) that shows several characteristic daily activity sequences, including an active–rest cycle that is correlated with moonlight brightness. A second example based on six pairs of African buffalo (Syncerus caffer) illustrates the use of wavelet coherency to show that their movements synchronize when they are within ∼1 km of each other, even when individual movement was best described as an uncorrelated random walk, providing an
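
    As a hedged illustration of the frequency-domain probing advocated above, the sketch below computes a periodogram of a synthetic hourly net-displacement series with an embedded 24 h activity cycle; the simulated data stand in for the lion and buffalo trajectories, which are not reproduced here.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(1)

# Hourly net displacements with a 24 h activity cycle plus noise (synthetic).
hours = np.arange(24 * 60)                       # 60 days of hourly fixes
displacement = (1.0 + np.cos(2 * np.pi * hours / 24.0)) \
               + 0.5 * rng.standard_normal(hours.size)

freqs, power = periodogram(displacement, fs=1.0)  # frequencies in cycles per hour
dominant = freqs[np.argmax(power[1:]) + 1]        # skip the zero-frequency term
print(f"dominant period ~ {1.0 / dominant:.1f} hours")
```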

  2. Detection of Leaks in Water Distribution System using Non-Destructive Techniques

    NASA Astrophysics Data System (ADS)

    Aslam, H.; Kaur, M.; Sasi, S.; Mortula, Md M.; Yehia, S.; Ali, T.

    2018-05-01

    Water is scarce and needs to be conserved. A considerable amount of the water that flows through water distribution systems is lost due to pipe leaks. Consequently, innovations in pipe leakage detection methods for early recognition and repair of these leaks are vital to ensure minimum wastage of water in distribution systems. A major component of pipe leak detection is the ability to accurately locate the leak in a pipe with minimal invasion. Therefore, this paper studies the leak detection abilities of two non-destructive techniques (NDTs), Ground Penetrating Radar (GPR) and a spectrometer, and aims to determine whether these instruments are effective in identifying leaks. An experimental setup was constructed to simulate the underground conditions of water distribution systems. After analysing the experimental data, it was concluded that both the GPR and the spectrometer were effective in detecting leaks in the pipes. However, the results obtained from the spectrometer were less discriminating in terms of observing the leaks than the results obtained from the GPR. In addition, it was concluded that neither instrument could be used if the water from the leaks had reached the surface, resulting in surface ponding.

  3. Enzyme/non-enzyme discrimination and prediction of enzyme active site location using charge-based methods.

    PubMed

    Bate, Paul; Warwicker, Jim

    2004-07-02

    Calculations of charge interactions complement analysis of a characterised active site, rationalising pH-dependence of activity and transition state stabilisation. Prediction of active site location through large ΔpKa values or electrostatic strain is relevant for structural genomics. We report a study of ionisable groups in a set of 20 enzymes, finding that false positives obscure predictive potential. In a larger set of 156 enzymes, peaks in solvent-space electrostatic properties are calculated. Both electric field and potential match well to active site location. The best correlation is found with electrostatic potential calculated from uniform charge density over enzyme volume, rather than from assignment of a standard atom-specific charge set. Studying a shell around each molecule, for 77% of enzymes the potential peak is within that 5% of the shell closest to the active site centre, and 86% within 10%. Active site identification by largest cleft, also with projection onto a shell, gives 58% of enzymes for which the centre of the largest cleft lies within 5% of the active site, and 70% within 10%. Dielectric boundary conditions emphasise clefts in the uniform charge density method, which is suited to recognition of binding pockets embedded within larger clefts. The variation of peak potential with distance from active site, and comparison between enzyme and non-enzyme sets, gives an optimal threshold distinguishing enzyme from non-enzyme. We find that 87% of the enzyme set exceeds the threshold as compared to 29% of the non-enzyme set. Enzyme/non-enzyme homologues, "structural genomics" annotated proteins and catalytic/non-catalytic RNAs are studied in this context.

  4. No evidence for MHC class II-based non-random mating at the gametic haplotype in Atlantic salmon.

    PubMed

    Promerová, M; Alavioon, G; Tusso, S; Burri, R; Immler, S

    2017-06-01

    Genes of the major histocompatibility complex (MHC) are a likely target of mate choice because of their role in inbreeding avoidance and potential benefits for offspring immunocompetence. Evidence for female choice for complementary MHC alleles among competing males exists both for the pre- and the postmating stages. However, it remains unclear whether the latter may involve non-random fusion of gametes depending on gametic haplotypes resulting in transmission ratio distortion or non-random sequence divergence among fused gametes. We tested whether non-random gametic fusion of MHC-II haplotypes occurs in Atlantic salmon Salmo salar. We performed in vitro fertilizations that excluded interindividual sperm competition using a split family design with large clutch sample sizes to test for a possible role of the gametic haplotype in mate choice. We sequenced two MHC-II loci in 50 embryos per clutch to assess allelic frequencies and sequence divergence. We found no evidence for transmission ratio distortion at two linked MHC-II loci, nor for non-random gamete fusion with respect to MHC-II alleles. Our findings suggest that the gametic MHC-II haplotypes play no role in gamete association in Atlantic salmon and that earlier findings of MHC-based mate choice most likely reflect choice among diploid genotypes. We discuss possible explanations for these findings and how they differ from findings in mammals.

  5. Skewness and kurtosis analysis for non-Gaussian distributions

    NASA Astrophysics Data System (ADS)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2018-06-01

    In this paper we address a number of pitfalls regarding the use of kurtosis as a measure of deviations from the Gaussian. We treat kurtosis in both its standard definition and that which arises in q-statistics, namely q-kurtosis. We have recently shown that the relation proposed by Cristelli et al. (2012) between skewness and kurtosis can only be verified for relatively small data sets, independently of the type of statistics chosen; however it fails for sufficiently large data sets, if the fourth moment of the distribution is finite. For infinite fourth moments, kurtosis is not defined as the size of the data set tends to infinity. For distributions with finite fourth moments, the size, N, of the data set for which the standard kurtosis saturates to a fixed value, depends on the deviation of the original distribution from the Gaussian. Nevertheless, using kurtosis as a criterion for deciding which distribution deviates further from the Gaussian can be misleading for small data sets, even for finite fourth moment distributions. Going over to q-statistics, we find that although the value of q-kurtosis is finite in the range of 0 < q < 3, this quantity is not useful for comparing different non-Gaussian distributed data sets, unless the appropriate q value, which truly characterizes the data set of interest, is chosen. Finally, we propose a method to determine the correct q value and thereby to compute the q-kurtosis of q-Gaussian distributed data sets.
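
    A small numerical check of the size dependence discussed above is to track the sample (excess) kurtosis against data-set size for a heavy-tailed distribution with a finite fourth moment; the Student-t distribution with 5 degrees of freedom used below is an assumption chosen only for illustration.

```python
import numpy as np
from scipy.stats import kurtosis, t

rng = np.random.default_rng(2)
df = 5                                  # finite fourth moment requires df > 4
theoretical_excess = 6.0 / (df - 4.0)   # excess kurtosis of Student-t, df > 4

for n in (10**3, 10**4, 10**5, 10**6):
    sample = t.rvs(df, size=n, random_state=rng)
    # Fisher (excess) kurtosis of the sample; saturation toward the theoretical
    # value is only reached once n is sufficiently large.
    print(n, round(kurtosis(sample), 3), "theory:", theoretical_excess)
```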

  6. Slip-Size Distribution and Self-Organized Criticality in Block-Spring Models with Quenched Randomness

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Hidetsugu; Kadowaki, Shuntaro

    2017-07-01

    We study slowly pulled block-spring models in random media. Second-order phase transitions exist in a model pulled by a constant force in the case of velocity-strengthening friction. If external forces are slowly increased, nearly critical states are self-organized. Slips of various sizes occur, and the probability distributions of slip size roughly obey power laws. The exponent is close to that in the quenched Edwards-Wilkinson model. Furthermore, the slip-size distributions are investigated in cases of Coulomb friction, velocity-weakening friction, and two-dimensional block-spring models.

  7. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble.

    PubMed

    Müller, Christian L; Sbalzarini, Ivo F; van Gunsteren, Wilfred F; Zagrović, Bojan; Hünenberger, Philippe H

    2009-06-07

    The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N=3,...,6 beads (or up to N=10 for some properties) are investigated using a systematic (grid-based) approach based on a simple working definition of shapes relying on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble appears to represent the paramount of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N=3,...,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N=100). The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely, that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds"; (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments

  8. Random-phase metasurfaces at optical wavelengths

    NASA Astrophysics Data System (ADS)

    Pors, Anders; Ding, Fei; Chen, Yiting; Radko, Ilya P.; Bozhevolnyi, Sergey I.

    2016-06-01

    Random-phase metasurfaces, in which the constituents scatter light with random phases, have the property that an incident plane wave will diffusely scatter, hereby leading to a complex far-field response that is most suitably described by statistical means. In this work, we present and exemplify the statistical description of the far-field response, particularly highlighting how the response for polarised and unpolarised light might be alike or different depending on the correlation of scattering phases for two orthogonal polarisations. By utilizing gap plasmon-based metasurfaces, consisting of an optically thick gold film overlaid by a subwavelength thin glass spacer and an array of gold nanobricks, we design and realize random-phase metasurfaces at a wavelength of 800 nm. Optical characterisation of the fabricated samples convincingly demonstrates the diffuse scattering of reflected light, with statistics obeying the theoretical predictions. We foresee the use of random-phase metasurfaces for camouflage applications and as high-quality reference structures in dark-field microscopy, while the control of the statistics for polarised and unpolarised light might find usage in security applications. Finally, by incorporating a certain correlation between scattering by neighbouring metasurface constituents new types of functionalities can be realised, such as a Lambertian reflector.

  9. Health Messaging and African-American Infant Sleep Location: A Randomized Controlled Trial

    PubMed Central

    Moon, Rachel Y; Mathews, Anita; Joyner, Brandi L.; Oden, Rosalind P.; He, Jianping; McCarter, Robert

    2016-01-01

    Background Infant-parent bedsharing increases the risk of SIDS and other sleep-related deaths. Despite AAP recommendations to avoid bedsharing, public health efforts have been unsuccessful in changing behaviors. African-American infants are more than twice as likely to die from SIDS and other sleep-related deaths, and are also twice as likely to bedshare with their parents. Further, African-American parents have a high degree of self-efficacy with regards to preventing infant suffocation, but low self-efficacy with regards to SIDS risk reduction. It is unclear whether messages emphasizing suffocation prevention will decrease bedsharing. Objectives To evaluate the impact of specific health messages on African-American parental decisions regarding infant sleep location. Methods We conducted a randomized, controlled trial of African-American mothers of infants. The control group received standard messaging emphasizing AAP-recommended safe sleep practices, including avoidance of bedsharing, for the purposes of SIDS risk reduction. The intervention group received enhanced messaging emphasizing safe sleep practices, including avoidance of bedsharing, for both SIDS risk reduction and suffocation prevention. Participants completed interviews at 2–3 weeks, 2–3 months, and 5–6 months after the infant’s birth. Results 1194 mothers were enrolled in the study, and 637 completed all interviews. Bedsharing, both usually (aOR 1.005 [95% CI 1.003, 1.006]) and last night (aOR 1.004 [95% CI 1.002, 1.007]) increased slightly but statistically significantly with infant age (p<0.001). Receipt of the enhanced message did not impact on sleep location. Maternal belief that bedsharing increased the risk of SIDS or suffocation declined over 6 months (p<0.001) and did not differ by group assignment. Conclusion African-American mothers who received an enhanced message about SIDS risk reduction and suffocation prevention were no less likely to bedshare with their infants. PMID:27470122

  10. The influence of statistical properties of Fourier coefficients on random Gaussian surfaces.

    PubMed

    de Castro, C P; Luković, M; Andrade, R F S; Herrmann, H J

    2017-05-16

    Many examples of natural systems can be described by random Gaussian surfaces. Much can be learned by analyzing the Fourier expansion of the surfaces, from which it is possible to determine the corresponding Hurst exponent and consequently establish the presence of scale invariance. We show that this symmetry is not affected by the distribution of the modulus of the Fourier coefficients. Furthermore, we investigate the role of the Fourier phases of random surfaces. In particular, we show how the surface is affected by a non-uniform distribution of phases.
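
    The Fourier construction studied here can be sketched in one dimension: assign power-law moduli to the Fourier coefficients (which fixes the Hurst exponent) and draw independent uniform phases, then invert the transform. The exponent, grid size, and normalisation below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def gaussian_profile(n=4096, hurst=0.7):
    """1D random Gaussian profile built by Fourier filtering: power-law
    moduli |c_k| ~ k**-(H + 1/2) and independent uniform random phases."""
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-(hurst + 0.5))               # power-law modulus
    phases = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
    coeffs = amp * np.exp(1j * phases)
    coeffs[0] = 0.0                                    # zero-mean profile
    return np.fft.irfft(coeffs, n=n)

profile = gaussian_profile()
print("profile std:", profile.std())
```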

  11. Diffusion in randomly perturbed dissipative dynamics

    NASA Astrophysics Data System (ADS)

    Rodrigues, Christian S.; Chechkin, Aleksei V.; de Moura, Alessandro P. S.; Grebogi, Celso; Klages, Rainer

    2014-11-01

    Dynamical systems having many coexisting attractors present interesting properties from both fundamental theoretical and modelling points of view. When such dynamics is under bounded random perturbations, the basins of attraction are no longer invariant and there is the possibility of transport among them. Here we introduce a basic theoretical setting which enables us to study this hopping process from the perspective of anomalous transport using the concept of a random dynamical system with holes. We apply it to a simple model by investigating the role of hyperbolicity for the transport among basins. We show numerically that our system exhibits non-Gaussian position distributions, power-law escape times, and subdiffusion. Our simulation results are reproduced consistently from stochastic continuous time random walk theory.

  12. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    PubMed

    Potdar, Alka A; Jeon, Junhwan; Weaver, Alissa M; Quaranta, Vito; Cummings, Peter T

    2010-03-10

    Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases), each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the presence of a non-uniform turn angle distribution of move step-lengths within a flight and two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model rather than a simple persistent random walk correctly matches the super-diffusivity in the cell migration paths as indicated by simulations based on the BCRW model.
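
    A minimal sketch of a bimodal correlated random walk in the spirit described above: two alternating modes (directional and re-orientation) with exponentially distributed step lengths and mode-specific turning-angle spreads. The switching rule and all parameter values are assumptions, not the fitted cell model.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_bcrw(n_steps=2000, p_switch=0.05,
                  mean_step=(1.0, 0.3), turn_sd=(0.2, 1.5)):
    """Two-mode correlated random walk: mode 0 = directional (long steps,
    small turns), mode 1 = re-orientation (short steps, large turns)."""
    x = np.zeros((n_steps, 2))
    heading, mode = 0.0, 0
    for i in range(1, n_steps):
        if rng.random() < p_switch:                 # Markovian mode switching
            mode = 1 - mode
        heading += rng.normal(0.0, turn_sd[mode])   # correlated turning
        step = rng.exponential(mean_step[mode])     # exponential step lengths
        x[i] = x[i - 1] + step * np.array([np.cos(heading), np.sin(heading)])
    return x

track = simulate_bcrw()
print("net displacement:", np.linalg.norm(track[-1] - track[0]))
```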

  13. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1993-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk. Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year; therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  14. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1994-04-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  15. Air bubble migration is a random event post embryo transfer.

    PubMed

    Confino, E; Zhang, J; Risquez, F

    2007-06-01

    Air bubble location following embryo transfer (ET) is the presumable placement spot of embryos. The purpose of this study was to document endometrial air bubble position and migration following embryo transfer in a multicenter prospective case study. Eighty-eight embryo transfers were performed under abdominal ultrasound guidance in two countries by two authors. A single or double air bubble was loaded with the embryos using soft, coaxial, end-opened catheters. The embryos were slowly injected 10-20 mm from the fundus. Air bubble position was recorded immediately, 30 minutes later, and when the patient stood up. Bubble marker location analysis revealed a random distribution without a visible gravity effect when the patients stood up. The bubble markers demonstrated splitting, movement in all directions, and dispersion. Air bubbles move and split frequently post ET with the patient in the horizontal position, suggestive of active uterine contractions. Bubble migration analysis supports a rather random movement of the bubbles and possibly the embryos. Standing up somewhat changed bubble configuration and distribution in the uterine cavity. Gravity-related bubble motion was uncommon, suggesting that horizontal rest post ET may not be necessary. This report challenges the common belief that very accurate ultrasound-guided embryo placement is mandatory. The very random bubble movement observed in this two-center study suggests that a large "window" of embryo placement may be present.

  16. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
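
    A hedged sketch of the general idea (not the authors' estimator, covariance adjustment, or software packages): transform both arms with a common Box-Cox lambda, compare locations on the transformed scale, and back-transform to report a median-type difference on the original scale.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

rng = np.random.default_rng(5)

# Two skewed treatment arms (synthetic log-normal data, assumed for illustration).
control = rng.lognormal(mean=2.0, sigma=0.6, size=120)
treated = rng.lognormal(mean=2.3, sigma=0.6, size=120)

# Estimate a common Box-Cox lambda from the pooled data.
_, lam = boxcox(np.concatenate([control, treated]))
t_control = boxcox(control, lmbda=lam)
t_treated = boxcox(treated, lmbda=lam)

# Back-transform the group means on the transformed scale; under a Box-Cox
# model these correspond approximately to medians on the original scale.
median_c = inv_boxcox(t_control.mean(), lam)
median_t = inv_boxcox(t_treated.mean(), lam)
print("estimated median difference:", median_t - median_c)
```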

  17. Hacking on decoy-state quantum key distribution system with partial phase randomization

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Jiang, Mu-Sheng; Ma, Xiang-Chun; Li, Chun-Yan; Liang, Lin-Mei

    2014-04-01

    Quantum key distribution (QKD) provides a means for unconditionally secure key transmission between two distant parties. However, in practical implementations, it suffers from quantum hacking due to device imperfections. Here we propose a hybrid measurement attack, using only linear optics, homodyne detection, and single photon detection, on the widely used vacuum + weak decoy state QKD system when the phase of the source is partially randomized. Our analysis shows that, in some parameter regimes, the proposed attack would result in an entanglement breaking channel but still be able to trick the legitimate users into believing they have transmitted secure keys. That is, the eavesdropper is able to steal all the key information without being discovered by the users. Thus, our proposal reveals that partial phase randomization is not sufficient to guarantee the security of phase-encoding QKD systems with weak coherent states.

  18. Hacking on decoy-state quantum key distribution system with partial phase randomization.

    PubMed

    Sun, Shi-Hai; Jiang, Mu-Sheng; Ma, Xiang-Chun; Li, Chun-Yan; Liang, Lin-Mei

    2014-04-23

    Quantum key distribution (QKD) provides a means for unconditionally secure key transmission between two distant parties. However, in practical implementations, it suffers from quantum hacking due to device imperfections. Here we propose a hybrid measurement attack, using only linear optics, homodyne detection, and single photon detection, on the widely used vacuum + weak decoy state QKD system when the phase of the source is partially randomized. Our analysis shows that, in some parameter regimes, the proposed attack would result in an entanglement breaking channel but still be able to trick the legitimate users into believing they have transmitted secure keys. That is, the eavesdropper is able to steal all the key information without being discovered by the users. Thus, our proposal reveals that partial phase randomization is not sufficient to guarantee the security of phase-encoding QKD systems with weak coherent states.

  19. Trophallaxis-inspired model for distributed transport between randomly interacting agents

    NASA Astrophysics Data System (ADS)

    Gräwer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G.; Katifori, Eleni

    2017-08-01

    Trophallaxis, the regurgitation and mouth to mouth transfer of liquid food between members of eusocial insect societies, is an important process that allows the fast and efficient dissemination of food in the colony. Trophallactic systems are typically treated as a network of agent interactions. This approach, though valuable, does not easily lend itself to analytic predictions. In this work we consider a simple trophallactic system of randomly interacting agents with finite carrying capacity, and calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess to what level the observed food uptake rates and efficiency in food distribution is due to stochastic effects or specific trophallactic strategies by the ant colony. Our work also serves as a stepping stone to describing the collective properties of more complex trophallactic systems, such as those including division of labor between foragers and workers.

  20. Trophallaxis-inspired model for distributed transport between randomly interacting agents.

    PubMed

    Gräwer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G; Katifori, Eleni

    2017-08-01

    Trophallaxis, the regurgitation and mouth to mouth transfer of liquid food between members of eusocial insect societies, is an important process that allows the fast and efficient dissemination of food in the colony. Trophallactic systems are typically treated as a network of agent interactions. This approach, though valuable, does not easily lend itself to analytic predictions. In this work we consider a simple trophallactic system of randomly interacting agents with finite carrying capacity, and calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess to what level the observed food uptake rates and efficiency in food distribution is due to stochastic effects or specific trophallactic strategies by the ant colony. Our work also serves as a stepping stone to describing the collective properties of more complex trophallactic systems, such as those including division of labor between foragers and workers.
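
    A minimal sketch of randomly interacting, finite-capacity agents in the spirit of the model above: at each step one forager refills, a random pair meets, and the fuller agent donates food up to the recipient's remaining capacity. The sharing rule and parameters are assumptions, and the observables (mean fill and spread) only loosely mirror the intake-rate and uniformity measures in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_colony(n_agents=100, capacity=1.0, refill=0.05, steps=20_000):
    """Random pairwise food transfer between agents with finite crop capacity."""
    food = np.zeros(n_agents)
    spread = []
    for _ in range(steps):
        food[0] = min(capacity, food[0] + refill)        # agent 0 forages
        i, j = rng.choice(n_agents, size=2, replace=False)
        donor, recipient = (i, j) if food[i] > food[j] else (j, i)
        transfer = min(0.5 * (food[donor] - food[recipient]),
                       capacity - food[recipient])
        food[donor] -= transfer
        food[recipient] += transfer
        spread.append(food.std())                        # how uniform the colony is
    return food, spread

food, spread = simulate_colony()
print("mean fill:", food.mean(), "final spread:", spread[-1])
```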

  1. Inflation with a graceful exit in a random landscape

    NASA Astrophysics Data System (ADS)

    Pedro, F. G.; Westphal, A.

    2017-03-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.
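
    The Hessian-as-Gaussian-random-matrix picture invites a small numerical illustration: sample matrices from the Gaussian Orthogonal Ensemble and estimate how rarely all eigenvalues are positive, i.e. how rarely a critical point is a minimum, as the number of fields grows. The normalisation and sample sizes below are assumptions for illustration only and do not reproduce the paper's Dyson Brownian motion analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

def goe(n):
    """Sample an n x n matrix from the Gaussian Orthogonal Ensemble."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / np.sqrt(2.0 * n)   # one common normalisation choice

trials = 100_000
for n in (2, 3, 4, 5):
    minima = sum(np.all(np.linalg.eigvalsh(goe(n)) > 0) for _ in range(trials))
    print(f"N={n}: fraction of positive-definite Hessians ~ {minima / trials:.2e}")
```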

  2. What influences national and foreign physicians’ geographic distribution? An analysis of medical doctors’ residence location in Portugal

    PubMed Central

    2012-01-01

    Background The debate over physicians’ geographical distribution has attracted the attention of the economic and public health literature over the last forty years. Nonetheless, it is still to date unclear what influences physicians’ location, and whether foreign physicians contribute to fill the geographical gaps left by national doctors in any given country. The present research sets out to investigate the current distribution of national and international physicians in Portugal, with the objective to understand its determinants and provide an evidence base for policy-makers to identify policies to influence it. Methods A cross-sectional study of physicians currently registered in Portugal was conducted to describe the population and explore the association of physician residence patterns with relevant personal and municipality characteristics. Data from the Portuguese Medical Council on physicians’ residence and characteristics were analysed, as well as data from the National Institute of Statistics on municipalities’ population, living standards and health care network. Descriptive statistics, chi-square tests, negative binomial and logistic regression modelling were applied to determine: (a) municipality characteristics predicting Portuguese and International physicians’ geographical distribution, and; (b) doctors’ characteristics that could increase the odds of residing outside the country’s metropolitan areas. Results There were 39,473 physicians in Portugal in 2008, 51.1% of whom male, and 40.2% between 41 and 55 years of age. They were predominantly Portuguese (90.5%), with Spanish, Brazilian and African nationalities also represented. Population, Population’s Purchasing Power, Nurses per capita and Municipality Development Index (MDI) were the municipality characteristics displaying the strongest association with national physicians’ location. For foreign physicians, the MDI was not statistically significant, while municipalities

  3. What influences national and foreign physicians' geographic distribution? An analysis of medical doctors' residence location in Portugal.

    PubMed

    Russo, Giuliano; Ferrinho, Paulo; de Sousa, Bruno; Conceição, Cláudia

    2012-07-02

    The debate over physicians' geographical distribution has attracted the attention of the economic and public health literature over the last forty years. Nonetheless, it is still to date unclear what influences physicians' location, and whether foreign physicians contribute to fill the geographical gaps left by national doctors in any given country. The present research sets out to investigate the current distribution of national and international physicians in Portugal, with the objective to understand its determinants and provide an evidence base for policy-makers to identify policies to influence it. A cross-sectional study of physicians currently registered in Portugal was conducted to describe the population and explore the association of physician residence patterns with relevant personal and municipality characteristics. Data from the Portuguese Medical Council on physicians' residence and characteristics were analysed, as well as data from the National Institute of Statistics on municipalities' population, living standards and health care network. Descriptive statistics, chi-square tests, negative binomial and logistic regression modelling were applied to determine: (a) municipality characteristics predicting Portuguese and International physicians' geographical distribution, and; (b) doctors' characteristics that could increase the odds of residing outside the country's metropolitan areas. There were 39,473 physicians in Portugal in 2008, 51.1% of whom male, and 40.2% between 41 and 55 years of age. They were predominantly Portuguese (90.5%), with Spanish, Brazilian and African nationalities also represented. Population, Population's Purchasing Power, Nurses per capita and Municipality Development Index (MDI) were the municipality characteristics displaying the strongest association with national physicians' location. For foreign physicians, the MDI was not statistically significant, while municipalities' foreign population applying for residence

  4. Measurement of non-volatile particle number size distribution

    NASA Astrophysics Data System (ADS)

    Gkatzelis, G. I.; Papanastasiou, D. K.; Florou, K.; Kaltsonoudis, C.; Louvaris, E.; Pandis, S. N.

    2015-06-01

    An experimental methodology was developed to measure the non-volatile particle number concentration using a thermodenuder (TD). The TD was coupled with a high-resolution time-of-flight aerosol mass spectrometer, measuring the chemical composition and mass size distribution of the submicrometer aerosol and a scanning mobility particle sizer (SMPS) that provided the number size distribution of the aerosol in the range from 10 to 500 nm. The method was evaluated with a set of smog chamber experiments and achieved almost complete evaporation (> 98 %) of secondary organic as well as freshly nucleated particles, using a TD temperature of 400 °C and a centerline residence time of 15 s. This experimental approach was applied in a winter field campaign in Athens and provided a direct measurement of number concentration and size distribution for particles emitted from major pollution sources. During periods in which the contribution of biomass burning sources was dominant, more than 80 % of particle number concentration remained after passing through the thermodenuder, suggesting that nearly all biomass burning particles had a non-volatile core. These remaining particles consisted mostly of black carbon (60 % mass contribution) and organic aerosol, OA (40 %). Organics that had not evaporated through the TD were mostly biomass burning OA (BBOA) and oxygenated OA (OOA) as determined from AMS source apportionment analysis. For periods during which traffic contribution was dominant 50-60 % of the particles had a non-volatile core while the rest evaporated at 400 °C. The remaining particle mass consisted mostly of black carbon (BC) with an 80 % contribution, while OA was responsible for another 15-20 %. Organics were mostly hydrocarbon-like OA (HOA) and OOA. These results suggest that even at 400 °C some fraction of the OA does not evaporate from particles emitted from common combustion processes, such as biomass burning and car engines, indicating that a fraction of this type

  5. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I Zonal Random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random Vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random Vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I Vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel Spreadsheet Software and documenting them in a report using Microsoft Word Processing Software. Conclusion: Random vibration liftoff, ascent, and green run design & test criteria for the Upper Stage Pyrotechnic Components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  6. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    PubMed Central

    Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard

    2016-01-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592

  7. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification.

    PubMed

    Bradbury, Kyle; Saboo, Raghav; L Johnson, Timothy; Malof, Jordan M; Devarajan, Arjun; Zhang, Wuming; M Collins, Leslie; G Newell, Richard

    2016-12-06

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  8. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    NASA Astrophysics Data System (ADS)

    Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard

    2016-12-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  9. Chemical Continuous Time Random Walks

    NASA Astrophysics Data System (ADS)

    Aquino, T.; Dentz, M.

    2017-12-01

    Traditional methods for modeling solute transport through heterogeneous media employ Eulerian schemes to solve for solute concentration. More recently, Lagrangian methods have removed the need for spatial discretization through the use of Monte Carlo implementations of Langevin equations for solute particle motions. While there have been recent advances in modeling chemically reactive transport with recourse to Lagrangian methods, these remain less developed than their Eulerian counterparts, and many open problems such as efficient convergence and reconstruction of the concentration field remain. We explore a different avenue and consider the question: In heterogeneous chemically reactive systems, is it possible to describe the evolution of macroscopic reactant concentrations without explicitly resolving the spatial transport? Traditional Kinetic Monte Carlo methods, such as the Gillespie algorithm, model chemical reactions as random walks in particle number space, without the introduction of spatial coordinates. The inter-reaction times are exponentially distributed under the assumption that the system is well mixed. In real systems, transport limitations lead to incomplete mixing and decreased reaction efficiency. We introduce an arbitrary inter-reaction time distribution, which may account for the impact of incomplete mixing. This process defines an inhomogeneous continuous time random walk in particle number space, from which we derive a generalized chemical Master equation and formulate a generalized Gillespie algorithm. We then determine the modified chemical rate laws for different inter-reaction time distributions. We trace Michaelis-Menten-type kinetics back to finite-mean delay times, and predict time-nonlocal macroscopic reaction kinetics as a consequence of broadly distributed delays. Non-Markovian kinetics exhibit weak ergodicity breaking and show key features of reactions under local non-equilibrium.
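
    The core idea, replacing the exponential inter-reaction time of the classical Gillespie algorithm with an arbitrary delay distribution, can be sketched for a single irreversible reaction A -> B with gamma-distributed waiting times; the propensity and parameters are assumptions, and this is not the paper's generalized Master equation derivation.

```python
import numpy as np

rng = np.random.default_rng(8)

def gillespie_like(a0=1000, k=1.0, shape=3.0, t_max=10.0):
    """Single reaction A -> B with gamma-distributed inter-reaction times.
    shape=1 recovers the classical (exponential, Markovian) Gillespie case."""
    a, t = a0, 0.0
    times, counts = [0.0], [a0]
    while a > 0 and t < t_max:
        rate = k * a                              # propensity of A -> B
        # Gamma waiting time with mean 1/rate, so only the shape (memory) differs.
        t += rng.gamma(shape, 1.0 / (shape * rate))
        a -= 1
        times.append(t)
        counts.append(a)
    return np.array(times), np.array(counts)

t_exp, a_exp = gillespie_like(shape=1.0)   # Markovian reference
t_gam, a_gam = gillespie_like(shape=3.0)   # narrower, non-exponential delays
print("A remaining at end of run:", a_exp[-1], a_gam[-1])
```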

  10. Distributional Effects of Word Frequency on Eye Fixation Durations

    ERIC Educational Resources Information Center

    Staub, Adrian; White, Sarah J.; Drieghe, Denis; Hollway, Elizabeth C.; Rayner, Keith

    2010-01-01

    Recent research using word recognition paradigms, such as lexical decision and speeded pronunciation, has investigated how a range of variables affect the location and shape of response time distributions, using both parametric and non-parametric techniques. In this article, we explore the distributional effects of a word frequency manipulation on…

  11. Mortality risks during extreme temperature events (ETEs) using a distributed lag non-linear model

    NASA Astrophysics Data System (ADS)

    Allen, Michael J.; Sheridan, Scott C.

    2018-01-01

    This study investigates the relationship between all-cause mortality and extreme temperature events (ETEs) from 1975 to 2004. For 50 U.S. locations, these heat and cold events were defined based on location-specific thresholds of daily mean apparent temperature. Heat days were defined by a 3-day mean apparent temperature greater than the 95th percentile while extreme heat days were greater than the 97.5th percentile. Similarly, calculations for cold and extreme cold days relied upon the 5th and 2.5th percentiles. A distributed lag non-linear model assessed the relationship between mortality and ETEs for a cumulative 14-day period following exposure. Subsets for season and duration effect denote the differences between early- and late-season as well as short and long ETEs. While longer-lasting heat days resulted in elevated mortality, early season events also impacted mortality outcomes. Over the course of the summer season, heat-related risk decreased, though prolonged heat days still had a greater influence on mortality. Unlike heat, cold-related risk was greatest in more southerly locations. Risk was highest for early season cold events and decreased over the course of the winter season. Statistically, short episodes of cold showed the highest relative risk, suggesting unsettled weather conditions may have some relationship to cold-related mortality. For both heat and cold, results indicate higher risk to the more extreme thresholds. Risk values provide further insight into the role of adaptation, geographical variability, and acclimatization with respect to ETEs.
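
    The event definitions used above translate directly into code: compute location-specific percentiles of a 3-day mean apparent temperature and flag heat and extreme heat days. The synthetic temperature series below is an assumption standing in for station data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)

# Synthetic daily mean apparent temperature for one location (assumed values).
temps = pd.Series(25 + 5 * rng.standard_normal(3000))
rolling3 = temps.rolling(window=3).mean()          # 3-day mean apparent temperature

heat_thr = rolling3.quantile(0.95)                 # heat-day threshold
extreme_thr = rolling3.quantile(0.975)             # extreme-heat-day threshold

heat_days = rolling3 > heat_thr
extreme_days = rolling3 > extreme_thr
print("heat days:", int(heat_days.sum()),
      "extreme heat days:", int(extreme_days.sum()))
```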

  12. Non-random Mis-segregation of Human Chromosomes.

    PubMed

    Worrall, Joseph Thomas; Tamura, Naoka; Mazzagatti, Alice; Shaikh, Nadeem; van Lingen, Tineke; Bakker, Bjorn; Spierings, Diana Carolina Johanna; Vladimirou, Elina; Foijer, Floris; McClelland, Sarah Elizabeth

    2018-06-12

    A common assumption is that human chromosomes carry equal chances of mis-segregation during compromised cell division. Human chromosomes vary in multiple parameters that might generate bias, but technological limitations have precluded a comprehensive analysis of chromosome-specific aneuploidy. Here, by imaging specific centromeres coupled with high-throughput single-cell analysis as well as single-cell sequencing, we show that aneuploidy occurs non-randomly following common treatments to elevate chromosome mis-segregation. Temporary spindle disruption leads to elevated mis-segregation and aneuploidy of a subset of chromosomes, particularly affecting chromosomes 1 and 2. Unexpectedly, we find that a period of mitotic delay weakens centromeric cohesion and promotes chromosome mis-segregation and that chromosomes 1 and 2 are particularly prone to suffer cohesion fatigue. Our findings demonstrate that inherent properties of individual chromosomes can bias chromosome mis-segregation and aneuploidy rates, with implications for studies on aneuploidy in human disease. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  13. Exceptional diversity, non-random distribution, and rapid evolution of retroelements in the B73 maize genome.

    PubMed

    Baucom, Regina S; Estill, James C; Chaparro, Cristian; Upshaw, Naadira; Jogi, Ansuya; Deragon, Jean-Marc; Westerman, Richard P; Sanmiguel, Phillip J; Bennetzen, Jeffrey L

    2009-11-01

    Recent comprehensive sequence analysis of the maize genome now permits detailed discovery and description of all transposable elements (TEs) in this complex nuclear environment. Reiteratively optimized structural and homology criteria were used in the computer-assisted search for retroelements, TEs that transpose by reverse transcription of an RNA intermediate, with the final results verified by manual inspection. Retroelements were found to occupy the majority (>75%) of the nuclear genome in maize inbred B73. Unprecedented genetic diversity was discovered in the long terminal repeat (LTR) retrotransposon class of retroelements, with >400 families (>350 newly discovered) contributing >31,000 intact elements. The two other classes of retroelements, SINEs (four families) and LINEs (at least 30 families), were observed to contribute 1,991 and approximately 35,000 copies, respectively, or a combined approximately 1% of the B73 nuclear genome. With regard to fully intact elements, median copy numbers for all retroelement families in maize was 2 because >250 LTR retrotransposon families contained only one or two intact members that could be detected in the B73 draft sequence. The majority, perhaps all, of the investigated retroelement families exhibited non-random dispersal across the maize genome, with LINEs, SINEs, and many low-copy-number LTR retrotransposons exhibiting a bias for accumulation in gene-rich regions. In contrast, most (but not all) medium- and high-copy-number LTR retrotransposons were found to preferentially accumulate in gene-poor regions like pericentromeric heterochromatin, while a few high-copy-number families exhibited the opposite bias. Regions of the genome with the highest LTR retrotransposon density contained the lowest LTR retrotransposon diversity. These results indicate that the maize genome provides a great number of different niches for the survival and procreation of a great variety of retroelements that have evolved to differentially

  14. An Isometric Mapping Based Co-Location Decision Tree Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Wei, J.; Zhou, X.; Zhang, R.; Huang, W.; Sha, H.; Chen, J.

    2018-05-01

    Decision tree (DT) induction has been widely used in pattern classification. However, most traditional DTs have the disadvantage that they consider only non-spatial attributes (i.e., spectral information) when classifying pixels, which can result in objects being misclassified. Therefore, some researchers have proposed a co-location decision tree (Cl-DT) method, which combines co-location and decision trees to solve the above-mentioned traditional decision tree problems. Cl-DT overcomes the shortcomings of existing DT algorithms, which create a node for each value of a given attribute, and has a higher accuracy than the existing decision tree approach. However, for non-linearly distributed data instances, the Euclidean distance between instances does not reflect the true positional relationship between them. In order to overcome this shortcoming, this paper proposes an isometric mapping method based on Cl-DT (called Isomap-based Cl-DT), which combines isometric mapping and Cl-DT. Because isometric mapping methods use geodesic distances instead of Euclidean distances between non-linearly distributed instances, the true distance between instances can be reflected. The experimental results and several comparative analyses show that: (1) the extraction method for exposed carbonate rocks is highly accurate, and (2) the proposed method has many advantages, because the total number of nodes and the number of leaf nodes are greatly reduced compared to Cl-DT. Therefore, the Isomap-based Cl-DT algorithm can construct a more accurate and faster decision tree.
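
    A hedged sketch of coupling a geodesic (Isomap) embedding with a decision tree, in the spirit of the Isomap-based Cl-DT described above; scikit-learn estimators and a toy S-curve dataset stand in for the authors' remote-sensing pipeline and co-location rules.

```python
from sklearn.datasets import make_s_curve
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Non-linearly distributed instances, where Euclidean distance misleads.
X, t = make_s_curve(n_samples=1500, random_state=0)
y = (t > t.mean()).astype(int)                    # toy binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Geodesic-distance embedding, then a decision tree on the embedded features.
embed = Isomap(n_neighbors=10, n_components=2).fit(X_train)
tree = DecisionTreeClassifier(max_depth=5, random_state=0)
tree.fit(embed.transform(X_train), y_train)

print("accuracy on embedded test data:",
      tree.score(embed.transform(X_test), y_test))
```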

  15. New distributed fusion filtering algorithm based on covariances over sensor networks with random packet dropouts

    NASA Astrophysics Data System (ADS)

    Caballero-Águila, R.; Hermoso-Carazo, A.; Linares-Pérez, J.

    2017-07-01

    This paper studies the distributed fusion estimation problem from multisensor measured outputs perturbed by correlated noises and uncertainties modelled by random parameter matrices. Each sensor transmits its outputs to a local processor over a packet-erasure channel and, consequently, random losses may occur during transmission. Different white sequences of Bernoulli variables are introduced to model the transmission losses. For the estimation, each lost output is replaced by its estimator based on the information received previously, and only the covariances of the processes involved are used, without requiring the signal evolution model. First, a recursive algorithm for the local least-squares filters is derived by using an innovation approach. Then, the cross-correlation matrices between any two local filters are obtained. Finally, the distributed fusion filter weighted by matrices is obtained from the local filters by applying the least-squares criterion. The performance of the estimators and the influence of both sensor uncertainties and transmission losses on the estimation accuracy are analysed in a numerical example.
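
    The transmission-loss model above (independent Bernoulli sequences per sensor, lost outputs replaced by a prediction from previously received information) can be sketched as follows; the scalar signal, the exponential-smoothing local estimators, and the averaging fusion are assumptions, far simpler than the covariance-based filter derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(10)

steps, n_sensors, p_arrive = 200, 3, 0.8
signal = np.cumsum(0.1 * rng.standard_normal(steps))       # unknown scalar signal

local_est = np.zeros((n_sensors, steps))
for k in range(1, steps):
    for s in range(n_sensors):
        measurement = signal[k] + 0.3 * rng.standard_normal()
        if rng.random() < p_arrive:                          # packet received
            local_est[s, k] = 0.7 * local_est[s, k - 1] + 0.3 * measurement
        else:                                                # packet dropped:
            local_est[s, k] = local_est[s, k - 1]            # hold the last estimate

fused = local_est.mean(axis=0)                               # naive fusion of locals
print("fused RMSE:", np.sqrt(np.mean((fused - signal) ** 2)))
```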

  16. Epidermis Microstructure Inspired Graphene Pressure Sensor with Random Distributed Spinosum for High Sensitivity and Large Linearity.

    PubMed

    Pang, Yu; Zhang, Kunning; Yang, Zhen; Jiang, Song; Ju, Zhenyi; Li, Yuxing; Wang, Xuefeng; Wang, Danyang; Jian, Muqiang; Zhang, Yingying; Liang, Renrong; Tian, He; Yang, Yi; Ren, Tian-Ling

    2018-03-27

    Recently, wearable pressure sensors have attracted tremendous attention because of their potential applications in monitoring physiological signals for human healthcare. Sensitivity and linearity are the two most essential parameters for pressure sensors. Although various designed micro/nanostructure morphologies have been introduced, the trade-off between sensitivity and linearity has not been well balanced. Human skin, which contains force receptors in a reticular layer, has a high sensitivity even for large external stimuli. Herein, inspired by the skin epidermis with high-performance force sensing, we have proposed a special surface morphology with spinosum microstructure of random distribution via the combination of an abrasive paper template and reduced graphene oxide. The sensitivity of the graphene pressure sensor with random distribution spinosum (RDS) microstructure is as high as 25.1 kPa^-1 in a wide linearity range of 0-2.6 kPa. Our pressure sensor exhibits superior comprehensive properties compared with previous surface-modified pressure sensors. According to simulation and mechanism analyses, the spinosum microstructure and random distribution contribute to the high sensitivity and large linearity range, respectively. In addition, the pressure sensor shows promising potential in detecting human physiological signals, such as heartbeat, respiration, phonation, and human motions of a pushup, arm bending, and walking. The wearable pressure sensor array was further used to detect gait states of supination, neutral, and pronation. The RDS microstructure provides an alternative strategy to improve the performance of pressure sensors and extend their potential applications in monitoring human activities.

  17. Non-motor outcomes of subthalamic stimulation in Parkinson's disease depend on location of active contacts.

    PubMed

    Dafsari, Haidar Salimi; Petry-Schmelzer, Jan Niklas; Ray-Chaudhuri, K; Ashkan, Keyoumars; Weis, Luca; Dembek, Till A; Samuel, Michael; Rizos, Alexandra; Silverdale, Monty; Barbe, Michael T; Fink, Gereon R; Evans, Julian; Martinez-Martin, Pablo; Antonini, Angelo; Visser-Vandewalle, Veerle; Timmermann, Lars

    2018-03-16

    Subthalamic nucleus (STN) deep brain stimulation (DBS) improves quality of life (QoL), motor, and non-motor symptoms (NMS) in Parkinson's disease (PD). Few studies have investigated the influence of the location of neurostimulation on NMS. To investigate the impact of active contact location on NMS in STN-DBS in PD. In this prospective, open-label, multicenter study including 50 PD patients undergoing bilateral STN-DBS, we collected NMSScale (NMSS), NMSQuestionnaire (NMSQ), Hospital Anxiety and Depression Scale (anxiety/depression, HADS-A/-D), PDQuestionnaire-8 (PDQ-8), Scales for Outcomes in PD-motor examination, motor complications, activities of daily living (ADL), and levodopa equivalent daily dose (LEDD) preoperatively and at 6 months follow-up. Changes were analyzed with Wilcoxon signed-rank/t-test and Bonferroni-correction for multiple comparisons. Although the STN was targeted visually, we employed an atlas-based approach to explore the relationship between active contact locations and DBS outcomes. Based on fused MRI/CT-images, we identified Cartesian coordinates of active contacts with patient-specific Mai-atlas standardization. We computed linear mixed-effects models with x-/y-/z-coordinates as independent, hemispheres as within-subject, and test change scores as dependent variables. NMSS, NMSQ, PDQ-8, motor examination, complications, and LEDD significantly improved at follow-up. Linear mixed-effect models showed that NMS and QoL improvement significantly depended on more medial (HADS-D, NMSS), anterior (HADS-D, NMSQ, PDQ-8), and ventral (HADS-A/-D, NMSS, PDQ-8) neurostimulation. ADL improved more in posterior, LEDD in lateral neurostimulation locations. No relationship was observed for motor examination and complications scores. Our study provides evidence that more anterior, medial, and ventral STN-DBS is significantly related to more beneficial non-motor outcomes. Copyright © 2018. Published by Elsevier Inc.
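
    A hedged sketch of the statistical model described above (contact coordinates as fixed effects, patients as grouping factor, hemispheres as repeated observations) using a linear mixed-effects model; the synthetic data frame and the column names (patient, hemisphere, x, y, z, nmss_change) are hypothetical stand-ins for the study data.

        # Linear mixed-effects model: outcome change ~ x + y + z, random intercept per patient.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        rows = []
        for pid in range(50):
            for hemi in ("left", "right"):
                x, y, z = rng.normal(size=3)                  # stand-in contact coordinates
                change = -2.0 * z - 1.0 * x + rng.normal()    # synthetic change score
                rows.append({"patient": pid, "hemisphere": hemi,
                             "x": x, "y": y, "z": z, "nmss_change": change})
        df = pd.DataFrame(rows)

        model = smf.mixedlm("nmss_change ~ x + y + z", data=df, groups=df["patient"])
        print(model.fit().summary())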

  18. Effect of platykurtic and leptokurtic distributions in the random-field Ising model: mean-field approach.

    PubMed

    Duarte Queirós, Sílvio M; Crokidakis, Nuno; Soares-Pinto, Diogo O

    2009-07-01

    The influence of the tail features of the local magnetic field probability density function (PDF) on the ferromagnetic Ising model is studied in the limit of infinite-range interactions. Specifically, we assign to each site a quenched random field whose value is drawn from a generic distribution that covers both platykurtic and leptokurtic cases depending on a single parameter tau<3. For tau<5/3, such distributions, which are basically the Student-t and r-distributions extended to all plausible real degrees of freedom, present a finite standard deviation; otherwise, the distribution has the same asymptotic power-law behavior as an alpha-stable Lévy distribution with alpha=(3-tau)/(tau-1). For every value of tau, at a specific temperature and width of the distribution, the system undergoes a continuous phase transition. Strikingly, we report the emergence of an inflexion point in the temperature-PDF width phase diagrams for distributions broader than the Cauchy-Lorentz (tau=2), which is accompanied by a divergent free energy per spin (at zero temperature).
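
    For readers who want to reproduce the mean-field setup numerically, the sketch below iterates the self-consistency equation m = ∫ dh P(h) tanh(beta(Jm + h)) for a quenched random field drawn from a Student-t PDF; this is a generic illustration of the mean-field random-field Ising model, not the paper's tau-parametrised family or its phase-diagram calculation.

        # Fixed-point iteration of the mean-field self-consistency equation
        # m = integral dh P(h) tanh(beta*(J*m + h)) with a Student-t field PDF.
        import numpy as np
        from scipy import stats, integrate

        def magnetisation(beta, J=1.0, width=0.5, dof=3.0, m0=0.9, tol=1e-10):
            pdf = stats.t(df=dof, scale=width).pdf
            m = m0
            for _ in range(10_000):
                integrand = lambda h: pdf(h) * np.tanh(beta * (J * m + h))
                m_new, _ = integrate.quad(integrand, -np.inf, np.inf)
                if abs(m_new - m) < tol:
                    break
                m = m_new
            return m

        for T in (0.2, 0.6, 1.0, 1.4):
            print(T, magnetisation(beta=1.0 / T))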

  19. Distribution and correlates of non-high-density lipoprotein cholesterol and triglycerides in Lebanese school children.

    PubMed

    Gannagé-Yared, Marie-Hélène; Farah, Vanessa; Chahine, Elise; Balech, Nicole; Ibrahim, Toni; Asmar, Nadia; Barakett-Hamadé, Vanda; Jambart, Selim

    2016-01-01

    The prevalence of dyslipidemia in pediatric Middle-Eastern populations is unknown. Our study aims to investigate the distribution and correlates of non-high-density lipoprotein cholesterol (non-HDL-C) and triglycerides among Lebanese school children. A total of 969 subjects aged 8-18 years were included in the study (505 boys and 464 girls). Recruitment was done from 10 schools located in the Great Beirut and Mount-Lebanon areas. Non-fasting total cholesterol, triglycerides, and HDL-cholesterol (HDL-C) were measured. Non-HDL-C was calculated. Schools were categorized into 3 socioeconomic statuses (SESs; low, middle, and high). In the overall population, the prevalences of high non-HDL-C (>3.8 mmol/L), very high non-HDL-C (>4.9 mmol/L), and high triglycerides (>1.5 mmol/L) are 9.2%, 1.24%, and 26.6%, respectively. There is no significant gender difference for non-HDL-C or triglycerides. Non-HDL-C and triglycerides are inversely correlated with age in girls (P < .0001 for both variables) but not in boys. They are also positively correlated with body mass index (BMI) in boys and girls (P < .0001 for all variables). There is no relationship between schools' socioeconomic status (SES) and non-HDL-C. However, triglycerides are higher in children from lower SES schools. After adjustment for age and BMI, testosterone is inversely associated with triglycerides in boys (P < .0001). In a multivariate regression analysis, non-HDL-C is independently associated with age and BMI in girls (P < .0001 for both variables) but only with BMI in boys (P < .0001), whereas triglycerides are independently associated with BMI and schools' SES in both girls and boys. This study confirms, in our population, the association between obesity and both high non-HDL-C and triglycerides, and between high triglycerides and low SES. Copyright © 2016 National Lipid Association. Published by Elsevier Inc. All rights reserved.

  20. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble

    NASA Astrophysics Data System (ADS)

    Müller, Christian L.; Sbalzarini, Ivo F.; van Gunsteren, Wilfred F.; Žagrović, Bojan; Hünenberger, Philippe H.

    2009-06-01

    The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N =3,…,6 beads (or up to N =10 for some properties) are investigated using a systematic (grid-based) approach based on a simple working definition of shapes relying on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble appears to represent the epitome of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N =3,…,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N =100). The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely, that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds;" (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments
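
    A minimal sketch of the shape-assignment idea without the grid: generate ideal random walks, superpose pairs with the Kabsch algorithm, and count how many walks fall within a single RMSD cutoff of a reference conformation. The number of beads, cutoff and sample size are illustrative.

        # Ideal random walks, pairwise minimum RMSD (Kabsch), single-cutoff shape assignment.
        import numpy as np

        rng = np.random.default_rng(0)

        def random_walk(n_beads):
            steps = rng.normal(size=(n_beads - 1, 3))
            steps /= np.linalg.norm(steps, axis=1, keepdims=True)    # unit-length bonds
            walk = np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])
            return walk - walk.mean(axis=0)                           # centre at origin

        def rmsd(a, b):
            """Minimum RMSD between centred conformations a and b (Kabsch superposition)."""
            u, s, vt = np.linalg.svd(a.T @ b)
            d = np.sign(np.linalg.det(vt.T @ u.T))
            rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
            return np.sqrt(np.mean(np.sum((a @ rot.T - b) ** 2, axis=1)))

        walks = [random_walk(6) for _ in range(2000)]
        ref, cutoff = walks[0], 0.5
        neighbours = sum(rmsd(ref, w) < cutoff for w in walks[1:])
        print("fraction within cutoff of the reference shape:", neighbours / (len(walks) - 1))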

  1. Development of extended release dosage forms using non-uniform drug distribution techniques.

    PubMed

    Huang, Kuo-Kuang; Wang, Da-Peng; Meng, Chung-Ling

    2002-05-01

    Development of an extended release oral dosage form for nifedipine using the non-uniform drug distribution matrix method was conducted. The process conducted in a fluid bed processing unit was optimized by controlling the concentration gradient of nifedipine in the coating solution and the spray rate applied to the non-pareil beads. The concentration of nifedipine in the coating was controlled by instantaneous dilutions of coating solution with polymer dispersion transported from another reservoir into the coating solution at a controlled rate. The USP dissolution method equipped with paddles at 100 rpm in 0.1 N hydrochloric acid solution maintained at 37 degrees C was used for the evaluation of release rate characteristics. Results indicated that (1) an increase in the ethyl cellulose content in the coated beads decreased the nifedipine release rate, (2) incorporation of water-soluble sucrose into the formulation increased the release rate of nifedipine, and (3) adjustment of the spray coating solution and the transport rate of polymer dispersion could achieve a dosage form with a zero-order release rate. Since zero-order release rate and constant plasma concentration were achieved in this study using the non-uniform drug distribution technique, further studies to determine in vivo/in vitro correlation with various non-uniform drug distribution dosage forms will be conducted.

  2. A non-invasive Hall current distribution measurement system for Hall Effect thrusters

    NASA Astrophysics Data System (ADS)

    Mullins, Carl Raymond

    A direct, accurate method to measure thrust produced by a Hall Effect thruster on orbit does not currently exist. The ability to calculate produced thrust will enable timely and precise maneuvering of spacecraft---a capability particularly important to satellite formation flying. The means to determine thrust directly is achievable by remotely measuring the magnetic field of the thruster and solving the inverse magnetostatic problem for the Hall current density distribution. For this thesis, the magnetic field was measured by employing an array of eight tunneling magnetoresistive (TMR) sensors capable of milligauss sensitivity when placed in a high background field. The array was positioned outside the channel of a 1.5 kW Colorado State University Hall thruster equipped with a center-mounted electride cathode. In this location, the static magnetic field is approximately 30 Gauss, which is within the linear operating range of the TMR sensors. Furthermore, the induced field at this distance is greater than tens of milligauss, which is within the sensitivity range of the TMR sensors. Due to the nature of the inverse problem, the induced-field measurements do not provide the Hall current density by a simple inversion; however, a Tikhonov regularization of the induced field along with a non-negativity constraint and a zero boundary condition provides current density distributions. Our system measures the sensor outputs at 2 MHz allowing the determination of the Hall current density distribution as a function of time. These data are shown in contour plots in sequential frames. The measured ratios between the average Hall current and the discharge current ranged from 0.1 to 10 over a range of operating conditions from 1.3 kW to 2.2 kW. The temporal inverse solution at 2.0 kW exhibited a breathing mode of 37 kHz, which was in agreement with temporal measurements of the discharge current.
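
    A compact sketch of the regularised, non-negative inversion step: solving min ||A j - b||^2 + lam ||j||^2 with j >= 0 by stacking the Tikhonov term onto the design matrix and calling a non-negative least-squares solver. The forward matrix A and sensor vector b are random stand-ins for the actual magnetostatic kernel and TMR readings.

        # Tikhonov-regularised, non-negative inversion for a current density distribution.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        n_sensors, n_cells = 8, 40
        A = rng.random((n_sensors, n_cells))            # hypothetical forward model: cell -> sensor field
        j_true = np.zeros(n_cells)
        j_true[15:25] = 1.0
        b = A @ j_true + 0.01 * rng.normal(size=n_sensors)

        lam = 0.1                                       # regularisation strength
        A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n_cells)])
        b_aug = np.concatenate([b, np.zeros(n_cells)])

        j_hat, residual = nnls(A_aug, b_aug)            # non-negative, regularised solution
        print(j_hat.round(2))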

  3. The adaptive approach for storage assignment by mining data of warehouse management system for distribution centres

    NASA Astrophysics Data System (ADS)

    Ming-Huang Chiang, David; Lin, Chia-Ping; Chen, Mu-Chen

    2011-05-01

    Among distribution centre operations, order picking has been reported to be the most labour-intensive activity. Sophisticated storage assignment policies adopted to reduce the travel distance of order picking have been explored in the literature. Unfortunately, previous research has been devoted to locating entire products from scratch. Instead, this study intends to propose an adaptive approach, a Data Mining-based Storage Assignment approach (DMSA), to find the optimal storage assignment for newly delivered products that need to be put away when there is vacant shelf space in a distribution centre. In the DMSA, a new association index (AIX) is developed to evaluate the fitness between the put-away products and the unassigned storage locations by applying association rule mining. With AIX, the storage location assignment problem (SLAP) can be formulated and solved as a binary integer programming problem. To evaluate the performance of DMSA, a real-world order database of a distribution centre is obtained and used to compare the results from DMSA with a random assignment approach. It turns out that DMSA outperforms random assignment as the number of put-away products and the proportion of put-away products with high turnover rates increase.

  4. Non-normal Distributions Commonly Used in Health, Education, and Social Sciences: A Systematic Review

    PubMed Central

    Bono, Roser; Blanca, María J.; Arnau, Jaume; Gómez-Benito, Juana

    2017-01-01

    Statistical analysis is crucial for research and the choice of analytical technique should take into account the specific distribution of data. Although the data obtained from health, educational, and social sciences research are often not normally distributed, there are very few studies detailing which distributions are most likely to represent data in these disciplines. The aim of this systematic review was to determine the frequency of appearance of the most common non-normal distributions in the health, educational, and social sciences. The search was carried out in the Web of Science database, from which we retrieved the abstracts of papers published between 2010 and 2015. The selection was made on the basis of the title and the abstract, and was performed independently by two reviewers. The inter-rater reliability for article selection was high (Cohen’s kappa = 0.84), and agreement regarding the type of distribution reached 96.5%. A total of 262 abstracts were included in the final review. The distribution of the response variable was reported in 231 of these abstracts, while in the remaining 31 it was merely stated that the distribution was non-normal. In terms of their frequency of appearance, the most-common non-normal distributions can be ranked in descending order as follows: gamma, negative binomial, multinomial, binomial, lognormal, and exponential. In addition to identifying the distributions most commonly used in empirical studies these results will help researchers to decide which distributions should be included in simulation studies examining statistical procedures. PMID:28959227

  5. Optimal random Lévy-loop searching: New insights into the searching behaviours of central-place foragers

    NASA Astrophysics Data System (ADS)

    Reynolds, A. M.

    2008-04-01

    A random Lévy-looping model of searching is devised and optimal random Lévy-looping searching strategies are identified for the location of a single target whose position is uncertain. An inverse-square power law distribution of loop lengths is shown to be optimal when the distance between the centre of the search and the target is much shorter than the size of the longest possible loop in the searching pattern. Optimal random Lévy-looping searching patterns have recently been observed in the flight patterns of honeybees (Apis mellifera) when attempting to locate their hive and when searching after a known food source becomes depleted. It is suggested that the searching patterns of desert ants (Cataglyphis) are consistent with the adoption of an optimal Lévy-looping searching strategy.
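
    A small illustration of the loop-length statistics discussed above: an inverse-transform sampler for an inverse-square power-law distribution of loop lengths on a bounded interval; the bounds are arbitrary assumptions.

        # Inverse-transform sampling of p(l) proportional to l**-2 on [l_min, l_max].
        import numpy as np

        def sample_loop_lengths(n, l_min=1.0, l_max=1000.0, rng=None):
            rng = rng or np.random.default_rng()
            u = rng.random(n)
            # CDF inversion: l = 1 / (1/l_min - u*(1/l_min - 1/l_max))
            return 1.0 / (1.0 / l_min - u * (1.0 / l_min - 1.0 / l_max))

        lengths = sample_loop_lengths(100_000)
        print("mean loop length:", lengths.mean())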

  6. Luminosity distance in Swiss-cheese cosmology with randomized voids and galaxy halos

    NASA Astrophysics Data System (ADS)

    Flanagan, Éanna É.; Kumar, Naresh; Wasserman, Ira

    2013-08-01

    We study the fluctuations in luminosity distance due to gravitational lensing produced both by galaxy halos and large-scale voids. Voids are represented via a “Swiss-cheese” model consisting of a ΛCDM Friedmann-Robertson-Walker background from which a number of randomly distributed, spherical regions of comoving radius 35 Mpc are removed. A fraction of the removed mass is then placed on the shells of the spheres, in the form of randomly located halos. The halos are assumed to be nonevolving and are modeled with Navarro-Frenk-White profiles of a fixed mass. The remaining mass is placed in the interior of the spheres, either smoothly distributed or as randomly located halos. We compute the distribution of magnitude shifts using a variant of the method of Holz and Wald [Phys. Rev. D 58, 063501 (1998)], which includes the effect of lensing shear. In the two models we consider, the standard deviation of this distribution is 0.065 and 0.072 magnitudes and the mean is -0.0010 and -0.0013 magnitudes, for voids of radius 35 Mpc and the sources at redshift 1.5, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation due to voids and halos is a factor ˜3 larger than that due to 35 Mpc voids alone with a 1 Mpc shell thickness, which we studied in our previous work. We also study the effect of the existence of evacuated voids, by comparing to a model where all the halos are randomly distributed in the interior of the sphere with none on its surface. This does not significantly change the variance but does significantly change the demagnification tail. To a good approximation, the variance of the distribution depends only on the mean column density of halos (halo mass divided by its projected area), the concentration parameter of the halos, and the fraction of the mass density that is in the form of halos (as opposed to smoothly distributed); it is independent of how the halos are distributed in space. We derive an approximate analytic

  7. Lidar transmitter offers "non-diffracting" property through short distance in highly-dense random media

    NASA Astrophysics Data System (ADS)

    Alifu, Xiafukaiti; Ziqi, Peng; Shiina, Tatsuo

    2018-04-01

    A non-diffracting beam (NDB) is useful as a lidar transmitter because of its high propagation efficiency and high resolution. We aimed to generate an NDB in random media such as haze and cloud. The laboratory experiment was conducted with diluted processed milk (fat: 1.8%, 1.1 μmφ). A narrow-view-angle detector of 5.5 mrad was used to detect the forward-scattering waveform. We obtained the central peak of the NDB at propagation distances of 5 cm to 30 cm in the random media by adjusting the concentration to <10%.

  8. The investigation of social networks based on multi-component random graphs

    NASA Astrophysics Data System (ADS)

    Zadorozhnyi, V. N.; Yudin, E. B.

    2018-01-01

    Methods for calibrating non-homogeneous random graphs are developed for social network simulation. The graphs are calibrated by the degree distributions of the vertices and the edges. The mathematical foundation of the methods is formed by the theory of random graphs with the nonlinear preferential attachment rule and the theory of Erdős-Rényi random graphs. In fact, well-calibrated network graph models and computer experiments with these models would help developers (owners) of the networks to predict their development correctly and to choose effective strategies for controlling network projects.
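
    A toy generator for the model class mentioned above, growing a graph under a nonlinear preferential attachment rule in which a new vertex attaches with probability proportional to degree**alpha; it is a generic illustration, not the calibration method of the paper.

        # Growing graph with nonlinear preferential attachment (attachment weight = degree**alpha).
        import random
        from collections import defaultdict

        def grow_graph(n_vertices, alpha=1.2, seed=0):
            random.seed(seed)
            degree = defaultdict(int)
            edges = [(0, 1)]
            degree[0] = degree[1] = 1
            for new in range(2, n_vertices):
                weights = [degree[v] ** alpha for v in range(new)]
                target = random.choices(range(new), weights=weights, k=1)[0]
                edges.append((new, target))
                degree[new] += 1
                degree[target] += 1
            return edges, degree

        edges, degree = grow_graph(10_000)
        print("max degree:", max(degree.values()))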

  9. Online Distributed Learning Over Networks in RKH Spaces Using Random Fourier Features

    NASA Astrophysics Data System (ADS)

    Bouboulis, Pantelis; Chouvardas, Symeon; Theodoridis, Sergios

    2018-04-01

    We present a novel diffusion scheme for online kernel-based learning over networks. So far, a major drawback of any online learning algorithm, operating in a reproducing kernel Hilbert space (RKHS), is the need for updating a growing number of parameters as time iterations evolve. Besides complexity, this leads to an increased need for communication resources in a distributed setting. In contrast, the proposed method approximates the solution as a fixed-size vector (of larger dimension than the input space) using Random Fourier Features. This paves the way to use standard linear combine-then-adapt techniques. To the best of our knowledge, this is the first time that a complete protocol for distributed online learning in RKHS is presented. Conditions for asymptotic convergence and boundedness of the networkwise regret are also provided. The simulated tests illustrate the performance of the proposed scheme.
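
    The classic Random Fourier Features construction underlying such schemes can be sketched in a few lines: approximate the Gaussian kernel by a fixed-size feature map so that inner products of features estimate kernel values. The dimensions and kernel width below are illustrative.

        # Random Fourier Features approximation of the RBF kernel k(x, y) = exp(-gamma*||x - y||^2).
        import numpy as np

        rng = np.random.default_rng(0)
        d, D, gamma = 5, 500, 0.5                         # input dim, feature dim, kernel width

        W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))
        b = rng.uniform(0, 2 * np.pi, size=D)

        def z(x):
            """Fixed-size random Fourier feature map; z(x).z(y) approximates k(x, y)."""
            return np.sqrt(2.0 / D) * np.cos(W @ x + b)

        x, y = rng.normal(size=d), rng.normal(size=d)
        exact = np.exp(-gamma * np.sum((x - y) ** 2))
        approx = z(x) @ z(y)
        print(exact, approx)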

  10. Characterizing the strand-specific distribution of non-CpG methylation in human pluripotent cells.

    PubMed

    Guo, Weilong; Chung, Wen-Yu; Qian, Minping; Pellegrini, Matteo; Zhang, Michael Q

    2014-03-01

    DNA methylation is an important defense and regulatory mechanism. In mammals, most DNA methylation occurs at CpG sites, and asymmetric non-CpG methylation has only been detected at appreciable levels in a few cell types. We are the first to systematically study the strand-specific distribution of non-CpG methylation. With the divide-and-compare strategy, we show that CHG and CHH methylation are not intrinsically different in human embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs). We also find that non-CpG methylation is skewed between the two strands in introns, especially at intron boundaries and in highly expressed genes. Controlling for the proximal sequences of non-CpG sites, we show that the skew of non-CpG methylation in introns is mainly guided by sequence skew. By studying subgroups of transposable elements, we also found that non-CpG methylation is distributed in a strand-specific manner in both short interspersed nuclear elements (SINE) and long interspersed nuclear elements (LINE), but not in long terminal repeats (LTR). Finally, we show that on the antisense strand of Alus, a non-CpG site just downstream of the A-box is highly methylated. Together, the divide-and-compare strategy leads us to identify regions with strand-specific distributions of non-CpG methylation in humans.

  11. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-12-01

    This annual technical progress report is for part of Task 4 (site evaluation), Task 5 (2D seismic design, acquisition, and processing), and Task 6 (2D seismic reflection, interpretation, and AVO analysis) on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to another site other than Savannah River Site (SRS) or DOE Hanford Site. After the SUBCON midyear review in Albuquerque, NM, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in removal of DNAPL. The second deployment is to the Department of Defense (DOD) Charleston Naval Weapons Station Solid Waste Management Unit 12 (SWMU-12), Charleston, SC to further test the technique to detect high concentrations of DNAPL. The Charleston Naval Weapons Station SWMU-12 site was selected in consultation with National Energy Technology Laboratory (NETL) and DOD Naval Facilities Engineering Command Southern Division (NAVFAC) personnel. Based upon the review of existing data and due to the shallow target depth, the project team collected three Vertical Seismic Profiles (VSP) and an experimental P-wave seismic reflection line. After preliminary data analysis of the VSP data and the experimental reflection line data, it was decided to proceed with Task 5 and Task 6. Three high resolution P-wave reflection profiles were collected with two objectives: (1) design the reflection survey to image a target depth of 20 feet below land surface to assist in determining the geologic controls on the DNAPL plume geometry, and (2) apply AVO analysis to the seismic data to locate the zone of high concentration of DNAPL. Based upon the results of the data processing and interpretation of the seismic data, the project team was able to map the channel that is controlling the DNAPL

  12. Physically transient photonics: random versus distributed feedback lasing based on nanoimprinted DNA.

    PubMed

    Camposeo, Andrea; Del Carro, Pompilio; Persano, Luana; Cyprych, Konrad; Szukalski, Adam; Sznitko, Lech; Mysliwiec, Jaroslaw; Pisignano, Dario

    2014-10-28

    Room-temperature nanoimprinted, DNA-based distributed feedback (DFB) laser operation at 605 nm is reported. The laser is made of a pure DNA host matrix doped with gain dyes. At high excitation densities, the emission of the untextured dye-doped DNA films is characterized by a broad emission peak with an overall line width of 12 nm and superimposed narrow peaks, characteristic of random lasing. Moreover, direct patterning of the DNA films is demonstrated with a resolution down to 100 nm, enabling the realization of both surface-emitting and edge-emitting DFB lasers with a typical line width of <0.3 nm. The resulting emission is polarized, with a ratio between the TE- and TM-polarized intensities exceeding 30. In addition, the nanopatterned devices dissolve in water within less than 2 min. These results demonstrate the possibility of realizing various physically transient nanophotonics and laser architectures, including random lasing and nanoimprinted devices, based on natural biopolymers.

  13. Simulating of the measurement-device independent quantum key distribution with phase randomized general sources

    PubMed Central

    Wang, Qin; Wang, Xiang-Bin

    2014-01-01

    We present a model on the simulation of the measurement-device independent quantum key distribution (MDI-QKD) with phase randomized general sources. It can be used to predict experimental observations of an MDI-QKD with linear channel loss, simulating corresponding values for the gains, the error rates in different bases, and also the final key rates. Our model is applicable to MDI-QKDs with an arbitrary probabilistic mixture of different photon states or using any coding schemes. Therefore, it is useful in characterizing and evaluating the performance of the MDI-QKD protocol, making it a valuable tool in studying quantum key distribution. PMID:24728000

  14. Towards an accurate real-time locator of infrasonic sources

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Blom, P.; Polozov, A.; Marcillo, O.; Arrowsmith, S.; Hofstetter, A.

    2017-11-01

    Infrasonic signals propagate from an atmospheric source via media with stochastic and fast space-varying conditions. Hence, their travel time, the amplitude at sensor recordings and even manifestation in the so-called "shadow zones" are random. Therefore, the traditional least-squares technique for locating infrasonic sources is often not effective, and the problem for the best solution must be formulated in probabilistic terms. Recently, a series of papers has been published about the Bayesian Infrasonic Source Localization (BISL) method based on the computation of the posterior probability density function (PPDF) of the source location, as a convolution of the a priori probability distribution function (APDF) of the propagation model parameters with the likelihood function (LF) of observations. The present study is devoted to the further development of BISL for higher accuracy and stability of the source location results and a reduced computational load. We critically analyse previous algorithms and propose several new ones. First of all, we describe the general PPDF formulation and demonstrate that this relatively slow algorithm might be among the most accurate algorithms, provided that adequate APDF and LF are used. Then, we suggest using summation instead of integration in a general PPDF calculation for increased robustness, but this leads us to the 3D space-time optimization problem. Two different forms of APDF approximation are considered and applied for the PPDF calculation in our study. One of them, previously suggested but not yet properly used, is the so-called "celerity-range histograms" (CRHs). The other follows from previous findings of a linear mean travel time for the first four infrasonic phases in the overlapping consecutive distance ranges. This stochastic model is extended here to the regional distance of 1000 km, and the APDF introduced is the probabilistic form of the junction between this travel time model and range-dependent probability
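
    A hedged, grid-based sketch of the PPDF idea: the posterior over candidate source locations is proportional to a spatial prior times the likelihood of the observed arrival times under a Gaussian celerity model. Station positions, arrival times, the celerity statistics and the assumption of a known origin time are all made-up simplifications, not the algorithms of the paper.

        # Grid-based posterior over candidate source locations from infrasonic arrival times.
        import numpy as np

        stations = np.array([[0.0, 0.0], [200.0, 50.0], [80.0, 260.0]])   # km
        arrivals = np.array([400.0, 330.0, 250.0])                        # s after assumed origin time
        cel_mean, cel_std = 0.30, 0.04                                    # km/s celerity model

        x = np.linspace(-100, 300, 401)
        y = np.linspace(-100, 300, 401)
        X, Y = np.meshgrid(x, y)

        log_post = np.zeros_like(X)                                       # flat spatial prior
        for (sx, sy), t_obs in zip(stations, arrivals):
            r = np.maximum(np.hypot(X - sx, Y - sy), 1e-3)
            t_pred = r / cel_mean
            sigma_t = r * cel_std / cel_mean**2                           # celerity spread -> time spread
            log_post += -0.5 * ((t_obs - t_pred) / sigma_t) ** 2 - np.log(sigma_t)

        i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
        print("maximum a posteriori source location (km):", x[j], y[i])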

  15. Eternal non-Markovianity: from random unitary to Markov chain realisations.

    PubMed

    Megier, Nina; Chruściński, Dariusz; Piilo, Jyrki; Strunz, Walter T

    2017-07-25

    The theoretical description of quantum dynamics in an intriguing way does not necessarily imply the underlying dynamics is indeed intriguing. Here we show how a known very interesting master equation with an always negative decay rate [eternal non-Markovianity (ENM)] arises from simple stochastic Schrödinger dynamics (random unitary dynamics). Equivalently, it may be seen as arising from a mixture of Markov (semi-group) open system dynamics. Both these approaches lead to a more general family of CPT maps, characterized by a point within a parameter triangle. Our results show how ENM quantum dynamics can be realised easily in the laboratory. Moreover, we find a quantum time-continuously measured (quantum trajectory) realisation of the dynamics of the ENM master equation based on unitary transformations and projective measurements in an extended Hilbert space, guided by a classical Markov process. Furthermore, a Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) representation of the dynamics in an extended Hilbert space can be found, with a remarkable property: there is no dynamics in the ancilla state. Finally, analogous constructions for two qubits extend these results from non-CP-divisible to non-P-divisible dynamics.

  16. Electromagnetic backscattering from a random distribution of lossy dielectric scatterers

    NASA Technical Reports Server (NTRS)

    Lang, R. H.

    1980-01-01

    Electromagnetic backscattering from a sparse distribution of discrete lossy dielectric scatterers occupying a region V was studied. The scatterers are assumed to have random position and orientation. Scattered fields are calculated by first finding the mean field and then by using it to define an equivalent medium within the volume V. The scatterers are then viewed as being embedded in the equivalent medium; the distorted Born approximation is then used to find the scattered fields. This technique represents an improvement over the standard Born approximation since it takes into account the attenuation of the incident and scattered waves in the equivalent medium. The method is used to model a leaf canopy when the leaves are modeled by lossy dielectric discs.

  17. Non-Random Distribution of 5S rDNA Sites and Its Association with 45S rDNA in Plant Chromosomes.

    PubMed

    Roa, Fernando; Guerra, Marcelo

    2015-01-01

    5S and 45S rDNA sites are the best mapped chromosome regions in eukaryotic chromosomes. In this work, a database was built gathering information about the position and number of 5S rDNA sites in 784 plant species, aiming to identify patterns of distribution along the chromosomes and its correlation with the position of 45S rDNA sites. Data revealed that in most karyotypes (54.5%, including polyploids) two 5S rDNA sites (a single pair) are present, with 58.7% of all sites occurring in the short arm, mainly in the proximal region. In karyotypes of angiosperms with only 1 pair of sites (single sites) they are mostly found in the proximal region (52.0%), whereas in karyotypes with multiple sites the location varies according to the average chromosome size. Karyotypes with multiple sites and small chromosomes (<3 µm) often display proximal sites, while medium-sized (between 3 and 6 µm) and large chromosomes (>6 µm) more commonly show terminal or interstitial sites. In species with holokinetic chromosomes, the modal value of sites per karyotype was also 2, but they were found mainly in a terminal position. Adjacent 5S and 45S rDNA sites were often found in the short arm, reflecting the preferential distribution of both sites in this arm. The high frequency of genera with at least 1 species with adjacent 5S and 45S sites reveals that this association appeared several times during angiosperm evolution, but it has been maintained only rarely as the dominant array in plant genera. © 2015 S. Karger AG, Basel.

  18. Health Messaging and African-American Infant Sleep Location: A Randomized Controlled Trial.

    PubMed

    Moon, Rachel Y; Mathews, Anita; Joyner, Brandi L; Oden, Rosalind P; He, Jianping; McCarter, Robert

    2017-02-01

    Infant-parent bedsharing increases the risk of SIDS and other sleep-related deaths. Despite AAP recommendations to avoid bedsharing, public health efforts have been unsuccessful in changing behaviors. African-American infants are more than twice as likely to die from SIDS and other sleep-related deaths, and are also twice as likely to bedshare with their parents. Further, African-American parents have a high degree of self-efficacy with regard to preventing infant suffocation, but low self-efficacy with regard to SIDS risk reduction. It is unclear whether messages emphasizing suffocation prevention will decrease bedsharing. To evaluate the impact of specific health messages on African-American parental decisions regarding infant sleep location. We conducted a randomized, controlled trial of African-American mothers of infants. The control group received standard messaging emphasizing AAP-recommended safe sleep practices, including avoidance of bedsharing, for the purposes of SIDS risk reduction. The intervention group received enhanced messaging emphasizing safe sleep practices, including avoidance of bedsharing, for both SIDS risk reduction and suffocation prevention. Participants completed interviews at 2-3 weeks, 2-3 months, and 5-6 months after the infant's birth. 1194 mothers were enrolled in the study, and 637 completed all interviews. Bedsharing, both usually (aOR 1.005 [95 % CI 1.003, 1.006]) and last night (aOR 1.004 [95 % CI 1.002, 1.007]), increased slightly but statistically significantly with infant age (p < 0.001). Receipt of the enhanced message had no impact on sleep location. Maternal belief that bedsharing increased the risk of SIDS or suffocation declined over 6 months (p < 0.001) and did not differ by group assignment. African-American mothers who received an enhanced message about SIDS risk reduction and suffocation prevention were no less likely to bedshare with their infants. ClinicalTrials.gov identifier NCT01361880.

  19. A distributed scheduling algorithm for heterogeneous real-time systems

    NASA Technical Reports Server (NTRS)

    Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi

    1991-01-01

    Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other more complex load allocation policies. The effects of heterogeneity on scheduling algorithms for hard real-time systems are examined. A distributed scheduler designed specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While a random task allocation is very sensitive to heterogeneities, the algorithm is shown to be robust to such non-uniformities in system components and load.

  20. Distribution and speciation of metals (Cu, Zn, Cd, and Pb) in agricultural and non-agricultural soils near a stream upriver from the Pearl River, China.

    PubMed

    Yang, Silin; Zhou, Dequn; Yu, Huayong; Wei, Rong; Pan, Bo

    2013-06-01

    The distribution and chemical speciation of typical metals (Cu, Zn, Cd and Pb) in agricultural and non-agricultural soils were investigated in the area of Nanpan River, upstream of the Pearl River. The four investigated metals showed higher concentrations in agricultural soils than in non-agricultural soils, and the site located in the factory district contained much higher metal concentrations than the other sampling sites. These observations suggested that human activities, such as water irrigation and fertilizer and pesticide applications, might have a major impact on the distribution of metals. Metal speciation analysis showed that Cu, Zn and Cd were dominated by the residual fraction, while Pb was dominated by the reducible fraction. Because of the low mobility of the metals in the investigated area, no remarkable difference could be observed between upstream and downstream separated by the factory site. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Sub-micron particle number size distribution characteristics at two urban locations in Leicester

    NASA Astrophysics Data System (ADS)

    Hama, Sarkawt M. L.; Cordell, Rebecca L.; Kos, Gerard P. A.; Weijers, E. P.; Monks, Paul S.

    2017-09-01

    The particle number size distribution (PNSD) of atmospheric particles not only provides information about sources and atmospheric processing of particles, but also plays an important role in determining regional lung dose. Owing to the importance of PNSD in understanding particulate pollution, two short-term measurement campaigns (March-June 2014) of sub-micron PNSD were conducted at two urban background locations in Leicester, UK. At the first site, Leicester Automatic Urban Rural Network (AURN), the mean number concentrations of the nucleation, Aitken, and accumulation modes and the total particles, together with the equivalent black carbon (eBC) mass concentration, were 2002, 3258, 1576, and 6837 # cm^-3 and 1.7 μg m^-3, respectively, and at the second site, Brookfield (BF), were 1455, 2407, 874, and 4737 # cm^-3 and 0.77 μg m^-3, respectively. The total particle number was dominated by the nucleation and Aitken modes, which together accounted for 77% and 81% of the total number concentrations at the AURN and BF sites, respectively. This behaviour could be attributed to primary emissions (traffic) of ultrafine particles and the temporal evolution of the mixing layer. The size distribution at the AURN site shows a bimodal distribution at 22 nm with a minor peak at 70 nm. The size distribution at the BF site, however, exhibits a unimodal distribution at 35 nm. This study has for the first time investigated the effect of the Easter holiday on PNSD in the UK. The temporal variation of PNSD demonstrated a good degree of correlation with traffic-related pollutants (NOX and eBC at both sites). The meteorological conditions also had an impact on the PNSD and eBC at both sites. During the measurement period, the frequency of NPF events was calculated to be 13.3% and 22.2% at the AURN and BF sites, respectively. The average formation and growth rates of nucleation mode particles were 1.3 and 1.17 cm^-3 s^-1 and 7.42 and 5.3 nm h^-1 at the AURN and BF sites, respectively. It can be suggested that aerosol particles in Leicester originate mainly

  2. Pure random search for ambient sensor distribution optimisation in a smart home environment.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2011-01-01

    Smart homes are living spaces facilitated with technology to allow individuals to remain in their own homes for longer, rather than be institutionalised. Sensors are the fundamental physical layer within any smart home, as the data they generate are used to inform decision support systems, facilitating appropriate actuator actions. Positioning of sensors is therefore a fundamental characteristic of a smart home. Contemporary smart home sensor distribution is aligned to either a) a total coverage approach or b) a human assessment approach. These methods for sensor arrangement are not data-driven strategies; they are unempirical and frequently irrational. This study hypothesised that sensor deployment directed by an optimisation method that utilises inhabitants' spatial frequency data as the search space would produce better sensor distributions than the current method of sensor deployment by engineers. Seven human engineers were tasked to create sensor distributions based on perceived utility for 9 deployment scenarios. A Pure Random Search (PRS) algorithm was then tasked to create matched sensor distributions. The PRS method produced superior distributions in 98.4% of test cases (n=64) against human-engineer-instructed deployments when the engineers had no access to the spatial frequency data, and in 92.0% of test cases (n=64) when engineers had full access to these data. These results thus confirmed the hypothesis.
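
    A minimal sketch of Pure Random Search for this task: repeatedly draw random sensor placements over a grid of inhabitant spatial-frequency data and keep the best-scoring placement. The frequency map, sensor count, coverage radius and iteration budget are hypothetical.

        # Pure Random Search over candidate sensor placements on a spatial-frequency grid.
        import numpy as np

        rng = np.random.default_rng(0)
        freq = rng.random((20, 20))            # stand-in for inhabitant spatial-frequency data
        n_sensors, radius, n_iters = 5, 2.0, 5000

        cells = np.array([(i, j) for i in range(20) for j in range(20)])

        def coverage(sensors):
            """Total spatial frequency within `radius` of any sensor."""
            d = np.linalg.norm(cells[:, None, :] - sensors[None, :, :], axis=2)
            covered = d.min(axis=1) <= radius
            return freq.ravel()[covered].sum()

        best_score, best_placement = -np.inf, None
        for _ in range(n_iters):
            candidate = cells[rng.choice(len(cells), size=n_sensors, replace=False)]
            score = coverage(candidate)
            if score > best_score:
                best_score, best_placement = score, candidate

        print(best_score, best_placement)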

  3. Determining Optimal College Locations

    ERIC Educational Resources Information Center

    Schofer, J. P.

    1975-01-01

    Location can be a critical determinant of the success of a college. Central Place Theory, as developed in geographic studies of population distribution patterns, can provide insights into the problem of evaluating college locations. In this way preferences of students can be balanced against economic, academic, and political considerations.…

  4. Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability

    NASA Astrophysics Data System (ADS)

    Kar, Soummya; Moura, José M. F.

    2011-04-01

    The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering for networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1. the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2. the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying as a random dynamical system the associated switched (random) Riccati equation, the switching being dictated by a non-stationary Markov chain on the network graph.

  5. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    NASA Astrophysics Data System (ADS)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat-tails and high-peaks compared to the Gaussian distribution. In this paper, we will explain how observable statistical distributions in the macroscopic world could be related to the randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian - or very close to Gaussian-distributed subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three and a half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat-tails and high-peaks relative to the Gaussian distribution in stocks and commodity prices and many aspects of the natural world; these instances are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (e.g. they appear in quanta).

  6. Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion

    NASA Astrophysics Data System (ADS)

    Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin

    2018-02-01

    Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.

  7. Random field assessment of nanoscopic inhomogeneity of bone.

    PubMed

    Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu

    2010-12-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
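
    A brief sketch of simulating a random field with an exponential covariance function, shown in 1D for brevity: build the covariance matrix over the sampling locations and draw correlated modulus values through a Cholesky factor. The modulus statistics and correlation length are illustrative, not measured values.

        # 1D random field of elastic moduli with exponential covariance C(d) = sigma^2 * exp(-d/L).
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 10.0, 200)                  # positions across a lamella (micrometres)
        mean_E, sigma_E, corr_len = 20.0, 2.0, 1.5       # GPa, GPa, micrometres (illustrative)

        d = np.abs(x[:, None] - x[None, :])              # pairwise distances
        C = sigma_E**2 * np.exp(-d / corr_len)           # exponential covariance function
        L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))   # jitter for numerical stability

        E_field = mean_E + L @ rng.normal(size=len(x))
        print(E_field[:5])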

  8. Random field assessment of nanoscopic inhomogeneity of bone

    PubMed Central

    Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu

    2010-01-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. PMID:20817128

  9. A statistical evaluation of non-ergodic variogram estimators

    USGS Publications Warehouse

    Curriero, F.C.; Hohn, M.E.; Liebhold, A.M.; Lele, S.R.

    2002-01-01

    Geostatistics is a set of statistical techniques that is increasingly used to characterize spatial dependence in spatially referenced ecological data. A common feature of geostatistics is predicting values at unsampled locations from nearby samples using the kriging algorithm. Modeling spatial dependence in sampled data is necessary before kriging and is usually accomplished with the variogram and its traditional estimator. Other types of estimators, known as non-ergodic estimators, have been used in ecological applications. Non-ergodic estimators were originally suggested as a method of choice when sampled data are preferentially located and exhibit a skewed frequency distribution. Preferentially located samples can occur, for example, when areas with high values are sampled more intensely than other areas. In earlier studies the visual appearance of variograms from traditional and non-ergodic estimators were compared. Here we evaluate the estimators' relative performance in prediction. We also show algebraically that a non-ergodic version of the variogram is equivalent to the traditional variogram estimator. Simulations, designed to investigate the effects of data skewness and preferential sampling on variogram estimation and kriging, showed the traditional variogram estimator outperforms the non-ergodic estimators under these conditions. We also analyzed data on carabid beetle abundance, which exhibited large-scale spatial variability (trend) and a skewed frequency distribution. Detrending data followed by robust estimation of the residual variogram is demonstrated to be a successful alternative to the non-ergodic approach.
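
    The traditional (Matheron) variogram estimator referred to above can be sketched directly: gamma(h) = (1/2N(h)) * sum of squared value differences over pairs separated by roughly h, binned by lag distance. The coordinates and values below are synthetic.

        # Empirical (traditional) semivariogram estimator on synthetic spatial data.
        import numpy as np

        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 100, size=(300, 2))
        values = np.sin(coords[:, 0] / 20.0) + 0.2 * rng.normal(size=300)

        # all pairwise separations and squared value differences
        diff = coords[:, None, :] - coords[None, :, :]
        h = np.linalg.norm(diff, axis=2)
        sq = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)
        h, sq = h[iu], sq[iu]

        bins = np.linspace(0, 50, 11)
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (h >= lo) & (h < hi)
            if mask.any():
                gamma = 0.5 * sq[mask].mean()
                print(f"lag {lo:4.0f}-{hi:4.0f}: gamma = {gamma:.3f}  (pairs: {mask.sum()})")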

  10. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited for improving marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to be integrated into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on 2 scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  11. Learning Probabilities From Random Observables in High Dimensions: The Maximum Entropy Distribution and Others

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi

    2015-11-01

    We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).

  12. Where are the horses? With the sheep or cows? Uncertain host location, vector-feeding preferences and the risk of African horse sickness transmission in Great Britain

    PubMed Central

    Lo Iacono, Giovanni; Robin, Charlotte A.; Newton, J. Richard; Gubbins, Simon; Wood, James L. N.

    2013-01-01

    Understanding the influence of non-susceptible hosts on vector-borne disease transmission is an important epidemiological problem. However, investigation of its impact can be complicated by uncertainty in the location of the hosts. Estimating the risk of transmission of African horse sickness (AHS) in Great Britain (GB), a virus transmitted by Culicoides biting midges, provides an insightful example because: (i) the patterns of risk are expected to be influenced by the presence of non-susceptible vertebrate hosts (cattle and sheep) and (ii) incomplete information on the spatial distribution of horses is available because the GB National Equine Database records owner, rather than horse, locations. Here, we combine land-use data with available horse owner distributions and, using a Bayesian approach, infer a realistic distribution for the location of horses. We estimate the risk of an outbreak of AHS in GB, using the basic reproduction number (R0), and demonstrate that mapping owner addresses as a proxy for horse location significantly underestimates the risk. We clarify the role of non-susceptible vertebrate hosts by showing that the risk of disease in the presence of many hosts (susceptible and non-susceptible) can be ultimately reduced to two fundamental factors: first, the abundance of vectors and how this depends on host density, and, second, the differential feeding preference of vectors among animal species. PMID:23594817

  13. Evaluation of the locations of Kentucky's traffic crash data.

    DOT National Transportation Integrated Search

    2010-11-01

    An evaluation of a random sample of crashes from 2009 was performed to assess the current accuracy of the crash data's location information.The location of the crash was compared to the presumed location using several report data elements such as nea...

  14. Oral Health-Related Quality of Life in Edentulous Patients with Two- vs Four-Locator-Retained Mandibular Overdentures: A Prospective, Randomized, Crossover Study.

    PubMed

    Karbach, Julia; Hartmann, Sinsa; Jahn-Eimermacher, Antje; Wagner, Wilfried

    2015-01-01

    To compare the oral health-related quality of life (OHRQoL) in a prospective, randomized crossover trial in patients with mandibular overdentures retained with two or four locators. In 30 patients with edentulous mandibles, four implants (ICX-plus implants [Medentis Medical]) were placed in the intraforaminal area. Eight weeks after transgingival healing, patients were randomly assigned to have two or four implants incorporated in the prosthesis. After 3 months, the retention concepts were switched. The patients with a two-implant-supported overdenture had four implants incorporated, whereas patients with a four-implant-supported overdenture had two retention locators taken out. After 3 more months, all four implants were retained in the implant-supported overdenture in every patient. To measure OHRQoL of the patients, the Oral Health Impact Profile 14, German version (OHIP-14 G), was used. A considerable increase in OHRQoL could be seen in all patients after the prosthesis was placed on the implants. Also, a statistically significant difference of OHRQoL could be seen in the OHIP-14 G scores between two-implant and four-implant overdentures. Patients had a higher OHRQoL after incorporation of four implants in the overdenture compared with only two implants. Patients with implant-retained overdentures had better OHRQoL compared with those with conventional dentures. The number of incorporated implants in the locator-retained overdenture also influenced the increase in OHRQoL, with four implants having a statistically significant advantage over two implants.

  15. Calculation of momentum distribution function of a non-thermal fermionic dark matter

    NASA Astrophysics Data System (ADS)

    Biswas, Anirban; Gupta, Aritra

    2017-03-01

    The most widely studied scenario in dark matter phenomenology is the thermal WIMP scenario. In spite of numerous efforts to detect WIMPs, we still have no direct evidence for them. A possible explanation for this non-observation of dark matter could be its very feeble interaction strength, which prevents it from thermalising with the rest of the cosmic soup. In other words, the dark matter might be of non-thermal origin, where the relic density is obtained by the so-called freeze-in mechanism. Furthermore, if this non-thermal dark matter is itself produced substantially from the decay of another non-thermal mother particle, then their distribution functions may differ in both size and shape from the usual equilibrium distribution function. In this work, we have studied such a non-thermal (fermionic) dark matter scenario in the light of a new type of U(1)B-L model. The U(1)B-L model is interesting since, besides being anomaly free, it can give rise to neutrino mass by the Type II see-saw mechanism. Moreover, as we will show, it can accommodate a non-thermal fermionic dark matter as well. Starting from the collision terms, we have calculated the momentum distribution function for the dark matter by solving a coupled system of Boltzmann equations. We then used it to calculate the final relic abundance, as well as other relevant physical quantities. We have also compared our result with that obtained from solving the usual Boltzmann (or rate) equations directly in terms of comoving number density, Y. Our findings suggest that the latter approximation is valid only in cases where the system under study is close to equilibrium, and hence should be used with caution.

  16. Calculation of momentum distribution function of a non-thermal fermionic dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biswas, Anirban; Gupta, Aritra, E-mail: anirbanbiswas@hri.res.in, E-mail: aritra@hri.res.in

    The most widely studied scenario in dark matter phenomenology is the thermal WIMP scenario. In spite of numerous efforts to detect WIMPs, we still have no direct evidence for them. A possible explanation for this non-observation of dark matter could be its very feeble interaction strength, which prevents it from thermalising with the rest of the cosmic soup. In other words, the dark matter might be of non-thermal origin, where the relic density is obtained by the so-called freeze-in mechanism. Furthermore, if this non-thermal dark matter is itself produced substantially from the decay of another non-thermal mother particle, then their distribution functions may differ in both size and shape from the usual equilibrium distribution function. In this work, we have studied such a non-thermal (fermionic) dark matter scenario in the light of a new type of U(1)B-L model. The U(1)B-L model is interesting since, besides being anomaly free, it can give rise to neutrino mass by the Type II see-saw mechanism. Moreover, as we will show, it can accommodate a non-thermal fermionic dark matter as well. Starting from the collision terms, we have calculated the momentum distribution function for the dark matter by solving a coupled system of Boltzmann equations. We then used it to calculate the final relic abundance, as well as other relevant physical quantities. We have also compared our result with that obtained from solving the usual Boltzmann (or rate) equations directly in terms of comoving number density, Y. Our findings suggest that the latter approximation is valid only in cases where the system under study is close to equilibrium, and hence should be used with caution.

  17. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    PubMed

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is
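
    The iterative draw-and-adjust procedure can be sketched in a few lines. The example below assumes a single multiplicative bias parameter with an illustrative triangular distribution (not the distributions used in the published analysis) and reproduces only the general workflow: draw a bias value, adjust the log hazard ratio, add back random error, and summarize the simulation interval.

```python
import numpy as np

rng = np.random.default_rng(42)
n_iter = 50_000

# Conventional (hypothetical) result on the log hazard-ratio scale, e.g. HR = 2.6 (0.7-9.4)
log_hr = np.log(2.6)
se_log_hr = (np.log(9.4) - np.log(0.7)) / (2 * 1.96)

# Assign a probability distribution to one bias parameter (here: a multiplicative
# bias acting on the hazard ratio). The distribution is purely illustrative.
bias_factor = rng.triangular(left=1.0, mode=1.3, right=2.0, size=n_iter)

# Each iteration: draw a bias value, adjust the estimate, then add back random error.
adjusted = log_hr - np.log(bias_factor) + rng.normal(0.0, se_log_hr, size=n_iter)

median_hr = np.exp(np.median(adjusted))
lo, hi = np.exp(np.percentile(adjusted, [2.5, 97.5]))
print(f"bias-adjusted HR {median_hr:.2f} (95% simulation interval {lo:.1f}-{hi:.1f})")
```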

  18. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... prepayment complex medical review for that provider or supplier may be extended. However, if the number of... complex medical review. 421.505 Section 421.505 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... § 421.505 Termination and extension of non-random prepayment complex medical review. (a) Timeframe that...

  19. Recourse-based facility-location problems in hybrid uncertain environment.

    PubMed

    Wang, Shuming; Watada, Junzo; Pedrycz, Witold

    2010-08-01

    The objective of this paper is to study facility-location problems in the presence of a hybrid uncertain environment involving both randomness and fuzziness. A two-stage fuzzy-random facility-location model with recourse (FR-FLMR) is developed in which both the demands and costs are assumed to be fuzzy-random variables. The bounds of the optimal objective value of the two-stage FR-FLMR are derived. As, in general, the fuzzy-random parameters of the FR-FLMR can be regarded as continuous fuzzy-random variables with an infinite number of realizations, the computation of the recourse requires solving an infinite number of second-stage programming problems. Owing to this requirement, the recourse function cannot be determined analytically, and, hence, the model cannot benefit from the use of techniques of classical mathematical programming. In order to solve location problems of this nature, we first develop a technique of fuzzy-random simulation to compute the recourse function. The convergence of such simulation scenarios is discussed. In the sequel, we propose a hybrid mutation-based binary ant-colony optimization (MBACO) approach to the two-stage FR-FLMR, which comprises the fuzzy-random simulation and the simplex algorithm. A numerical experiment illustrates the application of the hybrid MBACO algorithm. The comparison shows that the hybrid MBACO finds better solutions than those obtained using other discrete metaheuristic algorithms, such as binary particle-swarm optimization, genetic algorithm, and tabu search.

  20. Infinite non-causality in active cancellation of random noise

    NASA Astrophysics Data System (ADS)

    Friot, Emmanuel

    2006-03-01

    Active cancellation of broadband random noise requires the detection of the incoming noise with some time advance. In a duct, for example, this advance must be larger than the delays in the secondary path from the control source to the error sensor. In this paper it is shown that, in some cases, the advance required for perfect noise cancellation is theoretically infinite because the inverse of the secondary path, which is required for control, can include an infinite non-causal response. This is shown to be the result of two mechanisms: in the single-channel case (one control source and one error sensor), this can arise because of strong echoes in the control path. In the multi-channel case, this can arise even in free field simply because of an unfortunate placing of sensors and actuators. In the present paper optimal feedforward control is derived through analytical and numerical computations, in the time and frequency domains. It is shown that, in practice, the advance required for significant noise attenuation can be much larger than the secondary path delays. Practical rules are also suggested in order to prevent infinite non-causality from appearing.
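
    A minimal numerical illustration of this non-causality, under assumptions chosen purely for demonstration, is sketched below: a toy secondary path with a strong echo has a zero outside the unit circle, so the exact inverse filter obtained in the frequency domain carries most of its energy at negative times.

```python
import numpy as np

fft_len = 4096
# Toy secondary path: direct sound plus a strong echo 10 samples later.
# The echo gain > 1 places a zero outside the unit circle, so the exact inverse is non-causal.
g = np.zeros(64)
g[0], g[10] = 1.0, 1.5

G = np.fft.rfft(g, fft_len)
h_inv = np.fft.irfft(1.0 / G, fft_len)       # two-sided impulse response of 1/G

# Energy in the "negative time" half (wrapped to the end of the buffer) reveals non-causality.
causal_energy = np.sum(h_inv[: fft_len // 2] ** 2)
anticausal_energy = np.sum(h_inv[fft_len // 2:] ** 2)
print(f"anticausal fraction of inverse-filter energy: "
      f"{anticausal_energy / (causal_energy + anticausal_energy):.2f}")
```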

  1. Breaking symmetry in non-planar bifurcations: distribution of flow and wall shear stress.

    PubMed

    Lu, Yiling; Lu, Xiyun; Zhuang, Lixian; Wang, Wen

    2002-01-01

    Non-planarity in blood vessels is known to influence arterial flows and wall shear stress. To gain insight, computational fluid dynamics (CFD) has been used to investigate effects of curvature and out-of-plane geometry on the distribution of fluid flows and wall shear stresses in a hypothetical non-planar bifurcation. Three-dimensional Navier-Stokes equations for a steady state Newtonian fluid were solved numerically using a finite element method. Non-planarity in one of the two daughter vessels is found to deflect flow from the inner wall of the vessel to the outer wall and to cause changes in the distribution of wall shear stresses. Results from this study agree with experimental observations and CFD simulations in the literature, and support the view that non-planarity in blood vessels is a factor with important haemodynamic significance and may play a key role in vascular biology and pathophysiology.

  2. Non-contact assessment of melanin distribution via multispectral temporal illumination coding

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.

    2015-03-01

    Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator for melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, which estimates the two-dimensional melanin distribution based on its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally-coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and also to determine the melanin distribution in an observed region based on the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of skin type of individuals. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).

  3. Non-Speech Oro-Motor Exercises in Post-Stroke Dysarthria Intervention: A Randomized Feasibility Trial

    ERIC Educational Resources Information Center

    Mackenzie, C.; Muir, M.; Allen, C.; Jensen, A.

    2014-01-01

    Background: There has been little robust evaluation of the outcome of speech and language therapy (SLT) intervention for post-stroke dysarthria. Non-speech oro-motor exercises (NSOMExs) are a common component of dysarthria intervention. A feasibility study was designed and executed, with participants randomized into two groups, in one of which…

  4. Effects of vibration and shock on the performance of gas-bearing space-power Brayton cycle turbomachinery. Part 3: Sinusoidal and random vibration data reduction and evaluation, and random vibration probability analysis

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.; Chiang, T.; Badgley, R. H.

    1973-01-01

    The random vibration response of a gas bearing rotor support system has been experimentally and analytically investigated in the amplitude and frequency domains. The NASA Brayton Rotating Unit (BRU), a 36,000 rpm, 10 KWe turbogenerator, had previously been subjected in the laboratory to external random vibrations, and the response data recorded on magnetic tape. These data have now been experimentally analyzed for amplitude distribution and frequency content. The results of the power spectral density analysis indicate strong vibration responses for the major rotor-bearing system components at frequencies which correspond closely to their resonant frequencies obtained under periodic vibration testing. The results of amplitude analysis indicate an increasing shift towards non-Gaussian distributions as the input level of external vibrations is raised. Analysis of axial random vibration response of the BRU was performed by using a linear three-mass model. Power spectral densities and the root-mean-square value of thrust bearing surface contact were calculated for specified input random excitation.

  5. Effect of particle size distribution on permeability in the randomly packed porous media

    NASA Astrophysics Data System (ADS)

    Markicevic, Bojan

    2017-11-01

    The question of how porous medium heterogeneity influences the medium permeability remains unresolved, with both increases and decreases in the permeability value reported. A numerical procedure is used to generate a randomly packed porous material consisting of spherical particles. Six different particle size distributions are used, including mono-, bi- and tri-disperse particles, as well as uniform, normal and log-normal particle size distributions, with the maximum to minimum particle size ratio ranging from three to eight for different distributions. In all six cases, the average particle size is kept the same. For all media generated, the stochastic homogeneity is checked from the distributions of the three coordinates of particle centers, where uniform distributions of x-, y- and z-positions are found. The medium surface area remains essentially constant except for the bi-modal distribution, in which the medium area decreases, while no changes in the porosity are observed (around 0.36). The fluid flow is solved in such a domain, and after checking for axial linearity of the pressure, the permeability is calculated from the Darcy law. The permeability comparison reveals that the permeability of the mono-disperse medium is smallest, and the permeability of all poly-disperse samples is less than ten percent higher. For bi-modal particles, the permeability is about a quarter higher compared to the other media, which can be explained by the volumetric contribution of larger particles and the larger passages available for fluid flow.
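
    For reference, the permeability extraction from the Darcy law mentioned above amounts to a one-line formula; the numbers in the sketch below are hypothetical and only illustrate the unit bookkeeping.

```python
def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
    """Permeability from Darcy's law: k = Q * mu * L / (A * dP). SI units throughout."""
    return flow_rate * viscosity * length / (area * pressure_drop)

# Hypothetical numbers for a small packed bed of spheres (purely illustrative)
k = darcy_permeability(flow_rate=1e-8,       # m^3/s
                       viscosity=1e-3,       # Pa*s (water)
                       length=0.01,          # m
                       area=1e-4,            # m^2
                       pressure_drop=100.0)  # Pa
print(f"permeability = {k:.2e} m^2")
```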

  6. Distribution of Orientation Selectivity in Recurrent Networks of Spiking Neurons with Different Random Topologies

    PubMed Central

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity. PMID:25469704

  7. Mean and Fluctuating Force Distribution in a Random Array of Spheres

    NASA Astrophysics Data System (ADS)

    Akiki, Georges; Jackson, Thomas; Balachandar, Sivaramakrishnan

    2015-11-01

    This study presents a numerical study of the force distribution within a cluster of mono-disperse spherical particles. A direct forcing immersed boundary method is used to calculate the forces on individual particles for a volume fraction range of [0.1, 0.4] and a Reynolds number range of [10, 625]. The overall drag is compared to several drag laws found in the literature. As for the fluctuation of the hydrodynamic streamwise force among individual particles, it is shown to have a normal distribution with a standard deviation that varies with the volume fraction only. The standard deviation remains approximately 25% of the mean streamwise force on a single sphere. The force distribution shows a good correlation between the location of two to three nearest upstream and downstream neighbors and the magnitude of the forces. A detailed analysis of the pressure and shear forces contributions calculated on a ghost sphere in the vicinity of a single particle in a uniform flow reveals a mapping of those contributions. The combination of the mapping and number of nearest neighbors leads to a first order correction of the force distribution within a cluster which can be used in Lagrangian-Eulerian techniques. We also explore the possibility of a binary force model that systematically accounts for the effect of the nearest neighbors. This work was supported by the National Science Foundation (NSF OISE-0968313) under Partnership for International Research and Education (PIRE) in Multiphase Flows at the University of Florida.

  8. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
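
    A simple Monte Carlo treatment of the many-domain case can be sketched as follows, with all sizes chosen arbitrarily for illustration: monodisperse circular domains (spherical caps) are placed uniformly at random on a unit sphere subject to a non-overlap constraint, and interdomain correlations are summarized by a histogram of great-circle angles between domain centers.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_domains_on_sphere(n_domains, cap_radius, max_tries=100000):
    """Place non-overlapping circular domains (spherical caps) uniformly on a unit sphere."""
    centers, tries = [], 0
    while len(centers) < n_domains and tries < max_tries:
        tries += 1
        v = rng.normal(size=3)
        v /= np.linalg.norm(v)                       # uniform point on the sphere
        if all(np.arccos(np.clip(v @ c, -1, 1)) > 2 * cap_radius for c in centers):
            centers.append(v)                        # accept only if caps do not overlap
    return np.array(centers)

def pair_angle_histogram(centers, bins=18):
    """Histogram of great-circle angles between domain centers (a crude pair correlation)."""
    angles = [np.arccos(np.clip(centers[i] @ centers[j], -1, 1))
              for i in range(len(centers)) for j in range(i + 1, len(centers))]
    return np.histogram(angles, bins=bins, range=(0, np.pi))

centers = random_domains_on_sphere(n_domains=20, cap_radius=0.15)
counts, edges = pair_angle_histogram(centers)
print(len(centers), "domains placed; pair-angle counts:", counts)
```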

  9. Magneto-transport properties of a random distribution of few-layer graphene patches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iacovella, Fabrice; Mitioglu, Anatolie; Pierre, Mathieu

    In this study, we address the electronic properties of conducting films composed of an array of randomly distributed few-layer graphene patches and investigate their most salient galvanometric features in the moderate and extreme disorder limits. We demonstrate that, in annealed devices, the ambipolar behaviour and the onset of Landau level quantization in high magnetic field constitute robust hallmarks of few-layer graphene films. In the strong disorder limit, however, the magneto-transport properties are best described by a variable-range hopping behaviour. A large negative magneto-conductance is observed at the charge neutrality point, consistent with a localized transport regime.

  10. Time-evolution of grain size distributions in random nucleation and growth crystallization processes

    NASA Astrophysics Data System (ADS)

    Teran, Anthony V.; Bill, Andreas; Bergmann, Ralf B.

    2010-02-01

    We study the time dependence of the grain size distribution N(r,t) during crystallization of a d -dimensional solid. A partial differential equation, including a source term for nuclei and a growth law for grains, is solved analytically for any dimension d . We discuss solutions obtained for processes described by the Kolmogorov-Avrami-Mehl-Johnson model for random nucleation and growth (RNG). Nucleation and growth are set on the same footing, which leads to a time-dependent decay of both effective rates. We analyze in detail how model parameters, the dimensionality of the crystallization process, and time influence the shape of the distribution. The calculations show that the dynamics of the effective nucleation and effective growth rates play an essential role in determining the final form of the distribution obtained at full crystallization. We demonstrate that for one class of nucleation and growth rates, the distribution evolves in time into the logarithmic-normal (lognormal) form discussed earlier by Bergmann and Bill [J. Cryst. Growth 310, 3135 (2008)]. We also obtain an analytical expression for the finite maximal grain size at all times. The theory allows for the description of a variety of RNG crystallization processes in thin films and bulk materials. Expressions useful for experimental data analysis are presented for the grain size distribution and the moments in terms of fundamental and measurable parameters of the model.

  11. Broadband diffuse terahertz wave scattering by flexible metasurface with randomized phase distribution.

    PubMed

    Zhang, Yin; Liang, Lanju; Yang, Jing; Feng, Yijun; Zhu, Bo; Zhao, Junming; Jiang, Tian; Jin, Biaobing; Liu, Weiwei

    2016-05-26

    Suppressing specular electromagnetic wave reflection or backward radar cross section is important and of broad interests in practical electromagnetic engineering. Here, we present a scheme to achieve broadband backward scattering reduction through diffuse terahertz wave reflection by a flexible metasurface. The diffuse scattering of terahertz wave is caused by the randomized reflection phase distribution on the metasurface, which consists of meta-particles of differently sized metallic patches arranged on top of a grounded polyimide substrate simply through a certain computer generated pseudorandom sequence. Both numerical simulations and experimental results demonstrate the ultralow specular reflection over a broad frequency band and wide angle of incidence due to the re-distribution of the incident energy into various directions. The diffuse scattering property is also polarization insensitive and can be well preserved when the flexible metasurface is conformably wrapped on a curved reflective object. The proposed design opens up a new route for specular reflection suppression, and may be applicable in stealth and other technology in the terahertz spectrum.
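
    The effect of a randomized reflection-phase distribution on the specular peak can be illustrated with a one-dimensional array-factor toy model, which is not the paper's unit-cell design; the element count, oversampling factor, and 0/π phase coding below are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(7)
n_elem, oversample = 64, 16

def far_field_power(phases):
    """Normalized array factor of a uniform linear array with the given reflection phases."""
    field = np.fft.fftshift(np.fft.fft(np.exp(1j * phases), n_elem * oversample))
    return np.abs(field) ** 2 / n_elem ** 2

uniform = far_field_power(np.zeros(n_elem))                       # plain metallic plate
coded = far_field_power(rng.choice([0.0, np.pi], size=n_elem))    # pseudorandom 0/pi coding

print(f"specular peak, uniform phase : {10 * np.log10(uniform.max()):6.1f} dB")
print(f"specular peak, random coding : {10 * np.log10(coded.max()):6.1f} dB")
```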

  12. 3D vector distribution of the electro-magnetic fields on a random gold film

    NASA Astrophysics Data System (ADS)

    Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier

    2018-05-01

    The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.

  13. E-learning in pediatric basic life support: a randomized controlled non-inferiority study.

    PubMed

    Krogh, Lise Qvirin; Bjørnshave, Katrine; Vestergaard, Lone Due; Sharma, Maja Bendtsen; Rasmussen, Stinne Eika; Nielsen, Henrik Vendelbo; Thim, Troels; Løfgren, Bo

    2015-05-01

    Dissemination of pediatric basic life support (PBLS) skills is recommended. E-learning is accessible and cost-effective, but it is currently unknown whether laypersons can learn PBLS through e-learning. The aim of this study was to investigate whether e-learning of PBLS is non-inferior to instructor-led training. Participants were recruited among child-minders and parents of children aged 0-6 years. Participants were randomized to either 2-h instructor-led training or e-learning using an e-learning program (duration 17 min) including an inflatable manikin. After training, participants were assessed in a simulated pediatric cardiac arrest scenario. Tests were video recorded and PBLS skills were assessed independently by two assessors blinded to training method. Primary outcome was the pass rate of the PBLS test (≥8 of 15 skills adequately performed) with a pre-specified non-inferiority margin of 20%. In total 160 participants were randomized 1:1. E-learning was non-inferior to instructor-led training (difference in pass rate -4%; 95% CI: -9% to 0.5%). Pass rates were 100% among instructor-led trained participants (n=67) and 96% among e-learners (n=71). E-learners' median time spent on the e-learning program was 30 min (range: 15-120 min) and the median number of log-ons was 2 (range: 1-5). After the study, all participants felt that their skills had improved. E-learning PBLS is non-inferior to instructor-led training among child-minders and parents with children aged 0-6 years, although the pass rate was 4% lower with e-learning (95% CI: -9% to 0.5%). Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
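
    The non-inferiority comparison of pass rates can be sketched with a simple Wald-style confidence interval; the counts below are illustrative values consistent with the reported rates rather than the trial's exact data, and the actual analysis may have used a different interval method.

```python
import math

def non_inferiority_check(pass_exp, n_exp, pass_ctrl, n_ctrl, margin=-0.20, z=1.96):
    """Wald-style CI for the difference in pass rates (experimental - control).

    Non-inferiority is declared if the lower CI bound stays above the margin.
    """
    p1, p2 = pass_exp / n_exp, pass_ctrl / n_ctrl
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n_exp + p2 * (1 - p2) / n_ctrl)
    lo, hi = diff - z * se, diff + z * se
    return diff, (lo, hi), lo > margin

# Illustrative counts consistent with ~96% (e-learning) vs 100% (instructor-led) pass rates
diff, ci, ok = non_inferiority_check(pass_exp=68, n_exp=71, pass_ctrl=67, n_ctrl=67)
print(f"difference {diff:.1%}, 95% CI ({ci[0]:.1%}, {ci[1]:.1%}), non-inferior: {ok}")
```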

  14. Assessing the quality of a non-randomized pragmatic trial for primary prevention of falls among older adults.

    PubMed

    Albert, Steven M; Edelstein, Offer; King, Jennifer; Flatt, Jason; Lin, Chyongchiou J; Boudreau, Robert; Newman, Anne B

    2015-01-01

    Current approaches to falls prevention mostly rely on secondary and tertiary prevention and target individuals at high risk of falls. An alternative is primary prevention, in which all seniors are screened, referred as appropriate, and educated regarding falls risk. Little information is available on research designs that allow investigation of this approach in the setting of aging services delivery, where randomization may not be possible. Healthy Steps for Older Adults, a statewide program of the Pennsylvania (PA) Department of Aging, involves a combination of education about falls and screening for balance problems, with referral to personal physicians and home safety assessments. We developed a non-randomized statewide trial, Falls Free PA, to assess its effectiveness in reducing falls incidence over 12 months. We recruited 814 seniors who completed the program (503 first-time participants, 311 people repeating the program) and 1,020 who did not participate in the program, from the same sites. We assessed the quality of this non-randomized design by examining recruitment, follow-up across study groups, and comparability at baseline. Of older adults approached in senior centers, 90.5 % (n = 2,219) signed informed consent, and 1,834 (82.4 %) completed baseline assessments and were eligible for follow-up. Attrition in the three groups over 12 months was low and non-differential (<10 % for withdrawal and <2 % for other loss to follow-up). Median follow-up, which involved standardized monthly assessment of falls, was 10 months in all study groups. At baseline, the groups did not differ in measures of health or falls risk factors. Comparable status at baseline, recruitment from common sites, and similar experience with retention suggest that the non-randomized design will be effective for assessment of this approach to primary prevention of falls.

  15. Numerical Modeling Describing the Effects of Heterogeneous Distributions of Asperities on the Quasi-static Evolution of Frictional Slip

    NASA Astrophysics Data System (ADS)

    Selvadurai, P. A.; Parker, J. M.; Glaser, S. D.

    2017-12-01

    A better understanding of how slip accumulates along faults and its relation to the breakdown of shear stress is beneficial to many engineering disciplines, such as, hydraulic fracture and understanding induced seismicity (among others). Asperities forming along a preexisting fault resist the relative motion of the two sides of the interface and occur due to the interaction of the surface topographies. Here, we employ a finite element model to simulate circular partial slip asperities along a nominally flat frictional interface. Shear behavior of our partial slip asperity model closely matched the theory described by Cattaneo. The asperity model was employed to simulate a small section of an experimental fault formed between two bodies of polymethyl methacrylate, which consisted of multiple asperities whose location and sizes were directly measured using a pressure sensitive film. The quasi-static shear behavior of the interface was modeled for cyclical loading conditions, and the frictional dissipation (hysteresis) was normal stress dependent. We further our understanding by synthetically modeling lognormal size distributions of asperities that were randomly distributed in space. Synthetic distributions conserved the real contact area and aspects of the size distributions from the experimental case, allowing us to compare the constitutive behaviors based solely on spacing effects. Traction-slip behavior of the experimental interface appears to be considerably affected by spatial clustering of asperities that was not present in the randomly spaced, synthetic asperity distributions. Estimates of bulk interfacial shear stiffness were determined from the constitutive traction-slip behavior and were comparable to the theoretical estimates of multi-contact interfaces with non-interacting asperities.

  16. Canadian Phase III Randomized Trial of Stereotactic Body Radiotherapy Versus Conventionally Hypofractionated Radiotherapy for Stage I, Medically Inoperable Non-Small-Cell Lung Cancer - Rationale and Protocol Design for the Ontario Clinical Oncology Group (OCOG)-LUSTRE Trial.

    PubMed

    Swaminath, Anand; Wierzbicki, Marcin; Parpia, Sameer; Wright, James R; Tsakiridis, Theodoros K; Okawara, Gordon S; Kundapur, Vijayananda; Bujold, Alexis; Ahmed, Naseer; Hirmiz, Khalid; Kurien, Elizabeth; Filion, Edith; Gabos, Zsolt; Faria, Sergio; Louie, Alexander V; Owen, Timothy; Wai, Elaine; Ramchandar, Kevin; Chan, Elisa K; Julian, Jim; Cline, Kathryn; Whelan, Timothy J

    2017-03-01

    We describe a Canadian phase III randomized controlled trial of stereotactic body radiotherapy (SBRT) versus conventionally hypofractionated radiotherapy (CRT) for the treatment of stage I medically inoperable non-small-cell lung cancer (OCOG-LUSTRE Trial). Eligible patients are randomized in a 2:1 fashion to either SBRT (48 Gy in 4 fractions for peripherally located lesions; 60 Gy in 8 fractions for centrally located lesions) or CRT (60 Gy in 15 fractions). The primary outcome of the study is 3-year local control, which we hypothesize will improve from 75% with CRT to 87.5% with SBRT. With 85% power to detect a difference of this magnitude (hazard ratio = 0.46), a 2-sided α = 0.05 and a 2:1 randomization, we require a sample size of 324 patients (216 SBRT, 108 CRT). Important secondary outcomes include overall survival, disease-free survival, toxicity, radiation-related treatment death, quality of life, and cost-effectiveness. A robust radiation therapy quality assurance program has been established to assure consistent and high quality SBRT and CRT delivery. Despite widespread interest and adoption of SBRT, there still remains a concern regarding long-term control and risks of toxicity (particularly in patients with centrally located lesions). The OCOG-LUSTRE study is the only randomized phase III trial testing SBRT in a medically inoperable population, and the results of this trial will attempt to prove that the benefits of SBRT outweigh the potential risks. Copyright © 2016 Elsevier Inc. All rights reserved.
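
    The relationship between the stated power, allocation ratio, and hazard ratio can be illustrated with a Schoenfeld-style events calculation; this sketch does not model the trial's accrual, follow-up, or censoring assumptions, so it reproduces the implied hazard ratio but not the exact 324-patient figure.

```python
import math
from statistics import NormalDist

alpha, power, k = 0.05, 0.85, 2.0              # two-sided alpha, power, 2:1 allocation
hr = math.log(0.875) / math.log(0.75)          # HR implied by 87.5% vs 75% 3-year local control
z = NormalDist()
z_a, z_b = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)

# Schoenfeld formula for the number of events required by a log-rank comparison
events = (z_a + z_b) ** 2 * (1 + k) ** 2 / (k * math.log(hr) ** 2)
print(f"implied hazard ratio {hr:.2f}; required local-failure events ~ {events:.0f}")
```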

  17. [Location selection for Shenyang urban parks based on GIS and multi-objective location allocation model].

    PubMed

    Zhou, Yuan; Shi, Tie-Mao; Hu, Yuan-Man; Gao, Chang; Liu, Miao; Song, Lin-Qi

    2011-12-01

    Based on geographic information system (GIS) technology and a multi-objective location-allocation (LA) model, and considering four relatively independent objective factors (population density level, air pollution level, urban heat island effect level, and urban land use pattern), an optimized location selection for urban parks within the Third Ring of Shenyang was conducted, and the selection results were compared with the spatial distribution of existing parks in order to evaluate the rationality of the spatial distribution of urban green spaces. In the location selection of urban green spaces in the study area, the factor air pollution was the most important, and, compared with a single objective factor, the weighted analysis of multiple objective factors could provide an optimized spatial location selection for new urban green spaces. The combination of GIS technology with the LA model provides a new approach for the spatial optimization of urban green spaces.

  18. Examining drivers' eye glance patterns during distracted driving: Insights from scanning randomness and glance transition matrix.

    PubMed

    Wang, Yuan; Bao, Shan; Du, Wenjun; Ye, Zhirui; Sayer, James R

    2017-12-01

    Visual attention to the driving environment is of great importance for road safety. Eye glance behavior has been used as an indicator of distracted driving. This study examined and quantified drivers' glance patterns and features during distracted driving. Data from an existing naturalistic driving study were used. Entropy rate was calculated and used to assess the randomness associated with drivers' scanning patterns. A glance-transition proportion matrix was defined to quantify visual search patterns transitioning among four main eye glance locations while driving (i.e., forward on-road, phone, mirrors and others). All measurements were calculated within a 5s time window under both cell phone and non-cell phone use conditions. Results of the glance data analyses showed different patterns between distracted and non-distracted driving, featured by a higher entropy rate value and highly biased attention transferring between forward and phone locations during distracted driving. Drivers in general had a higher number of glance transitions, and their on-road glance duration was significantly shorter during distracted driving when compared to non-distracted driving. Results suggest that drivers have a higher scanning randomness/disorder level and shift their main attention from surrounding areas towards the phone area when engaging in visual-manual tasks. Drivers' visual search patterns during visual-manual distraction with a high scanning randomness and a high proportion of eye glance transitions towards the location of the phone provide insight into driver distraction detection. This will help to inform the design of in-vehicle human-machine interface/systems. Copyright © 2017. Published by Elsevier Ltd.
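
    The two quantities named above, the entropy rate and the glance-transition proportion matrix, are straightforward to compute once the transition proportions are tabulated. The sketch below uses a hypothetical 4x4 matrix over the four glance locations, not values from the study.

```python
import numpy as np

# Hypothetical glance-transition proportion matrix (rows sum to 1):
# states are forward on-road, phone, mirrors, other. Values are illustrative only.
P = np.array([[0.70, 0.20, 0.05, 0.05],
              [0.60, 0.30, 0.05, 0.05],
              [0.80, 0.05, 0.10, 0.05],
              [0.75, 0.05, 0.05, 0.15]])

# Stationary distribution: left eigenvector of P associated with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Entropy rate of the Markov chain: H = -sum_i pi_i sum_j P_ij log2 P_ij.
with np.errstate(divide="ignore", invalid="ignore"):
    logP = np.where(P > 0, np.log2(P), 0.0)
H = -np.sum(pi[:, None] * P * logP)
print(f"entropy rate = {H:.3f} bits per glance transition")
```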

  19. Occurrence and distribution of extractable and non-extractable GDGTs in podzols: implications for the reconstruction of mean air temperature

    NASA Astrophysics Data System (ADS)

    Huguet, Arnaud; Fosse, Céline; Metzger, Pierre; Derenne, Sylvie

    2010-05-01

    Glycerol dialkyl glycerol tetraethers (GDGTs) are complex lipids of high molecular weight, present in cell membranes of archaea and some bacteria. Archaeal membranes are formed predominantly by isoprenoid GDGTs with acyclic or ring-containing biphytanyl chains. Another type of GDGTs with branched instead of isoprenoid alkyl chains was recently discovered in soils. Branched tetraethers were suggested to be produced by anaerobic bacteria and can be used to reconstruct past air temperature and soil pH. Lipids preserved in soils can take two broad chemical forms: extractable lipids, recoverable upon solvent extraction, and non-extractable lipids, linked to the organic or mineral matrix of soils. Moreover, within the extractable pool, core (i.e. "free") lipids and intact polar (i.e. "bound") lipids can be distinguished. These three lipid fractions may respond to environmental changes in different ways and the information derived from these three pools may differ. The aim of the present work was therefore to compare the abundance and distribution of the three GDGT pools in two contrasted podzols: a temperate podzol located 40 km north of Paris and a tropical podzol from the upper Amazon Basin. Five samples were collected from the whole profile of the temperate podzol including the litter layer. Five additional samples were obtained from three profiles of the tropical soil sequence, representative of the transition between a latosol and a well-developed podzol. Vertical and/or lateral variations in GDGT content and composition were highlighted. In particular, in the tropical sequence, GDGTs were present at relatively low concentrations in the early stages of podzolisation and were more abundant in the well-developed podzolic horizons, where higher acidity and increased bacterial activity may favour their stabilization. Concerning the temperate podzol, GDGT distribution was shown to vary greatly with depth in the soil profile, the methylation degree of bacterial GDGTs

  20. Sound Source Localization Using Non-Conformal Surface Sound Field Transformation Based on Spherical Harmonic Wave Decomposition

    PubMed Central

    Zhang, Lanyue; Ding, Dandan; Yang, Desen; Wang, Jia; Shi, Jie

    2017-01-01

    Spherical microphone arrays have been paid increasing attention for their ability to locate a sound source with arbitrary incident angle in three-dimensional space. Low-frequency sound sources are usually located by using spherical near-field acoustic holography. The reconstruction surface and holography surface are conformal surfaces in the conventional sound field transformation based on generalized Fourier transform. When the sound source is on the cylindrical surface, it is difficult to locate by using spherical surface conformal transform. The non-conformal sound field transformation by making a transfer matrix based on spherical harmonic wave decomposition is proposed in this paper, which can achieve the transformation of a spherical surface into a cylindrical surface by using spherical array data. The theoretical expressions of the proposed method are deduced, and the performance of the method is simulated. Moreover, the experiment of sound source localization by using a spherical array with randomly and uniformly distributed elements is carried out. Results show that the non-conformal surface sound field transformation from a spherical surface to a cylindrical surface is realized by using the proposed method. The localization deviation is around 0.01 m, and the resolution is around 0.3 m. The application of the spherical array is extended, and the localization ability of the spherical array is improved. PMID:28489065

  1. Mechanics of the Compression Wood Response: II. On the Location, Action, and Distribution of Compression Wood Formation.

    PubMed

    Archer, R R; Wilson, B F

    1973-04-01

    A new method for simulation of cross-sectional growth provided detailed information on the location of normal wood and compression wood increments in two tilted white pine (Pinus strobus L.) leaders. These data were combined with data on stiffness, slope, and curvature changes over a 16-week period to make the mechanical analysis. The location of compression wood changed from the under side to a flank side and then to the upper side of the leader as the geotropic stimulus decreased, owing to compression wood action. Its location shifted back to a flank side when the direction of movement of the leader reversed. A model for this action, based on elongation strains, was developed and predicted the observed curvature changes with elongation strains of 0.3 to 0.5%, or a maximal compressive stress of 60 to 300 kilograms per square centimeter. After tilting, new wood formation was distributed so as to maintain consistent strain levels along the leaders in bending under gravitational loads. The computed effective elastic moduli were about the same for the two leaders throughout the season.

  2. TYPE Ia SUPERNOVA COLORS AND EJECTA VELOCITIES: HIERARCHICAL BAYESIAN REGRESSION WITH NON-GAUSSIAN DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandel, Kaisey S.; Kirshner, Robert P.; Foley, Ryan J., E-mail: kmandel@cfa.harvard.edu

    2014-12-20

    We investigate the statistical dependence of the peak intrinsic colors of Type Ia supernovae (SNe Ia) on their expansion velocities at maximum light, measured from the Si II λ6355 spectral feature. We construct a new hierarchical Bayesian regression model, accounting for the random effects of intrinsic scatter, measurement error, and reddening by host galaxy dust, and implement a Gibbs sampler and deviance information criteria to estimate the correlation. The method is applied to the apparent colors from BVRI light curves and Si II velocity data for 79 nearby SNe Ia. The apparent color distributions of high-velocity (HV) and normal velocity (NV) supernovae exhibit significant discrepancies for B – V and B – R, but not other colors. Hence, they are likely due to intrinsic color differences originating in the B band, rather than dust reddening. The mean intrinsic B – V and B – R color differences between HV and NV groups are 0.06 ± 0.02 and 0.09 ± 0.02 mag, respectively. A linear model finds significant slopes of –0.021 ± 0.006 and –0.030 ± 0.009 mag (10^3 km s^-1)^-1 for intrinsic B – V and B – R colors versus velocity, respectively. Because the ejecta velocity distribution is skewed toward high velocities, these effects imply non-Gaussian intrinsic color distributions with skewness up to +0.3. Accounting for the intrinsic-color-velocity correlation results in corrections to A_V extinction estimates as large as –0.12 mag for HV SNe Ia and +0.06 mag for NV events. Velocity measurements from SN Ia spectra have the potential to diminish systematic errors from the confounding of intrinsic colors and dust reddening affecting supernova distances.

  3. Leading non-Gaussian corrections for diffusion orientation distribution function.

    PubMed

    Jensen, Jens H; Helpern, Joseph A; Tabesh, Ali

    2014-02-01

    An analytical representation of the leading non-Gaussian corrections for a class of diffusion orientation distribution functions (dODFs) is presented. This formula is constructed from the diffusion and diffusional kurtosis tensors, both of which may be estimated with diffusional kurtosis imaging (DKI). By incorporating model-independent non-Gaussian diffusion effects, it improves on the Gaussian approximation used in diffusion tensor imaging (DTI). This analytical representation therefore provides a natural foundation for DKI-based white matter fiber tractography, which has potential advantages over conventional DTI-based fiber tractography in generating more accurate predictions for the orientations of fiber bundles and in being able to directly resolve intra-voxel fiber crossings. The formula is illustrated with numerical simulations for a two-compartment model of fiber crossings and for human brain data. These results indicate that the inclusion of the leading non-Gaussian corrections can significantly affect fiber tractography in white matter regions, such as the centrum semiovale, where fiber crossings are common. 2013 John Wiley & Sons, Ltd.

  4. Leading Non-Gaussian Corrections for Diffusion Orientation Distribution Function

    PubMed Central

    Jensen, Jens H.; Helpern, Joseph A.; Tabesh, Ali

    2014-01-01

    An analytical representation of the leading non-Gaussian corrections for a class of diffusion orientation distribution functions (dODFs) is presented. This formula is constructed out of the diffusion and diffusional kurtosis tensors, both of which may be estimated with diffusional kurtosis imaging (DKI). By incorporating model-independent non-Gaussian diffusion effects, it improves upon the Gaussian approximation used in diffusion tensor imaging (DTI). This analytical representation therefore provides a natural foundation for DKI-based white matter fiber tractography, which has potential advantages over conventional DTI-based fiber tractography in generating more accurate predictions for the orientations of fiber bundles and in being able to directly resolve intra-voxel fiber crossings. The formula is illustrated with numerical simulations for a two-compartment model of fiber crossings and for human brain data. These results indicate that the inclusion of the leading non-Gaussian corrections can significantly affect fiber tractography in white matter regions, such as the centrum semiovale, where fiber crossings are common. PMID:24738143

  5. Ant-inspired density estimation via random walks

    PubMed Central

    Musco, Cameron; Su, Hsin-Hao

    2017-01-01

    Many ant species use distributed population density estimation in applications ranging from quorum sensing, to task allocation, to appraisal of enemy colony strength. It has been shown that ants estimate local population density by tracking encounter rates: The higher the density, the more often the ants bump into each other. We study distributed density estimation from a theoretical perspective. We prove that a group of anonymous agents randomly walking on a grid are able to estimate their density within a small multiplicative error in few steps by measuring their rates of encounter with other agents. Despite dependencies inherent in the fact that nearby agents may collide repeatedly (and, worse, cannot recognize when this happens), our bound nearly matches what would be required to estimate density by independently sampling grid locations. From a biological perspective, our work helps shed light on how ants and other social insects can obtain relatively accurate density estimates via encounter rates. From a technical perspective, our analysis provides tools for understanding complex dependencies in the collision probabilities of multiple random walks. We bound the strength of these dependencies using local mixing properties of the underlying graph. Our results extend beyond the grid to more general graphs, and we discuss applications to size estimation for social networks, density estimation for robot swarms, and random walk-based sampling for sensor networks. PMID:28928146

  6. Ant-inspired density estimation via random walks.

    PubMed

    Musco, Cameron; Su, Hsin-Hao; Lynch, Nancy A

    2017-10-03

    Many ant species use distributed population density estimation in applications ranging from quorum sensing, to task allocation, to appraisal of enemy colony strength. It has been shown that ants estimate local population density by tracking encounter rates: The higher the density, the more often the ants bump into each other. We study distributed density estimation from a theoretical perspective. We prove that a group of anonymous agents randomly walking on a grid are able to estimate their density within a small multiplicative error in few steps by measuring their rates of encounter with other agents. Despite dependencies inherent in the fact that nearby agents may collide repeatedly (and, worse, cannot recognize when this happens), our bound nearly matches what would be required to estimate density by independently sampling grid locations. From a biological perspective, our work helps shed light on how ants and other social insects can obtain relatively accurate density estimates via encounter rates. From a technical perspective, our analysis provides tools for understanding complex dependencies in the collision probabilities of multiple random walks. We bound the strength of these dependencies using local mixing properties of the underlying graph. Our results extend beyond the grid to more general graphs, and we discuss applications to size estimation for social networks, density estimation for robot swarms, and random walk-based sampling for sensor networks.
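
    The encounter-rate idea is easy to reproduce in simulation. The sketch below, with arbitrary grid size, agent count, and step count, has agents random-walk on a torus and estimate the density from how often they share a cell with another agent.

```python
import numpy as np

rng = np.random.default_rng(3)
grid, n_agents, steps = 50, 250, 400
true_density = n_agents / grid ** 2

# Agents random-walk on a torus; each counts how often it shares a cell with another agent.
pos = rng.integers(0, grid, size=(n_agents, 2))
encounters = np.zeros(n_agents)
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

for _ in range(steps):
    pos = (pos + moves[rng.integers(0, 4, n_agents)]) % grid
    cell_ids = pos[:, 0] * grid + pos[:, 1]
    _, inverse, counts = np.unique(cell_ids, return_inverse=True, return_counts=True)
    encounters += counts[inverse] - 1          # other agents currently in my cell

density_estimates = encounters / steps
print(f"true density {true_density:.3f}, "
      f"median encounter-rate estimate {np.median(density_estimates):.3f}")
```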

  7. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    ERIC Educational Resources Information Center

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  8. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily imprints the fluctuations of the data onto the random samples and weakens the BAO signals if the cosmic variance cannot be ignored. We propose populating the random galaxy samples from a smooth function of redshift that fits the data well. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, such improvements will be valuable for future measurements of galaxy clustering.
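
    The contrast between drawing random-sample redshifts from the data and drawing them from a smoothed n(z) can be sketched as follows; the Gaussian "data" redshifts and the polynomial smoothing used here are stand-ins for the BOSS catalogue and for whatever smooth functional form is adopted in practice.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical "data" redshifts standing in for the BOSS DR12 galaxies.
z_data = rng.normal(0.5, 0.08, size=20000)
z_data = z_data[(z_data > 0.2) & (z_data < 0.75)]
n_random = 10 * z_data.size

# Option A (the common survey practice): draw random-catalogue redshifts from the data itself.
z_rand_shuffled = rng.choice(z_data, size=n_random, replace=True)

# Option B (the smoothed alternative): fit a smooth n(z) and sample from it.
# A low-order polynomial fit to the binned counts is used here as a simple stand-in.
counts, edges = np.histogram(z_data, bins=60)
centers = 0.5 * (edges[:-1] + edges[1:])
smooth = np.clip(np.polyval(np.polyfit(centers, counts, deg=5), centers), 0, None)
z_rand_smooth = rng.choice(centers, size=n_random, replace=True, p=smooth / smooth.sum())

# The shuffled randoms inherit the data's bin-to-bin fluctuations; the smoothed ones do not.
resid_shuffled = np.histogram(z_rand_shuffled, bins=edges)[0] / 10 - smooth
resid_smooth = np.histogram(z_rand_smooth, bins=edges)[0] / 10 - smooth
print("rms deviation from smooth n(z): shuffled %.1f, smoothed %.1f"
      % (np.sqrt(np.mean(resid_shuffled ** 2)), np.sqrt(np.mean(resid_smooth ** 2))))
```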

  9. Non-Intrusive, Distributed Gas Sensing Technology for Advanced Spacesuits

    NASA Technical Reports Server (NTRS)

    Delgado, Jesus; Phillips, Straun; Rubtsov, Vladimir; Chullen, Cinda

    2015-01-01

    Chemical sensors for monitoring gas composition, including oxygen, humidity, carbon dioxide, and trace contaminants are needed to characterize and validate spacesuit design and operating parameters. This paper reports on the first prototypes of a non-intrusive gas sensing technology based on flexible sensitive patches positioned inside spacesuit prototypes and interrogated by optical fibers routed outside the suit, taking advantage of the transparent materials of the suit prototypes. The sensitive patches are based on luminescent materials whose emission parameters vary with the partial pressure of a specific gas. Patches sensitive to carbon dioxide, humidity, oxygen, and ammonia have been developed, and their preliminary characterization in the laboratory using Mark III-like helmet parts is described. The first prototype system consists of a four-channel fiber optic luminescent detector that can be used to monitor any of the selected target gases at four locations. To switch from one gas to another we replace the (disposable) sensor patches and adjust the system settings. Repeatability among sensitive patches and of sensor performance from location to location has been confirmed, assuring that suit engineers will have flexibility in selecting multiple sensing points, fitting the sensor elements into the spacesuit, and easily repositioning the sensor elements as desired. The evaluation of the first prototype for monitoring carbon dioxide during washout studies in a space suit prototype is presented.

  10. Non-Intrusive, Distributed Gas Sensing Technology for Advanced Spacesuits

    NASA Technical Reports Server (NTRS)

    Delgado, Jesus; Phillips, Straun; Rubtsov, Vladimir; Chullen, Cinda

    2015-01-01

    Chemical sensors for monitoring gas composition, including oxygen, humidity, carbon dioxide, and trace contaminants, are needed to characterize and validate spacesuit design and operating parameters. This paper reports on the first prototypes of a non-intrusive gas sensing technology based on flexible sensitive patches positioned inside spacesuit prototypes and interrogated via optical fibers routed outside the suit, taking advantage of the transparent materials of the suit prototypes. The sensitive patches are based on luminescent materials whose emission parameters vary with the partial pressure of a specific gas. Patches sensitive to carbon dioxide, humidity, and temperature have been developed, and their preliminary laboratory characterization in Mark III-like helmet parts is described. The first prototype system consists of a four-channel fiber optic luminescent detector that can be used to monitor any of the selected target gases at four locations. To switch from one gas to another we replace the (disposable) sensor patches and adjust the system settings. Repeatability among sensitive patches and of sensor performance from location to location has been confirmed, assuring that suit engineers will have flexibility in selecting multiple sensing points, fitting the sensor elements into the spacesuit, and easily repositioning the sensor elements as desired. The evaluation of the first prototype for monitoring carbon dioxide during washout studies in a spacesuit prototype is presented.

  11. Robust multivariate nonparametric tests for detection of two-sample location shift in clinical trials

    PubMed Central

    Jiang, Xuejun; Guo, Xu; Zhang, Ning; Wang, Bo

    2018-01-01

    This article presents and investigates performance of a series of robust multivariate nonparametric tests for detection of location shift between two multivariate samples in randomized controlled trials. The tests are built upon robust estimators of distribution locations (medians, Hodges-Lehmann estimators, and an extended U statistic) with both unscaled and scaled versions. The nonparametric tests are robust to outliers and do not assume that the two samples are drawn from multivariate normal distributions. Bootstrap and permutation approaches are introduced for determining the p-values of the proposed test statistics. Simulation studies are conducted and numerical results are reported to examine performance of the proposed statistical tests. The numerical results demonstrate that the robust multivariate nonparametric tests constructed from the Hodges-Lehmann estimators are more efficient than those based on medians and the extended U statistic. The permutation approach can provide a more stringent control of Type I error and is generally more powerful than the bootstrap procedure. The proposed robust nonparametric tests are applied to detect multivariate distributional difference between the intervention and control groups in the Thai Healthy Choices study and examine the intervention effect of a four-session motivational interviewing-based intervention developed in the study to reduce risk behaviors among youth living with HIV. PMID:29672555
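
    As a rough illustration of the kind of procedure described, the sketch below implements an unscaled, component-wise Hodges-Lehmann shift estimator with a permutation p-value; the simulated heavy-tailed data, the Euclidean-norm test statistic, and the permutation count are illustrative assumptions, not the published tests.

    import numpy as np

    def hl_shift(x, y):
        """Component-wise Hodges-Lehmann estimate of the location shift x - y."""
        diffs = x[:, None, :] - y[None, :, :]        # all pairwise differences
        return np.median(diffs.reshape(-1, x.shape[1]), axis=0)

    def perm_test(x, y, n_perm=2000, seed=0):
        rng = np.random.default_rng(seed)
        stat_obs = np.linalg.norm(hl_shift(x, y))    # length of the estimated shift vector
        pooled = np.vstack([x, y])
        n = len(x)
        exceed = 0
        for _ in range(n_perm):
            idx = rng.permutation(len(pooled))
            if np.linalg.norm(hl_shift(pooled[idx[:n]], pooled[idx[n:]])) >= stat_obs:
                exceed += 1
        return stat_obs, (exceed + 1) / (n_perm + 1)

    rng = np.random.default_rng(1)
    control = rng.standard_t(df=3, size=(40, 3))     # heavy-tailed, outlier-prone data
    treated = rng.standard_t(df=3, size=(40, 3)) + [0.5, 0.0, 0.3]
    stat, p = perm_test(control, treated)
    print(f"statistic = {stat:.3f}, permutation p-value = {p:.4f}")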

  12. Minimizing effects of methodological decisions on interpretation and prediction in species distribution studies: An example with background selection

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Talbert, Marian; Morisette, Jeffrey T.; Aldridge, Cameron L.; Brown, Cynthia; Kumar, Sunil; Manier, Daniel; Talbert, Colin; Holcombe, Tracy R.

    2017-01-01

    Evaluating the conditions where a species can persist is an important question in ecology both to understand tolerances of organisms and to predict distributions across landscapes. Presence data combined with background or pseudo-absence locations are commonly used with species distribution modeling to develop these relationships. However, there is not a standard method to generate background or pseudo-absence locations, and method choice affects model outcomes. We evaluated combinations of both model algorithms (simple and complex generalized linear models, multivariate adaptive regression splines, Maxent, boosted regression trees, and random forest) and background methods (random, minimum convex polygon, and continuous and binary kernel density estimator (KDE)) to assess the sensitivity of model outcomes to choices made. We evaluated six questions related to model results, including five beyond the common comparison of model accuracy assessment metrics (biological interpretability of response curves, cross-validation robustness, independent data accuracy and robustness, and prediction consistency). For our case study with cheatgrass in the western US, random forest was least sensitive to background choice and the binary KDE method was least sensitive to model algorithm choice. While this outcome may not hold for other locations or species, the methods we used can be implemented to help determine appropriate methodologies for particular research questions.
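
    Two of the background-selection strategies compared above can be sketched in a few lines. The snippet below uses hypothetical presence coordinates and extent, and does not reproduce the study's actual KDE bandwidths or thresholds; it draws a uniform random background and a KDE-weighted background by rejection sampling.

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)

    # Hypothetical presence coordinates (e.g., occurrence records) in map units.
    presence = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.5], size=(300, 2))

    xmin, ymin = presence.min(axis=0) - 1
    xmax, ymax = presence.max(axis=0) + 1
    n_background = 5000

    # 1) Uniform random background over the bounding extent.
    bg_random = np.column_stack([rng.uniform(xmin, xmax, n_background),
                                 rng.uniform(ymin, ymax, n_background)])

    # 2) KDE-weighted background: propose uniform candidates, then keep them with
    #    probability proportional to the presence-point density (rejection sampling).
    kde = gaussian_kde(presence.T)
    candidates = np.column_stack([rng.uniform(xmin, xmax, 20 * n_background),
                                  rng.uniform(ymin, ymax, 20 * n_background)])
    dens = kde(candidates.T)
    bg_kde = candidates[rng.random(len(candidates)) < dens / dens.max()][:n_background]

    print(len(bg_random), len(bg_kde))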

  13. Vertical Distribution of Radiation Stress for Non-linear Shoaling Waves

    NASA Astrophysics Data System (ADS)

    Webb, B. M.; Slinn, D. N.

    2004-12-01

    The flux of momentum directed shoreward by an incident wave field, commonly referred to as the radiation stress, plays a significant role in nearshore circulation and, therefore, has a profound impact on the transport of pollutants, biota, and sediment in nearshore systems. Having received much attention since the seminal work of Longuet-Higgins and Stewart in the early 1960's, use of the radiation stress concept continues to be refined and evidence of its utility is widespread in literature pertaining to coastal and ocean science. A number of investigations, both numerical and analytical in nature, have used the concept of the radiation stress to derive appropriate forcing mechanisms that initiate cross-shore and longshore circulation, but typically in a depth-averaged sense due to a lack of information concerning the vertical distribution of the wave stresses. While depth-averaged nearshore circulation models are still widely used today, advancements in technology have permitted the adaptation of three-dimensional (3D) modeling techniques to study flow properties of complex nearshore circulation systems. It has been shown that the resulting circulation in these 3D models is very sensitive to the vertical distribution of the nearshore forcing, which have often been implemented as either depth-uniform or depth-linear distributions. Recently, analytical expressions describing the vertical structure of radiation stress components have appeared in the literature (see Mellor, 2003; Xia et al., 2004) but do not fully describe the magnitude and structure in the region bound by the trough and crest of non-linear, propagating waves. Utilizing a three-dimensional, non-linear, numerical model that resolves the time-dependent free surface, we present mean flow properties resulting from a simulation of Visser's (1984, 1991) laboratory experiment on uniform longshore currents. More specifically, we provide information regarding the vertical distribution of radiation stress

  14. Improvement of Characteristics of Clayey Soil Mixed with Randomly Distributed Natural Fibers

    NASA Astrophysics Data System (ADS)

    Maity, J.; Chattopadhyay, B. C.; Mukherjee, S. P.

    2017-11-01

    In subgrade construction for flexible road pavements, the properties of locally available clayey soils can be improved by adding randomly distributed fibers to the soil. The fibers added to subgrade constructions are expected to provide a better interlocking system between the fibers and the soil grains, greater resistance to deformation, and quicker dissipation of pore water pressure, thus aiding consolidation and strengthening. Many natural fibers such as jute, coir, and sabai grass, which are economical and eco-friendly, are grown in abundance in India. If suitable, they can be used as additive materials in the subgrade soil to increase strength and reduce deformability. Such an application would also reduce the cost of road construction by allowing a thinner pavement layer. In this paper, the efficacy of using natural jute, coir, or sabai grass fibers with locally available clayey soil has been studied. A series of Standard Proctor tests, soaked and unsoaked California Bearing Ratio (CBR) tests, and Unconfined Compressive Strength tests were carried out on locally available clayey soil mixed with the different types of natural fiber at various lengths and proportions, to study the improvement in strength properties of fiber-soil composites placed at optimum moisture content. The test results show a substantial increase in CBR value for the clayey soil when mixed with increasing percentages of all three types of randomly distributed natural fibers, up to 2% of the dry weight of soil. The CBR attained its maximum value at a fiber length of 10 mm for all fiber types mixed with the clay used in this study.

  15. IMAGING AND SPECTROSCOPIC OBSERVATIONS OF A TRANSIENT CORONAL LOOP: EVIDENCE FOR THE NON-MAXWELLIAN κ-DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudík, Jaroslav; Mackovjak, Šimon; Dzifčáková, Elena

    2015-07-10

    We report on the Solar Dynamics Observatory/Atmospheric Imaging Assembly (AIA) and Hinode/EUV Imaging Spectrograph (EIS) observations of a transient coronal loop. The loop brightens up in the same location after the disappearance of an arcade formed during a B8.9-class microflare 3 hr earlier. EIS captures this loop during its brightening phase, as observed in most of the AIA filters. We use the AIA data to study the evolution of the loop, as well as to perform the differential emission measure (DEM) diagnostics as a function of κ. The Fe xi–Fe xiii lines observed by EIS are used to perform the diagnostics of electron density and subsequently the diagnostics of κ. Using ratios involving the Fe xi 257.772 Å self-blend, we diagnose κ ≲ 2, i.e., an extremely non-Maxwellian distribution. Using the predicted Fe line intensities derived from the DEMs as a function of κ, we show that, with decreasing κ, all combinations of ratios of line intensities converge to the observed values, confirming the diagnosed κ ≲ 2. These results represent the first positive diagnostics of κ-distributions in the solar corona despite the limitations imposed by calibration uncertainties.

  16. Comparative analysis of ferroelectric domain statistics via nonlinear diffraction in random nonlinear materials.

    PubMed

    Wang, B; Switowski, K; Cojocaru, C; Roppo, V; Sheng, Y; Scalora, M; Kisielewski, J; Pawlak, D; Vilaseca, R; Akhouayri, H; Krolikowski, W; Trull, J

    2018-01-22

    We present an indirect, non-destructive optical method for domain statistic characterization in disordered nonlinear crystals having homogeneous refractive index and spatially random distribution of ferroelectric domains. This method relies on the analysis of the wave-dependent spatial distribution of the second harmonic, in the plane perpendicular to the optical axis in combination with numerical simulations. We apply this technique to the characterization of two different media, Calcium Barium Niobate and Strontium Barium Niobate, with drastically different statistical distributions of ferroelectric domains.

  17. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes, as might be noticed at an imprinted locus. The model draws attention to the role of viability selection of different types in examining the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory is built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals several notable results: Hardy-Weinberg frequencies hold in replicator dynamics, evolution proceeds fastest at maximized variance in fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 allele ratio at a particular gene locus. Through the construction of replicator dynamics in a group selection framework, our selection model redefines the basis of game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also exposes the fact that the number of polymorphic equilibria depends on the algebraic expression of the population structure.

  18. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  19. Variation of mutational burden in healthy human tissues suggests non-random strand segregation and allows measuring somatic mutation rates.

    PubMed

    Werner, Benjamin; Sottoriva, Andrea

    2018-06-01

    The immortal strand hypothesis poses that stem cells could produce differentiated progeny while conserving the original template strand, thus avoiding accumulating somatic mutations. However, quantitating the extent of non-random DNA strand segregation in human stem cells remains difficult in vivo. Here we show that the change of the mean and variance of the mutational burden with age in healthy human tissues allows estimating strand segregation probabilities and somatic mutation rates. We analysed deep sequencing data from healthy human colon, small intestine, liver, skin and brain. We found highly effective non-random DNA strand segregation in all adult tissues (mean strand segregation probability: 0.98, standard error bounds (0.97,0.99)). In contrast, non-random strand segregation efficiency is reduced to 0.87 (0.78,0.88) in neural tissue during early development, suggesting stem cell pool expansions due to symmetric self-renewal. Healthy somatic mutation rates differed across tissue types, ranging from 3.5 × 10-9/bp/division in small intestine to 1.6 × 10-7/bp/division in skin.

  20. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both primordial and non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model cosmic voids size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  1. Cyber-Physical Trade-Offs in Distributed Detection Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Yao, David K. Y.; Chin, J. C.

    2010-01-01

    We consider a network of sensors that measure the scalar intensity due to the background or a source combined with background, inside a two-dimensional monitoring area. The sensor measurements may be random due to the underlying nature of the source and background or due to sensor errors or both. The detection problem is to infer the presence of a source of unknown intensity and location based on sensor measurements. In the conventional approach, detection decisions are made at the individual sensors, which are then combined at the fusion center, for example using the majority rule. At the cost of increased communication and computation, we show that a more complex fusion algorithm based on measurements achieves better detection performance under smooth and non-smooth source intensity functions, Lipschitz conditions on probability ratios, and a minimum packing number for the state space. We show that these conditions for trade-offs between cyber costs and physical detection performance apply to two detection problems: (i) point radiation sources amidst background radiation, and (ii) sources and background with Gaussian distributions.
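
    A minimal Monte-Carlo sketch of the contrast drawn above, between hard-decision majority-rule fusion and centralized fusion of raw measurements, is given below; the Poisson count model, sensor layout, source strength, and thresholds are all hypothetical stand-ins rather than the paper's construction.

    import numpy as np

    rng = np.random.default_rng(7)
    n_sensors, n_trials = 9, 4000
    sensors = rng.uniform(0, 10, size=(n_sensors, 2))   # sensor positions in a 10x10 area
    background = 5.0                                     # mean background counts per sensor
    src_strength, src_pos = 40.0, np.array([4.0, 6.0])   # source assumed known for the LRT

    def mean_counts(source_present):
        d2 = ((sensors - src_pos) ** 2).sum(axis=1)
        return background + source_present * src_strength / (1.0 + d2)

    def trial(source_present):
        counts = rng.poisson(mean_counts(source_present))
        # Local detectors: flag a sensor if its count exceeds a background threshold,
        # then fuse the binary decisions by majority vote.
        local = counts > background + 3 * np.sqrt(background)
        majority = local.sum() > n_sensors // 2
        # Centralized fusion: Poisson log-likelihood ratio computed from the raw counts.
        lam1, lam0 = mean_counts(True), mean_counts(False)
        llr = (counts * np.log(lam1 / lam0) - (lam1 - lam0)).sum()
        return majority, llr > 0.0

    res = np.array([[trial(True), trial(False)] for _ in range(n_trials)], dtype=float)
    print("detection rate   majority = %.3f   LRT = %.3f" % (res[:, 0, 0].mean(), res[:, 0, 1].mean()))
    print("false-alarm rate majority = %.3f   LRT = %.3f" % (res[:, 1, 0].mean(), res[:, 1, 1].mean()))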

  2. Interdependent encoding of pitch, timbre and spatial location in auditory cortex

    PubMed Central

    Bizley, Jennifer K.; Walker, Kerry M. M.; Silverman, Bernard W.; King, Andrew J.; Schnupp, Jan W. H.

    2009-01-01

    Because we can perceive the pitch, timbre and spatial location of a sound source independently, it seems natural to suppose that cortical processing of sounds might separate out spatial from non-spatial attributes. Indeed, recent studies support the existence of anatomically segregated ‘what’ and ‘where’ cortical processing streams. However, few attempts have been made to measure the responses of individual neurons in different cortical fields to sounds that vary simultaneously across spatial and non-spatial dimensions. We recorded responses to artificial vowels presented in virtual acoustic space to investigate the representations of pitch, timbre and sound source azimuth in both core and belt areas of ferret auditory cortex. A variance decomposition technique was used to quantify the way in which altering each parameter changed neural responses. Most units were sensitive to two or more of these stimulus attributes. Whilst indicating that neural encoding of pitch, location and timbre cues is distributed across auditory cortex, significant differences in average neuronal sensitivity were observed across cortical areas and depths, which could form the basis for the segregation of spatial and non-spatial cues at higher cortical levels. Some units exhibited significant non-linear interactions between particular combinations of pitch, timbre and azimuth. These interactions were most pronounced for pitch and timbre and were less commonly observed between spatial and non-spatial attributes. Such non-linearities were most prevalent in primary auditory cortex, although they tended to be small compared with stimulus main effects. PMID:19228960

  3. 47 CFR 54.309 - Calculation and distribution of forward-looking support for non-rural carriers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Calculation and distribution of forward-looking... Areas § 54.309 Calculation and distribution of forward-looking support for non-rural carriers. (a...) Distribution of total support available per state. The total amount of support available per State calculated...

  4. A Randomized Trial Comparing Mail versus In-Office Distribution of the CAHPS Clinician and Group Survey

    PubMed Central

    Anastario, Michael P; Rodriguez, Hector P; Gallagher, Patricia M; Cleary, Paul D; Shaller, Dale; Rogers, William H; Bogen, Karen; Safran, Dana Gelb

    2010-01-01

    Objective To assess the effect of survey distribution protocol (mail versus handout) on data quality and measurement of patient care experiences. Data Sources/Study Setting Multisite randomized trial of survey distribution protocols. Analytic sample included 2,477 patients of 15 clinicians at three practice sites in New York State. Data Collection/Extraction Methods Mail and handout distribution modes were alternated weekly at each site for 6 weeks. Principal Findings Handout protocols yielded an incomplete distribution rate (74 percent) and lower overall response rates (40 percent versus 58 percent) compared with mail. Handout distribution rates decreased over time and resulted in more favorable survey scores compared with mailed surveys. There were significant mode–physician interaction effects, indicating that data cannot simply be pooled and adjusted for mode. Conclusions In-office survey distribution has the potential to bias measurement and comparison of physicians and sites on patient care experiences. Incomplete distribution rates observed in-office, together with between-office differences in distribution rates and declining rates over time suggest staff may be burdened by the process and selective in their choice of patients. Further testing with a larger physician and site sample is important to definitively establish the potential role for in-office distribution in obtaining reliable, valid assessment of patient care experiences. PMID:20579126

  5. Lateral interactions and non-equilibrium in surface kinetics

    NASA Astrophysics Data System (ADS)

    Menzel, Dietrich

    2016-08-01

    Work modelling reactions between surface species frequently uses Langmuir kinetics, assuming that the layer is in internal equilibrium and that the chemical potential of adsorbates corresponds to that of an ideal gas. Coverage dependences of reacting species and of site blocking are usually treated with simple power law coverage dependences (linear in the simplest case), neglecting that lateral interactions are strong in adsorbate and co-adsorbate layers, which may influence kinetics considerably. My research group has in the past investigated many co-adsorbate systems and simple reactions in them. We have collected a number of examples where strong deviations from simple coverage dependences exist, in blocking, promoting, and selecting reactions. Interactions can range from those between next neighbors to larger distances, and can be quite complex. In addition, internal equilibrium in the layer as well as equilibrium distributions over product degrees of freedom can be violated. The latter effect leads to non-equipartition of energy over molecular degrees of freedom (for products) or non-equal response to those of reactants. While such behavior can usually be described by dynamic or kinetic models, the deeper reasons require detailed theoretical analysis. Here, a selection of such cases is reviewed to exemplify these points.

  6. Role of location-dependent transverse wind on root-mean-square bandwidth of temporal light-flux fluctuations in the turbulent atmosphere.

    PubMed

    Chen, Chunyi; Yang, Huamin

    2017-11-01

    The root-mean-square (RMS) bandwidth of temporal light-flux fluctuations is formulated for both plane and spherical waves propagating in the turbulent atmosphere with location-dependent transverse wind. Two path weighting functions characterizing the joint contributions of turbulent eddies and transverse winds at various locations toward the RMS bandwidth are derived. Based on the developed formulations, the roles of variations in both the direction and magnitude of transverse wind velocity with locations over a path on the RMS bandwidth are elucidated. For propagation paths between ground and space, comparisons of the RMS bandwidth computed based on the Bufton wind profile with that calculated by assuming a nominal constant transverse wind velocity are made to exemplify the effect that location dependence of transverse wind velocity has on the RMS bandwidth. Moreover, an expression for the weighted RMS transverse wind velocity has been derived, which can be used as a nominal constant transverse wind velocity over a path for accurately determining the RMS bandwidth.

  7. The Common Good: The Inclusion of Non-Catholic Students in Catholic Schools

    ERIC Educational Resources Information Center

    Donlevy, J. Kent

    2008-01-01

    This paper offers that liberal and communitarian concepts of the common good are exemplified in the Catholic school's policy of the inclusion of non-Catholic students. In particular, the liberal concepts of personal autonomy, individual rights and freedoms, and the principles of fairness, justice, equality and respect for diversity--as democratic…

  8. Precision of EM Simulation Based Wireless Location Estimation in Multi-Sensor Capsule Endoscopy.

    PubMed

    Khan, Umair; Ye, Yunxing; Aisha, Ain-Ul; Swar, Pranay; Pahlavan, Kaveh

    2018-01-01

    In this paper, we compute and examine two-way localization limits for an RF endoscopy pill as it passes through an individual's gastrointestinal (GI) tract. We obtain finite-difference time-domain and finite element method-based simulation results for position assessment employing time of arrival (TOA). By means of a 3-D human body representation from full-wave simulation software and lognormal models for TOA propagation from implant organs to the body surface, we calculate bounds on location estimators in three digestive organs: the stomach, small intestine, and large intestine. We present an investigation of the factors influencing localization precision, including a range of organ properties, peripheral sensor array arrangements, the number of pills in cooperation, and random variations in the transmit power of sensor nodes. We also perform a localization precision investigation for the situation where the transmission signal of the antenna is arbitrary with a known probability distribution. The computational solver outcomes show that the number of receiver antennas on the exterior of the body has a greater impact on localization precision than the number of capsules cooperating within the GI region. The large intestine is influenced the most by the transmitter power probability distribution.
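
    Although the paper derives bounds rather than estimators, the underlying TOA localization step can be sketched as a nonlinear least-squares fit. In the snippet below the receiver geometry, noise level, and Gaussian (rather than lognormal) range errors are assumptions for illustration only.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(3)

    # Hypothetical body-surface receiver positions (metres) and true capsule location.
    receivers = np.array([[0.00, 0.00, 0.00], [0.30, 0.00, 0.10],
                          [0.00, 0.30, 0.20], [0.30, 0.30, 0.00]])
    p_true = np.array([0.15, 0.12, 0.10])

    # TOA measurements converted to ranges, with additive noise (Gaussian here purely
    # for illustration; the paper uses lognormal propagation models).
    ranges = np.linalg.norm(receivers - p_true, axis=1) + rng.normal(0.0, 0.005, len(receivers))

    def residuals(p):
        return np.linalg.norm(receivers - p, axis=1) - ranges

    fit = least_squares(residuals, x0=np.array([0.15, 0.15, 0.15]))
    print("estimated position:", np.round(fit.x, 3),
          " error (m):", round(float(np.linalg.norm(fit.x - p_true)), 4))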

  9. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further we discuss the application of quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  10. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further we discuss the application of quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information. PMID:26823196

  11. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further we discuss the application of quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  12. Nature and nurture in the family physician's choice of practice location.

    PubMed

    Orzanco, Maria Gabriela; Lovato, Chris; Bates, Joanna; Slade, Steve; Grand'Maison, Paul; Vanasse, Alain

    2011-01-01

    An understanding of the contextual, professional, and personal factors that affect choice of practice location for physicians is needed to support successful strategies in addressing geographic maldistribution of physicians. This study compared two categories of predictors of family practice location in non-metropolitan areas among undergraduate medical students: individual characteristics (nature), and the rural program component of their training program (nurture). The study aimed to identify factors that predict the location of practice 2 years post-residency training and determine the predictive value of combining nature and nurture variables using administrative data from two undergraduate medical education programs. Databases were developed from available administrative sources for a retrospective analysis of two undergraduate medical education programs in Canada: Université de Sherbrooke (UdeS) and University of British Columbia (UBC). Both schools have a strong mandate to evaluate the impact of their programs on physician distribution. The dependent variable was location of practice 2 years after completing postgraduate training in family medicine. Independent variables included individual and program characteristics. Separate analyses were conducted for each program using multiple logistic regression. The nature and nurture variables considered in the models explained only 21% to 27% of the variance in the eventual location of practice of family physician graduates. For UdeS, having an address in a rural/small-town environment at application to medical school (OR=2.61, 95% CI: 1.24-6.06) and for UBC, location of high school in a rural/small town (OR=4.03, 95% CI: 1.05-15.41), both increased the chances of practicing in a non-metropolitan area. For UdeS the nurture variable (ie length of clerkship in a non-metropolitan area) was the most significant predictor (OR=1.14, 95% CI: 1.067-1.22). For both medical schools, adding a single nurture variable to the
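
    The reported odds ratios come from multiple logistic regression; a generic sketch of that analysis, with simulated data and hypothetical variable names such as rural_origin and rural_clerkship_weeks rather than the study's actual covariates, is given below.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 400
    df = pd.DataFrame({
        "rural_origin": rng.integers(0, 2, n),          # "nature": rural/small-town background
        "rural_clerkship_weeks": rng.poisson(6, n),     # "nurture": clerkship time outside metro areas
    })
    # Simulate the outcome from an assumed true model, for illustration only.
    logit_p = -1.5 + 0.9 * df["rural_origin"] + 0.12 * df["rural_clerkship_weeks"]
    df["nonmetro_practice"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

    X = sm.add_constant(df[["rural_origin", "rural_clerkship_weeks"]])
    fit = sm.Logit(df["nonmetro_practice"], X).fit(disp=False)

    # Exponentiated coefficients give odds ratios with 95% confidence intervals.
    summary = np.exp(pd.concat([fit.params.rename("OR"), fit.conf_int()], axis=1))
    print(summary.rename(columns={0: "2.5%", 1: "97.5%"}))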

  13. Models for disaster relief shelter location and supply routing.

    DOT National Transportation Integrated Search

    2013-01-01

    This project focuses on the development of a natural disaster response planning model that determines where to locate points of distribution for relief supplies after a disaster occurs. Advance planning (selecting locations for points of distribution...

  14. The Contribution of Non-Representational Theories in Education: Some Affective, Ethical and Political Implications

    ERIC Educational Resources Information Center

    Zembylas, Michalinos

    2017-01-01

    This paper follows recent debates around theorizations of "affect" and its distinction from "emotion" in the context of non-representational theories (NRT) to exemplify how the ontologization of affects creates important openings of ethical and political potential in educators' efforts to make productive interventions in…

  15. Human X-chromosome inactivation pattern distributions fit a model of genetically influenced choice better than models of completely random choice

    PubMed Central

    Renault, Nisa K E; Pritchett, Sonja M; Howell, Robin E; Greer, Wenda L; Sapienza, Carmen; Ørstavik, Karen Helene; Hamilton, David C

    2013-01-01

    In eutherian mammals, one X-chromosome in every XX somatic cell is transcriptionally silenced through the process of X-chromosome inactivation (XCI). Females are thus functional mosaics, where some cells express genes from the paternal X, and the others from the maternal X. The relative abundance of the two cell populations (X-inactivation pattern, XIP) can have significant medical implications for some females. In mice, the ‘choice' of which X to inactivate, maternal or paternal, in each cell of the early embryo is genetically influenced. In humans, the timing of XCI choice and whether choice occurs completely randomly or under a genetic influence is debated. Here, we explore these questions by analysing the distribution of XIPs in large populations of normal females. Models were generated to predict XIP distributions resulting from completely random or genetically influenced choice. Each model describes the discrete primary distribution at the onset of XCI, and the continuous secondary distribution accounting for changes to the XIP as a result of development and ageing. Statistical methods are used to compare models with empirical data from Danish and Utah populations. A rigorous data treatment strategy maximises information content and allows for unbiased use of unphased XIP data. The Anderson–Darling goodness-of-fit statistics and likelihood ratio tests indicate that a model of genetically influenced XCI choice better fits the empirical data than models of completely random choice. PMID:23652377
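
    The contrast between completely random and genetically influenced choice can be caricatured with a small simulation. The sketch below assumes a hypothetical precursor-pool size at the onset of XCI and a Beta-distributed skewing propensity, which are illustrative stand-ins rather than the paper's fitted primary and secondary distributions, and compares simulated XIP samples with an Anderson-Darling k-sample test.

    import numpy as np
    from scipy.stats import anderson_ksamp

    rng = np.random.default_rng(5)
    n_females, N = 2000, 16       # N = assumed number of cells present at the onset of XCI

    # Completely random choice: every cell independently inactivates either X with p = 0.5.
    xip_random = rng.binomial(N, 0.5, n_females) / N

    # Genetically influenced choice: each female carries her own skewing propensity p,
    # drawn here from a Beta distribution as a stand-in for an inherited bias.
    xip_genetic = rng.binomial(N, rng.beta(8, 8, n_females), n_females) / N

    # A mock "observed" population generated with mild genetic influence.
    observed = rng.binomial(N, rng.beta(12, 12, n_females), n_females) / N

    for label, sim in [("random-choice model", xip_random), ("genetic-choice model", xip_genetic)]:
        res = anderson_ksamp([observed, sim])
        print(f"{label}: A2 = {res.statistic:.2f}, approx. p = {res.significance_level:.3f}")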

  16. Formation of randomly distributed nano-tubes, -rods and -plates of n-type and p-type bismuth telluride via molecular legation

    NASA Astrophysics Data System (ADS)

    Ram, Jasa; Ghosal, Partha

    2015-08-01

    Randomly distributed nanotubes, nanorods and nanoplates of Bi0.5Sb1.5Te3 and Bi2Te2.7Se0.3 ternary compounds have been synthesized via a high yield solvo-thermal process. Prior to solvo-thermal heating at 230 °C for crystallization, we ensured molecular legation in a room-temperature reaction by complete reduction of the precursor materials dissolved in ethylene glycol, and confirmed it by replicating Raman spectra of the amorphous and crystalline materials. These nanomaterials have also been characterized using XRD, FE-SEM, EDS and TEM. A possible formation mechanism is also discussed. This single process will enable the development of thermoelectric modules, and the random distribution of diverse morphologies will be beneficial in retaining nano-crystallite sizes.

  17. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  18. Permutational distribution of the log-rank statistic under random censorship with applications to carcinogenicity assays.

    PubMed

    Heimann, G; Neuhaus, G

    1998-03-01

    In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.
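
    A sketch of the permutation approach for the log-rank statistic is shown below: group labels are permuted across subjects and the statistic recomputed to build the reference distribution. The tumour-onset data, censoring pattern, and number of permutations are hypothetical, and this naive label permutation is only one of the possible permutation schemes discussed in this literature.

    import numpy as np

    def logrank_stat(time, event, group):
        """Standard log-rank statistic comparing group == 1 against group == 0."""
        o_minus_e, var = 0.0, 0.0
        for t in np.unique(time[event == 1]):
            at_risk = time >= t
            n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
            d = ((time == t) & (event == 1)).sum()
            d1 = ((time == t) & (event == 1) & (group == 1)).sum()
            o_minus_e += d1 - d * n1 / n
            if n > 1:
                var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
        return o_minus_e / np.sqrt(var)

    def perm_logrank(time, event, group, n_perm=5000, seed=0):
        rng = np.random.default_rng(seed)
        obs = logrank_stat(time, event, group)
        perms = [logrank_stat(time, event, rng.permutation(group)) for _ in range(n_perm)]
        return obs, (np.sum(np.abs(perms) >= abs(obs)) + 1) / (n_perm + 1)

    # Small hypothetical bioassay: few tumours, unequal censoring between groups.
    time  = np.array([12, 20, 33, 45, 60, 70, 72, 80, 84, 90, 95, 99])
    event = np.array([ 1,  1,  0,  1,  0,  0,  1,  0,  1,  0,  0,  0])
    group = np.array([ 1,  1,  1,  0,  1,  1,  0,  0,  0,  0,  1,  0])
    stat, p = perm_logrank(time, event, group)
    print(f"log-rank z = {stat:.3f}, two-sided permutation p = {p:.4f}")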

  19. Location Distribution Optimization of Photographing Sites for Indoor Panorama Modeling

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Wu, J.; Zhang, Y.; Zhang, X.; Xin, Z.; Liu, J.

    2017-09-01

    Panoramic image modeling is generally costly and time-consuming because photographs must be captured continuously along the routes, especially in complicated indoor environments, which hinders wider commercial application of panoramic image modeling. A feasible arrangement of panorama site locations is therefore indispensable, because the locations determine the clarity, coverage, and number of panoramic images obtainable with a given device. This paper proposes a standard procedure for generating the specific locations and total number of panorama sites in indoor panorama modeling. First, the functional relationship between a single panorama site and its objectives is established and then applied to the network of panorama sites. We propose the Distance Clarity functions (FC and Fe), which express the mathematical relationship between clarity and the panorama-to-objective distance or obstacle distance, and the Distance Buffer function (FB), modified from the traditional buffer method, to generate the coverage of a panorama site. Second, every point in the feasible area is traversed as a possible panorama site, and its clarity and coverage are calculated jointly. Finally, as few points as possible are selected so as to satisfy the clarity requirement first and then the coverage requirement. In the experiments, detailed camera lens parameters are given; still, further experimental parameters need to be explored, given that the relationship between clarity and distance is device dependent. In short, through the functions FC, Fe, and FB, the locations of panorama sites can be generated automatically and accurately.
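
    The greedy selection step can be sketched as follows; because the paper's clarity functions (FC, Fe) and buffer function (FB) are device dependent, simple inverse-distance and fixed-radius stand-ins are assumed here, along with hypothetical candidate and target points.

    import numpy as np

    rng = np.random.default_rng(2)
    candidates = rng.uniform(0, 20, size=(60, 2))    # possible panorama-site locations (m)
    targets = rng.uniform(0, 20, size=(40, 2))       # objectives that must be imaged

    CLARITY_MIN, BUFFER_R = 0.25, 6.0                # assumed thresholds

    def clarity(site, pts):
        """Stand-in FC: clarity decays with distance from the panorama site."""
        return 1.0 / (1.0 + np.linalg.norm(pts - site, axis=1))

    def covered(site, pts):
        """Stand-in FB: a target is covered if it lies inside a fixed buffer radius."""
        return np.linalg.norm(pts - site, axis=1) <= BUFFER_R

    selected, clear, cover = [], np.zeros(len(targets), bool), np.zeros(len(targets), bool)
    while not (clear.all() and cover.all()):
        # Prefer the candidate that newly satisfies the clarity requirement for the most
        # targets; break ties by newly covered targets (clarity first, then coverage).
        gains = [((clarity(c, targets) >= CLARITY_MIN) & ~clear).sum() * 1000
                 + (covered(c, targets) & ~cover).sum() for c in candidates]
        best = int(np.argmax(gains))
        if gains[best] == 0:
            break                                    # remaining targets are unreachable
        site = candidates[best]
        clear |= clarity(site, targets) >= CLARITY_MIN
        cover |= covered(site, targets)
        selected.append(best)

    print(f"selected {len(selected)} sites; clarity met: {clear.all()}, coverage met: {cover.all()}")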

  20. Non-random mating and convergence over time for alcohol consumption, smoking, and exercise: the Nord-Trøndelag Health Study.

    PubMed

    Ask, Helga; Rognmo, Kamilla; Torvik, Fartein Ask; Røysamb, Espen; Tambs, Kristian

    2012-05-01

    Spouses tend to have similar lifestyles. We explored the degree to which spouse similarity in alcohol use, smoking, and physical exercise is caused by non-random mating or convergence. We used data collected for the Nord-Trøndelag Health Study from 1984 to 1986 and prospective registry information about when and with whom people entered marriage/cohabitation between 1970 and 2000. Our sample included 19,599 married/cohabitating couples and 1,551 future couples that were to marry/cohabitate in the 14-16 years following data collection. All couples were grouped according to the duration between data collection and entering into marriage/cohabitation. Age-adjusted polychoric spouse correlations were used as the dependent variables in non-linear segmented regression analysis; the independent variable was time. The results indicate that spouse concordance in lifestyle is due to both non-random mating and convergence. Non-random mating appeared to be strongest for smoking. Convergence in alcohol use and smoking was evident during the period prior to marriage/cohabitation, whereas convergence in exercise was evident throughout life. Reduced spouse similarity in smoking with relationship duration may reflect secular trends.