This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel and then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped, and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes who were visited declined to be interviewed, and 16.4% of the homes visited were unoccupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
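The core selection step described above (map all homes, then draw a reproducible random subset for GPS-guided visits) can be sketched as follows; the identifiers and the seed are illustrative, not from the study.

```python
import random

def select_households(home_ids, sample_size, seed=42):
    """Draw a random subset of mapped homes for field surveys.

    A minimal sketch of the randomization step: 537 mapped homes,
    96 drawn without replacement. The seed makes the draw
    reproducible across field teams.
    """
    rng = random.Random(seed)
    return rng.sample(home_ids, sample_size)

homes = [f"home_{i:03d}" for i in range(537)]   # 537 mapped homes
survey_sites = select_households(homes, 96)     # randomized subset of 96
```

In practice each `home_id` would carry the GPS coordinates exported from Google Earth, so the selected list can be loaded directly onto the handheld units.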
SELECTING SITES FOR COMPARISON WITH CREATED WETLANDS
The paper describes the method used for selecting natural wetlands to compare with created wetlands. The results of the selection process and the advantages and disadvantages of the method are discussed. The random site selection method required extensive field work and may have ...
Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling
Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David
2016-01-01
Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
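The localized sampling scheme described above (pick a random center pixel, then measure nearby pixels with distance-dependent probability) can be sketched as below; the exponential decay profile and parameter names are assumptions for illustration.

```python
import numpy as np

def localized_random_measurement(image, scale, rng):
    """One localized measurement: choose a center pixel uniformly at
    random, then include each other pixel with probability decaying
    with its distance from the center. The exponential profile is an
    illustrative choice, not the authors' exact kernel."""
    h, w = image.shape
    cy, cx = int(rng.integers(0, h)), int(rng.integers(0, w))
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    mask = rng.random((h, w)) < np.exp(-dist / scale)
    mask[cy, cx] = True               # the center pixel is always measured
    return mask, float(image[mask].sum())

rng = np.random.default_rng(0)
image = np.arange(100.0).reshape(10, 10)
mask, measurement = localized_random_measurement(image, scale=2.0, rng=rng)
```

A full CS pipeline would stack many such masks into a measurement matrix and reconstruct the image by sparse recovery; only the sampling operator is sketched here.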
Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E
2001-01-01
Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
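The geographic component of the control-selection scheme above (sampling locations in proportion to population density) can be sketched as a weighted random draw; the area names and densities below are illustrative, not Karonga data.

```python
import random

def draw_controls(areas, densities, n, seed=0):
    """Draw control locations with probability proportional to
    population density, so the geographical distribution of controls
    mirrors the population. Age/sex frequency matching would be
    applied on top of this draw and is not sketched here."""
    rng = random.Random(seed)
    return rng.choices(areas, weights=densities, k=n)

areas = ["lakeshore", "hills", "plain"]
densities = [120, 30, 60]          # persons per km^2 (made-up values)
controls = draw_controls(areas, densities, 20)
```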
Rational group decision making: A random field Ising model at T = 0
NASA Astrophysics Data System (ADS)
Galam, Serge
1997-02-01
A modified version of a finite random field Ising ferromagnetic model in an external magnetic field at zero temperature is presented to describe group decision making. Fields may have a non-zero average. A postulate of minimum inter-individual conflict is assumed. Interactions then produce a group polarization along one particular choice, which is, however, randomly selected. A small external social pressure is shown to have a drastic effect on the polarization. Individual biases related to personal backgrounds, cultural values, and past experiences are introduced via quenched local competing fields. They are shown to be instrumental in generating a larger spectrum of collective new choices beyond the initial ones. In particular, compromise is found to result from the existence of individual competing biases. Conflict is shown to weaken group polarization. The model yields new psychosociological insights about consensus and compromise in groups.
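The zero-temperature dynamics of such a model can be sketched with a mean-field update: each individual repeatedly adopts the choice that minimizes its conflict with the group, its quenched personal bias, and the external pressure. This is a sketch in the spirit of the abstract, not Galam's exact formulation.

```python
import numpy as np

def zero_temp_sweep(spins, J, fields, h_ext, rng):
    """One zero-temperature sweep of a mean-field random-field Ising
    model: each individual aligns with its local field, composed of
    the group mean (coupling J), its quenched bias, and the external
    social pressure h_ext."""
    n = len(spins)
    for i in rng.permutation(n):
        local = J * (spins.sum() - spins[i]) / n + fields[i] + h_ext
        spins[i] = 1 if local >= 0 else -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=20)    # initial individual choices
biases = np.zeros(20)                   # no quenched biases in this run
# a strong external social pressure polarizes the whole group
polarized = zero_temp_sweep(spins, J=1.0, fields=biases, h_ext=2.0, rng=rng)
```

With competing (non-zero) quenched biases, some individuals hold out against the majority, which is the mechanism behind compromise in the model.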
ERIC Educational Resources Information Center
Windschitl, Mark; Dvornich, Karen; Ryken, Amy E.; Tudor, Margaret; Koehler, Gary
2007-01-01
Field investigations are not characterized by randomized and manipulated control group experiments; however, most school science and high-stakes tests recognize only this paradigm of investigation. Scientists in astronomy, genetics, field biology, oceanography, geology, and meteorology routinely select naturally occurring events and conditions and…
Open-field behavior of house mice selectively bred for high voluntary wheel-running.
Bronikowski, A M; Carter, P A; Swallow, J G; Girard, I A; Rhodes, J S; Garland, T
2001-05-01
Open-field behavioral assays are commonly used to test both locomotor activity and emotionality in rodents. We performed open-field tests on house mice (Mus domesticus) from four replicate lines genetically selected for high voluntary wheel-running for 22 generations and from four replicate random-bred control lines. Individual mice were recorded by video camera for 3 min in a 1-m2 open-field arena on 2 consecutive days. Mice from selected lines showed no statistical differences from control mice with respect to distance traveled, defecation, time spent in the interior, or average distance from the center of the arena during the trial. Thus, we found little evidence that open-field behavior, as traditionally defined, is genetically correlated with wheel-running behavior. This result is a useful converse test of classical studies that report no increased wheel-running in mice selected for increased open-field activity. However, mice from selected lines turned less in their travel paths than did control-line mice, and females from selected lines had slower travel times (longer latencies) to reach the wall. We discuss these results in the context of the historical open-field test and newly defined measures of open-field activity.
Experimental Evaluation of Field Trips on Instruction in Vocational Agriculture.
ERIC Educational Resources Information Center
McCaslin, Norval L.
To determine the effect of field trips on student achievement in each of four subject matter areas in vocational agriculture, 12 schools offering approved programs were randomly selected and divided into a treatment group and a control group. Uniform teaching outlines and reference materials were provided to each group. While no field trips were…
USDA-ARS?s Scientific Manuscript database
The genetic effects of long term random mating and natural selection aided by genetic male sterility (gms) were evaluated in two soybean [Glycine max (L.) Merr.] populations designated: RSII and RSIII. These populations were evaluated in the field at three locations each with two replications. Genot...
Vegetative propagation of butternut (Juglans cinerea) field results
Paula M. Pijut
2004-01-01
Juglans cinerea L. is a hardwood species valued for its wood and edible nuts. Butternut canker disease (Sirococcus clavigignenti-juglandacearum) threatens its survival. Vegetative propagation will be required to produce clones of genotypes selected for resistance to butternut canker disease. In 2000, 10 trees were randomly selected...
Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria
2017-01-01
Accurate assessment of people’s preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants’ preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants’ subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants’ preferences for different lighting applications that, in the present study, conformed to participants’ ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163
Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine
2017-09-01
According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, and random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
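The stratified method described above can be sketched as a weighted combination of within-stratum sample means; in the real design the strata come from the gene-flow model's auxiliary variable, and all numbers below are illustrative.

```python
import random

def stratified_estimate(strata, per_stratum=50, seed=0):
    """Stratified estimate of the transgene presence rate: sample
    grains at random within each stratum and combine stratum means
    weighted by the stratum's share of the field. `strata` maps a
    stratum name to (area_weight, grain_outcomes), where an outcome
    of 1 marks a transgenic grain."""
    rng = random.Random(seed)
    estimate = 0.0
    for weight, grains in strata.values():
        sample = rng.sample(grains, min(per_stratum, len(grains)))
        estimate += weight * (sum(sample) / len(sample))
    return estimate

strata = {
    "near_gm_field": (0.2, [1] * 2 + [0] * 198),  # higher presence expected
    "interior":      (0.8, [0] * 200),
}
rate = stratified_estimate(strata)
```

Concentrating sampling effort where the model predicts contamination is what lets the stratified estimator reach a given accuracy with fewer grains than simple random sampling.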
NASA Astrophysics Data System (ADS)
Lauterbach, S.; Fina, M.; Wagner, W.
2018-04-01
Since structural engineering requires highly developed and optimized structures, the thickness dependency is one of the most controversially debated topics. This paper deals with stability analysis of lightweight thin structures combined with arbitrary geometrical imperfections. Generally known design guidelines only consider imperfections for simple shapes and loading, whereas for complex structures the lower-bound design philosophy still holds. Herein, uncertainties are considered with an empirical knockdown factor representing a lower bound of existing measurements. To fully understand and predict expected bearable loads, numerical investigations are essential, including geometrical imperfections. These are implemented into a stand-alone program code with a stochastic approach to compute random fields as geometric imperfections that are applied to nodes of the finite element mesh of selected structural examples. The stochastic approach uses the Karhunen-Loève expansion for the random field discretization. For this approach, the so-called correlation length l_c controls the random field in a powerful way. This parameter has a major influence on the buckling shape, and also on the stability load. First, the impact of the correlation length is studied for simple structures. Second, since most structures for engineering devices are more complex and combined structures, these are intensively discussed with the focus on constrained random fields for e.g. flange-web-intersections. Specific constraints for those random fields are pointed out with regard to the finite element model. Further, geometrical imperfections vanish where the structure is supported.
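The Karhunen-Loève discretization step described above can be sketched in one dimension: eigendecompose a covariance kernel on the node grid and combine the leading modes with random coefficients. The squared-exponential kernel below is an assumed choice for illustration, not the paper's exact covariance.

```python
import numpy as np

def kl_random_field(x, corr_len, n_modes, rng):
    """Sample a 1-D zero-mean Gaussian random field by a truncated
    Karhunen-Loeve expansion: eigendecompose a squared-exponential
    covariance on the grid and combine the leading eigenmodes with
    i.i.d. standard-normal coefficients. The correlation length
    corr_len controls the smoothness of the resulting field."""
    cov = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * corr_len ** 2))
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n_modes]       # leading eigenpairs
    xi = rng.standard_normal(n_modes)              # KL coefficients
    return vecs[:, order] @ (np.sqrt(np.clip(vals[order], 0.0, None)) * xi)

x = np.linspace(0.0, 1.0, 50)                      # stand-in node coordinates
imperfection = kl_random_field(x, corr_len=0.1, n_modes=10,
                               rng=np.random.default_rng(0))
```

In the paper's setting such a realization would be applied as a geometric imperfection to the finite element nodes, with additional constraints at intersections and supports.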
ERIC Educational Resources Information Center
Castillo, Jose M.; Curtis, Michael J.; Gelley, Cheryl
2012-01-01
Every 5 years, the National Association of School Psychologists (NASP) conducts a national study of the field. Surveys are sent to randomly selected regular members of NASP to gather information on school psychologists' demographic characteristics, context for professional practices, and professional practices. The latest iteration of the national…
ERIC Educational Resources Information Center
Piper, Martha K.
Thirty-six students enrolled in an elementary science methods course were randomly selected and given an instrument using Osgood's semantic differential approach the first week of class, the sixth week on campus prior to field experiences, and the thirteenth week following field experiences. The elementary teachers who had observed the university…
Volpe, Giorgio; Volpe, Giovanni; Gigan, Sylvain
2014-01-01
The motion of particles in random potentials occurs in several natural phenomena ranging from the mobility of organelles within a biological cell to the diffusion of stars within a galaxy. A Brownian particle moving in the random optical potential associated with a speckle pattern, i.e., a complex interference pattern generated by the scattering of coherent light by a random medium, provides an ideal model system to study such phenomena. Here, we derive a theory for the motion of a Brownian particle in a speckle field and, in particular, we identify its universal characteristic timescale. Based on this theoretical insight, we show how speckle light fields can be used to control the anomalous diffusion of a Brownian particle and to perform some basic optical manipulation tasks such as guiding and sorting. Our results might broaden the perspectives of optical manipulation for real-life applications. PMID:24496461
What Every Public School Physical Educator Should Know about the Hiring Process
ERIC Educational Resources Information Center
Stier, William F., Jr.; Schneider, Robert C.
2007-01-01
A national survey of high school principals was conducted to determine whether they agreed or disagreed with selected practices and procedures used to hire high school physical education teachers. A survey instrument, developed with the help of experts in the field and consisting of 29 items, was sent to 400 randomly selected principals. Useable…
Relationship of field and LiDAR estimates of forest canopy cover with snow accumulation and melt
Mariana Dobre; William J. Elliot; Joan Q. Wu; Timothy E. Link; Brandon Glaza; Theresa B. Jain; Andrew T. Hudak
2012-01-01
At the Priest River Experimental Forest in northern Idaho, USA, snow water equivalent (SWE) was recorded over a period of six years on random, equally-spaced plots in ~4.5 ha small watersheds (n=10). Two watersheds were selected as controls and eight as treatments, with two watersheds randomly assigned per treatment as follows: harvest (2007) followed by mastication (...
Vortex-Core Reversal Dynamics: Towards Vortex Random Access Memory
NASA Astrophysics Data System (ADS)
Kim, Sang-Koog
2011-03-01
An energy-efficient, ultrahigh-density, ultrafast, and nonvolatile solid-state universal memory is a long-held dream in the field of information-storage technology. The magnetic random access memory (MRAM) along with a spin-transfer-torque switching mechanism is a strong candidate for realizing that dream, given its nonvolatility, infinite endurance, and fast random access. Magnetic vortices in patterned soft magnetic dots promise ground-breaking applications in information-storage devices, owing to the very stable twofold ground states of either their upward or downward core magnetization orientation and plausible core switching by in-plane alternating magnetic fields or spin-polarized currents. However, two technologically most important but very challenging issues --- low-power recording and reliable selection of each memory cell with already existing cross-point architectures --- have not yet been resolved for the basic operations in information storage, that is, writing (recording) and readout. Here, we experimentally demonstrate a magnetic vortex random access memory (VRAM) in the basic cross-point architecture. This unique VRAM offers reliable cell selection and low-power-consumption control of switching of out-of-plane core magnetizations using specially designed rotating magnetic fields generated by two orthogonal and unipolar Gaussian-pulse currents along with optimized pulse width and time delay. Our achievement of a new device based on a new material, that is, a medium composed of patterned vortex-state disks, together with the new physics on ultrafast vortex-core switching dynamics, can stimulate further fruitful research on MRAMs that are based on vortex-state dot arrays.
Busi, Roberto; Powles, Stephen B
2016-09-01
Weeds can be a greater constraint to crop production than animal pests and pathogens. Pre-emergence herbicides are crucial in many cropping systems to control weeds that have evolved resistance to selective post-emergence herbicides. In this study we assessed the potential to evolve resistance to the pre-emergence herbicides prosulfocarb + S-metolachlor or pyroxasulfone in 50 individual field Lolium rigidum populations collected in a random survey in Western Australia prior to commercialisation of these pre-emergence herbicides. This study shows for the first time that in randomly collected L. rigidum field populations the selection with either prosulfocarb + S-metolachlor or pyroxasulfone can result in concomitant evolution of resistance to both prosulfocarb + S-metolachlor and pyroxasulfone after three generations. In the major weed L. rigidum, traits conferring resistance to new herbicides can be present before herbicide commercialisation. Proactive and multidisciplinary research (evolutionary ecology, modelling and molecular biology) is required to detect and analyse resistant populations before they can appear in the field. Several studies show that evolved cross-resistance in weeds is complex and often unpredictable. Thus, long-term management of cross-resistant weeds must be achieved through heterogeneity of selection by effective chemical, cultural and physical weed control strategies that can delay herbicide resistance evolution. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Durner, Maximilian; Márton, Zoltán.; Hillenbrand, Ulrich; Ali, Haider; Kleinsteuber, Martin
2017-03-01
In this work, a new ensemble method for the task of category recognition in different environments is presented. The focus is on service robotic perception in an open environment, where the robot's task is to recognize previously unseen objects of predefined categories, based on training on a public dataset. We propose an ensemble learning approach to be able to flexibly combine complementary sources of information (different state-of-the-art descriptors computed on color and depth images), based on a Markov Random Field (MRF). By exploiting its specific characteristics, the MRF ensemble method can also be executed as a Dynamic Classifier Selection (DCS) system. In the experiments, the committee- and topology-dependent performance boost of our ensemble is shown. Despite reduced computational costs and using less information, our strategy performs on the same level as common ensemble approaches. Finally, the impact of large differences between datasets is analyzed.
ERIC Educational Resources Information Center
Papadopoulos, Pantelis M.; Lagkas, Thomas D.; Demetriadis, Stavros N.
2012-01-01
This study provides field research evidence on the efficiency of a "free-selection" peer review assignment protocol as compared to the typically implemented "assigned-pair" protocol. The study employed 54 sophomore students who were randomly assigned into three groups: Assigned-Pair (AP) (the teacher assigns student works for review to student…
ERIC Educational Resources Information Center
Alqahtani, Abdulmuhsen Ayedh
2014-01-01
The current study aims at exploring Kuwaiti families' educational investment behavior pursuant to the selection of a specific private school for their children from the private school market. Using the quantitative approach and the principles of marketing research, a survey was administered to a randomly selected sample of Kuwaiti families (n =…
NASA Technical Reports Server (NTRS)
Buehler, Martin G. (Inventor); Blaes, Brent R. (Inventor); Lieneweg, Udo (Inventor)
1994-01-01
A particle sensor array which in a preferred embodiment comprises a static random access memory having a plurality of ion-sensitive memory cells, each such cell comprising at least one pull-down field effect transistor having a sensitive drain surface area (such as by bloating) and at least one pull-up field effect transistor having a source connected to an offset voltage. The sensitive drain surface area and the offset voltage are selected for memory cell upset by incident ions such as alpha-particles. The static random access memory of the present invention provides a means for selectively biasing the memory cells into the same state in which each of the sensitive drain surface areas is reverse biased and then selectively reducing the reversed bias on these sensitive drain surface areas for increasing the upset sensitivity of the cells to ions. The resulting selectively sensitive memory cells can be used in a number of applications. By way of example, the present invention can be used for measuring the linear energy transfer of ion particles, as well as a device for assessing the resistance of CMOS latches to Cosmic Ray induced single event upsets. The sensor of the present invention can also be used to determine the uniformity of an ion beam.
Visual evoked potentials and selective attention to points in space
NASA Technical Reports Server (NTRS)
Van Voorhis, S.; Hillyard, S. A.
1977-01-01
Visual evoked potentials (VEPs) were recorded to sequences of flashes delivered to the right and left visual fields while subjects responded promptly to designated stimuli in one field at a time (focused attention), in both fields at once (divided attention), or to neither field (passive). Three stimulus schedules were used: the first was a replication of a previous study (Eason, Harter, and White, 1969) where left- and right-field flashes were delivered quasi-independently, while in the other two the flashes were delivered to the two fields in random order (Bernoulli sequence). VEPs to attended-field stimuli were enhanced at both occipital (O2) and central (Cz) recording sites under all stimulus sequences, but different components were affected at the two scalp sites. It was suggested that the VEP at O2 may reflect modality-specific processing events, while the response at Cz, like its auditory homologue, may index more general aspects of selective attention.
Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui
2016-06-01
Electroencephalogram (EEG) signals are used broadly in the medical fields. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's, sleep problems and so on. This paper presents a new method which extracts and selects features from multi-channel EEG signals. This research focuses on three main points. Firstly, the simple random sampling (SRS) technique is used to extract features from the time domain of EEG signals. Secondly, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the selected features are forwarded to a least square support vector machine (LS_SVM) classifier to classify the EEG signals. The experimental results show that the method achieves 99.90%, 99.80% and 100% for classification accuracy, sensitivity and specificity, respectively.
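The SRS extraction stage described above can be sketched as a random draw of time-domain sample points; the signal and parameter values below are illustrative, and the SFS selection and LS_SVM classification stages are not reproduced.

```python
import random

def srs_features(signal, n_features, seed=0):
    """Simple random sampling (SRS) of time-domain values as features:
    draw n_features sample points at random (without replacement) from
    the signal, keeping them in temporal order."""
    rng = random.Random(seed)
    idx = sorted(rng.sample(range(len(signal)), n_features))
    return [signal[i] for i in idx]

eeg_channel = [float(i % 17) for i in range(4096)]   # stand-in EEG samples
features = srs_features(eeg_channel, n_features=64)
```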
Localized surface plasmon enhanced cellular imaging using random metallic structures
NASA Astrophysics Data System (ADS)
Son, Taehwang; Lee, Wonju; Kim, Donghyun
2017-02-01
We have studied fluorescence cellular imaging with randomly distributed localized near-fields induced by silver nano-islands. For the fabrication of the nano-islands, a 10-nm silver thin film was evaporated on a BK7 glass substrate with an adhesion layer of 2-nm-thick chromium. A micrometer-sized silver square pattern was defined using e-beam lithography and the film was then annealed at 200°C. Raw images were restored using the electric field distribution produced on the surface of random nano-islands. Nano-islands were modeled from SEM images. A 488-nm p-polarized light source was set to be incident at 60°. Simulation results show that localized electric fields were created among nano-islands and that their average size was found to be 135 nm. The feasibility was tested using conventional total internal reflection fluorescence microscopy while the angle of incidence was adjusted to maximize field enhancement. Mouse macrophage cells were cultured on nano-islands, and actin filaments were selectively stained with FITC-conjugated phalloidin. Acquired images were deconvolved based on linear imaging theory, in which molecular distribution was sampled by randomly distributed localized near-fields and blurred by the point spread function of far-field optics. The optimum fluorophore distribution was probabilistically estimated by repetitively matching a raw image. The deconvolved images are estimated to have a resolution in the range of 100-150 nm, largely determined by the size of the localized near-fields. We also discuss and compare the results with images acquired with periodic nano-aperture arrays in various optical configurations to excite localized plasmonic fields and to produce super-resolved molecular images.
Guided transect sampling - a new design combining prior information and field surveying
Anna Ringvall; Goran Stahl; Tomas Lamas
2000-01-01
Guided transect sampling is a two-stage sampling design in which prior information is used to guide the field survey in the second stage. In the first stage, broad strips are randomly selected and divided into grid-cells. For each cell a covariate value is estimated from remote sensing data, for example. The covariate is the basis for subsampling of a transect through...
Cosmic ray sources, acceleration and propagation
NASA Technical Reports Server (NTRS)
Ptuskin, V. S.
1986-01-01
A review is given of selected papers on the theory of cosmic ray (CR) propagation and acceleration. The high isotropy and a comparatively large age of galactic CR are explained by the effective interaction of relativistic particles with random and regular electromagnetic fields in interstellar medium. The kinetic theory of CR propagation in the Galaxy is formulated similarly to the elaborate theory of CR propagation in heliosphere. The substantial difference between these theories is explained by the necessity to take into account in some cases the collective effects due to a rather high density of relativisitc particles. In particular, the kinetic CR stream instability and the hydrodynamic Parker instability is studied. The interaction of relativistic particles with an ensemble of given weak random magnetic fields is calculated by perturbation theory. The theory of CR transfer is considered to be basically completed for this case. The main problem consists in poor information about the structure of the regular and the random galactic magnetic fields. An account is given of CR transfer in a turbulent medium.
Random forest feature selection approach for image segmentation
NASA Astrophysics Data System (ADS)
Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin
2017-03-01
In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors create their discriminative model by using many features without applying any selection criteria. A more reliable model can be built by using a framework that selects the variables that are important from the point of view of the classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an algorithm based on RF that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
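The idea of ranking features by how useful they are to an ensemble of randomized trees can be sketched with a toy forest of decision stumps; this is a stand-in illustration of importance-based selection, not the authors' RF framework.

```python
import numpy as np

def stump_forest_importance(X, y, n_trees=200, seed=0):
    """Toy variable-importance score in the spirit of random-forest
    importance: fit decision stumps on bootstrap samples and random
    feature subsets, and score each feature by how often it yields
    the best single-threshold split."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    counts = np.zeros(d)
    for _ in range(n_trees):
        rows = rng.integers(0, n, n)                       # bootstrap sample
        feats = rng.choice(d, max(1, d // 2), replace=False)
        best, best_err = feats[0], np.inf
        for f in feats:
            pred = X[rows, f] > np.median(X[rows, f])      # stump split
            err = min(np.mean(pred != y[rows]), np.mean(pred == y[rows]))
            if err < best_err:
                best, best_err = f, err
        counts[best] += 1
    return counts / n_trees

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 4))
y = X[:, 0] > 0.0                 # only feature 0 is informative
importance = stump_forest_importance(X, y)
```

Features whose score stays near the noise floor would be dropped before training the final discriminative model, which is the dimensionality-reduction step the abstract describes.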
Service-Oriented Node Scheduling Scheme for Wireless Sensor Networks Using Markov Random Field Model
Cheng, Hongju; Su, Zhihuang; Lloret, Jaime; Chen, Guolong
2014-01-01
Future wireless sensor networks are expected to provide various sensing services and energy efficiency is one of the most important criterions. The node scheduling strategy aims to increase network lifetime by selecting a set of sensor nodes to provide the required sensing services in a periodic manner. In this paper, we are concerned with the service-oriented node scheduling problem to provide multiple sensing services while maximizing the network lifetime. We firstly introduce how to model the data correlation for different services by using Markov Random Field (MRF) model. Secondly, we formulate the service-oriented node scheduling issue into three different problems, namely, the multi-service data denoising problem which aims at minimizing the noise level of sensed data, the representative node selection problem concerning with selecting a number of active nodes while determining the services they provide, and the multi-service node scheduling problem which aims at maximizing the network lifetime. Thirdly, we propose a Multi-service Data Denoising (MDD) algorithm, a novel multi-service Representative node Selection and service Determination (RSD) algorithm, and a novel MRF-based Multi-service Node Scheduling (MMNS) scheme to solve the above three problems respectively. Finally, extensive experiments demonstrate that the proposed scheme efficiently extends the network lifetime. PMID:25384005
He, Yi; Xiao, Yi; Liwo, Adam; Scheraga, Harold A
2009-10-01
We explored the energy-parameter space of our coarse-grained UNRES force field for large-scale ab initio simulations of protein folding, to obtain good initial approximations for hierarchical optimization of the force field with new virtual-bond-angle bending and side-chain-rotamer potentials which we recently introduced to replace the statistical potentials. 100 sets of energy-term weights were generated randomly, and good sets were selected by carrying out replica-exchange molecular dynamics simulations of two peptides with a minimal alpha-helical and a minimal beta-hairpin fold, respectively: the tryptophan cage (PDB code: 1L2Y) and tryptophan zipper (PDB code: 1LE1). Eight sets of parameters produced native-like structures of these two peptides. These eight sets were tested on two larger proteins: the engrailed homeodomain (PDB code: 1ENH) and FBP WW domain (PDB code: 1E0L); two sets were found to produce native-like conformations of these proteins. These two sets were tested further on a larger set of nine proteins with alpha or alpha + beta structure and found to locate native-like structures of most of them. These results demonstrate that, in addition to finding reasonable initial starting points for optimization, an extensive search of parameter space is a powerful method to produce a transferable force field. Copyright 2009 Wiley Periodicals, Inc.
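The parameter search described here follows a generic generate-and-select pattern: draw random weight vectors, score each candidate, keep the best few. A minimal sketch follows; the bounds, number of terms and the toy scoring function are illustrative stand-ins for the replica-exchange MD benchmark on the two training peptides.

```python
import random

rng = random.Random(0)

def random_weight_sets(n_sets, n_terms, lo=0.1, hi=5.0):
    """Draw candidate energy-term weight vectors uniformly at random
    (bounds are illustrative, not those of the UNRES force field)."""
    return [[rng.uniform(lo, hi) for _ in range(n_terms)] for _ in range(n_sets)]

def select_best(candidates, score, keep):
    """Keep the `keep` best-scoring weight sets; in the study, `score`
    would wrap the folding benchmark (any callable works here)."""
    return sorted(candidates, key=score)[:keep]

candidates = random_weight_sets(100, 6)
# Toy score: distance of the weights from an arbitrary reference point.
best = select_best(candidates, score=lambda w: sum((x - 1.0) ** 2 for x in w),
                   keep=8)
```

The 100-candidate / 8-survivor numbers mirror the counts in the abstract; everything else is a placeholder.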
Entropy of level-cut random Gaussian structures at different volume fractions
NASA Astrophysics Data System (ADS)
Marčelja, Stjepan
2017-10-01
Cutting random Gaussian fields at a given level can create a variety of morphologically different two-phase or multi-phase structures that have often been used to describe physical systems. The entropy of such structures depends on the covariance function of the generating Gaussian random field, which in turn depends on its spectral density. But the entropy of level-cut structures also depends on the volume fractions of the different phases, which are determined by the choice of cutting level. This dependence has been neglected in earlier work. We evaluate the entropy of several lattice models to show that, even for strongly coupled systems, the dependence of the entropy of level-cut structures on the molar fractions of the constituents scales with the simple formula for an ideal noninteracting system. In the last section, we discuss the application of these results to binary or ternary fluids and microemulsions.
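The level-cut construction, and the ideal-mixing entropy the abstract says the dependence scales with, can be checked numerically with a toy one-dimensional field. This is only a sketch: the field is a moving average of white noise, and the window w and level a are arbitrary choices.

```python
import math
import random

random.seed(1)

# A correlated Gaussian "field": moving average of white noise,
# normalized to unit variance.
w = 5
noise = [random.gauss(0, 1) for _ in range(100000 + w)]
field = [sum(noise[i:i + w]) / math.sqrt(w) for i in range(100000)]

# Cutting at level a yields phase 1 where field > a; its volume fraction
# is the standard normal tail probability Phi(-a).
a = 0.5
phi = sum(1 for v in field if v > a) / len(field)
phi_theory = 0.5 * math.erfc(a / math.sqrt(2))

# Ideal noninteracting mixing entropy per site at that volume fraction.
s_ideal = -(phi * math.log(phi) + (1 - phi) * math.log(1 - phi))
```

Sweeping the level a sweeps the volume fraction, which is exactly the dependence the paper studies.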
Stratified random selection of watersheds allowed us to compare geographically-independent classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme within the Northern Lakes a...
Essays on Child Development in Developing Countries
ERIC Educational Resources Information Center
Humpage, Sarah Davidson
2013-01-01
This dissertation presents the results of three field experiments implemented to evaluate the effectiveness of strategies to improve the health or education of children in developing countries. In Guatemala, community health workers at randomly selected clinics were given patient tracking lists to improve their ability to remind parents when their…
A Graph Theory Practice on Transformed Image: A Random Image Steganography
Thanikaiselvan, V.; Arulmozhivarman, P.; Subashanthini, S.; Amirtharajan, Rengarajan
2013-01-01
The modern information age is enriched with advanced network communication expertise but at the same time encounters countless security issues when dealing with secret and/or private information. Secure storage and transmission of secret information have become highly essential and have led to a deluge of research in this field. In this paper, an effort has been made to combine a graceful graph with the integer wavelet transform (IWT) to implement random image steganography for secure communication. The implementation begins with the conversion of the cover image into wavelet coefficients through the IWT and is followed by embedding the secret image in randomly selected coefficients through graph theory. Finally, the stego-image is obtained by applying the inverse IWT. This method provides a maximum peak signal-to-noise ratio (PSNR) of 44 dB for 266646 bits. Thus, the proposed method achieves high imperceptibility through a high PSNR value, high embedding capacity in the cover image due to the adaptive embedding scheme, and high robustness against blind attacks through the graph-theoretic random selection of coefficients. PMID:24453857
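The embed-in-random-positions idea can be sketched in a few lines. This is a deliberate simplification: a keyed shuffle stands in for the paper's graceful-graph traversal, and plain integers stand in for IWT coefficients; only the random-selection principle is the same.

```python
import random

def embed(coeffs, bits, key):
    """Hide bits in the LSBs of coefficients visited in a key-dependent
    random order (a stand-in for the graph-based traversal)."""
    out = list(coeffs)
    order = list(range(len(coeffs)))
    random.Random(key).shuffle(order)      # shared-secret visiting order
    for pos, b in zip(order, bits):
        out[pos] = (out[pos] & ~1) | b     # overwrite the least significant bit
    return out

def extract(coeffs, nbits, key):
    """Recover nbits by replaying the same keyed ordering."""
    order = list(range(len(coeffs)))
    random.Random(key).shuffle(order)
    return [coeffs[pos] & 1 for pos in order[:nbits]]

rng = random.Random(7)
cover = [rng.randrange(256) for _ in range(64)]   # stand-in coefficients
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, secret, key=42)
assert extract(stego, len(secret), key=42) == secret
```

Without the key, an attacker does not know which coefficients carry payload, which is the robustness argument the abstract makes for its graph-theoretic selection.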
2010-10-17
conditions between active, recently active and randomly selected non-burrow locations at this site. Field surveys were completed in three study areas at the installation. On average, burrow sites had a much higher overall occurrence of longleaf pine and significantly lower total basal area as...
Ablation effects of noninvasive radiofrequency field-induced hyperthermia on liver cancer cells.
Chen, Kaiyun; Zhu, Shuguang; Xiang, Guoan; Duan, Xiaopeng; He, Jiwen; Chen, Guihua
2016-05-01
To provide an in-depth analysis of the clinical ablation effect of noninvasive radiofrequency field-induced hyperthermia on liver cancer cells, this paper collected liver cancer patients' treatment information from 10 hospitals between January 2010 and December 2011. From these records, 1050 patients who underwent noninvasive radiofrequency field-induced hyperthermia were randomly selected as the observation group, and 500 liver cancer patients who underwent conventional surgical treatment were randomly selected as the control group. After treatment, patients were followed up for three years; survival rates of the two groups after 1, 2, and 3 years were compared, and the clinical effect of radiofrequency ablation of liver cancer was evaluated. The results show that the two groups are similar in terms of survival rate, and the difference is not statistically significant. 125 patients in the observation group had adverse reactions of varying degrees, while 253 patients in the control group had adverse reactions; this difference between groups was statistically significant (P < 0.05). It can be concluded that radiofrequency ablation of liver cancer is the safer option. The results of this study therefore demonstrate that liver cancer treatment with noninvasive radiofrequency field-induced hyperthermia is safe and achieves a satisfactory survival rate, and thus has relatively high value in clinical practice.
Accuracy of genomic selection in European maize elite breeding populations.
Zhao, Yusheng; Gowda, Manje; Liu, Wenxin; Würschum, Tobias; Maurer, Hans P; Longin, Friedrich H; Ranc, Nicolas; Reif, Jochen C
2012-03-01
Genomic selection is a promising breeding strategy for the rapid improvement of complex traits. The objective of our study was to investigate the prediction accuracy of genomic breeding values through cross validation. The study was based on experimental data from six segregating populations of a half-diallel mating design with 788 testcross progenies from an elite maize breeding program. The plants were intensively phenotyped in multi-location field trials and fingerprinted with 960 SNP markers. We used random regression best linear unbiased prediction in combination with fivefold cross validation. The prediction accuracy across populations was higher for grain moisture (0.90) than for grain yield (0.58). The accuracy of genomic selection realized for grain yield corresponds to the precision of phenotyping in unreplicated field trials at 3-4 locations. As up to three generations per year are feasible for maize, selection gain per unit time is high and, consequently, genomic selection holds great promise for maize breeding programs.
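The fivefold cross-validation scheme used to measure prediction accuracy can be sketched with stdlib Python only. This is a toy: crude one-marker-at-a-time regression stands in for random regression BLUP, the data are synthetic, and accuracy is the Pearson correlation between predicted and observed values, as in the study.

```python
import random
import statistics

def kfold(n, k, seed=0):
    """Yield (train, test) index lists for k-fold cross validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        yield [j for f in folds[:i] + folds[i + 1:] for j in f], folds[i]

def pearson(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

# Synthetic data: 100 lines, 30 biallelic markers coded 0/1/2, additive trait.
rng = random.Random(7)
effects = [rng.gauss(0, 1) for _ in range(30)]
X = [[rng.choice([0, 1, 2]) for _ in range(30)] for _ in range(100)]
y = [sum(e * g for e, g in zip(effects, row)) + rng.gauss(0, 2) for row in X]

accuracies = []
for train, test in kfold(100, 5):
    # Crude marker-effect estimates: one simple regression per marker
    # (a stand-in for ridge-type RR-BLUP, not the real thing).
    ymean = statistics.mean(y[i] for i in train)
    bhat = []
    for m in range(30):
        g = [X[i][m] for i in train]
        gm = statistics.mean(g)
        var = sum((v - gm) ** 2 for v in g)
        cov = sum((v - gm) * (y[i] - ymean) for v, i in zip(g, train))
        bhat.append(cov / var if var else 0.0)
    pred = [sum(b * X[i][m] for m, b in enumerate(bhat)) for i in test]
    accuracies.append(pearson(pred, [y[i] for i in test]))
```

Each fold is predicted only from lines outside it, which is what makes the reported 0.58/0.90 accuracies estimates of performance on unseen material.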
Influence of the Pedagogical Context on Students' Evaluation of Teaching
ERIC Educational Resources Information Center
Luna, Edna; Aramburo, Vicente; Cordero, Graciela
2010-01-01
The purpose of this study was to compare the characteristics of teaching performance in accordance with the opinion of students of different academic fields and curriculum stages in a Mexican state public university. The sample was composed of 729 randomly-selected courses, distributed over four semester periods. Descriptive and comparative…
ERIC Educational Resources Information Center
Walton, Gregory M.; Logel, Christine; Peach, Jennifer M.; Spencer, Steven J.; Zanna, Mark P.
2015-01-01
In a randomized-controlled trial, we tested 2 brief interventions designed to mitigate the effects of a "chilly climate" women may experience in engineering, especially in male-dominated fields. Participants were students entering a selective university engineering program. The "social-belonging intervention" aimed to protect…
Monitoring Achievement of Educational Governance/Management Policy Goals.
ERIC Educational Resources Information Center
Sederberg, Charles H.; Hendrix, Vernon L.
This paper reports on a field test of a system for monitoring the achievement of selected educational governance/management goals. The study entailed (1) collection of enrollment, revenue, expenditure, and teacher assignment data from a stratified random sample of Minnesota districts for a seven-year period, 1969-70 through 1975-76; (2) reduction…
Multiplexed time-lapse photomicrography of cultured cells.
Heye, R R; Kiebler, E W; Arnzen, R J; Tolmach, L J
1982-01-01
A system of cinemicrography has been developed in which a single microscope and 16 mm camera are multiplexed to produce a time-lapse photographic record of many fields simultaneously. The field coordinates and focus are selected via a control console and entered into the memory of a dedicated microcomputer; they are then automatically recalled in sequence, thus permitting the photographing of additional fields in the interval between exposures of any given field. Sequential exposures of each field are isolated in separate sections of the film by means of a specially designed random-access camera that is also controlled by the microcomputer. The need to unscramble frames is thereby avoided, and the developed film can be directly analysed.
Random waves in the brain: Symmetries and defect generation in the visual cortex
NASA Astrophysics Data System (ADS)
Schnabel, M.; Kaschube, M.; Löwel, S.; Wolf, F.
2007-06-01
How orientation maps in the visual cortex of the brain develop is a matter of long-standing debate. Experimental and theoretical evidence suggests that their development represents an activity-dependent self-organization process. Theoretical analysis [1] exploring this hypothesis predicted that maps at an early developmental stage are realizations of Gaussian random fields exhibiting a rigorous lower bound on their densities of topological defects, called pinwheels. As a consequence, lower pinwheel densities, if observed in adult animals, are predicted to develop through the motion and annihilation of pinwheel pairs. Despite being valid for a large class of developmental models, this result depends on the symmetries of the models and thus of the predicted random field ensembles. In [1], invariance of the orientation map's statistical properties under independent space rotations and orientation shifts was assumed. However, full rotation symmetry appears to be broken by interactions of cortical neurons, e.g. selective couplings between groups of neurons with collinear orientation preferences [2]. A recently proposed symmetry, called shift-twist symmetry [3], which states that spatial rotations must occur together with orientation shifts to constitute an appropriate symmetry transformation, is more consistent with this organization. Here we generalize our random field approach to this important symmetry class. We propose a new class of shift-twist symmetric Gaussian random fields and derive the general correlation functions of this ensemble. It turns out that, despite strong effects of the shift-twist symmetry on the structure of the correlation functions and on the map layout, the lower bound on the pinwheel densities remains unaffected, predicting pinwheel annihilation in systems with low pinwheel densities.
Energy parasites trigger oncogene mutation.
Pokorný, Jiří; Pokorný, Jan; Jandová, Anna; Kobilková, Jitka; Vrba, Jan; Vrba, Jan
2016-10-01
Cancer initiation can be explained as a result of parasitic viral energy consumption leading to randomized chemical bonding in the genome. Analysis of experimental data on cell-mediated immunity (CMI), covering about 12,000 cases of healthy humans, cancer patients and patients with precancerous cervical lesions, disclosed that the specific cancer antigen and the non-specific lactate dehydrogenase-elevating (LDH) virus antigen elicit similar responses. The specific antigen is effective only in the cancer type of its origin, but the non-specific antigen is effective in all examined cancers. CMI results of patients with cervical intraepithelial neoplasia (CIN) display both the healthy and the cancer state. The ribonucleic acid (RNA) of the LDH virus, parasitizing on energy, reduces the ratio of coherent to random oscillations. A decreased effect of the coherent cellular electromagnetic field on bonding electrons in biological macromolecules raises the probability of random genome reactions. The overlap of wave functions in biological macromolecules depends on the energy of the cellular electromagnetic field, which supplies energy to bonding electrons for selective chemical bonds. The similar CMI responses to cancer and LDH virus antigens in all examined healthy, precancerous and cancer cases point to an energy mechanism in cancer initiation. The dependence of the rate of biochemical reactions on the biological electromagnetic field may explain a hitherto unknown mechanism of genome mutation.
NASA Astrophysics Data System (ADS)
Chemura, Abel; Mutanga, Onisimo; Dube, Timothy
2017-08-01
Water management is an important component of agriculture, particularly for perennial tree crops such as coffee. Proper detection and monitoring of water stress therefore play an important role not only in mitigating the associated adverse impacts on crop growth and productivity but also in reducing expensive and environmentally unsustainable irrigation practices. Current methods for water stress detection in coffee production mainly involve monitoring plant physiological characteristics and soil conditions. In this study, we tested the ability of selected wavebands in the VIS/NIR range to predict plant water content (PWC) in coffee using the random forest algorithm. An experiment was set up in which coffee plants were exposed to different levels of water stress, and their reflectance and plant water content were measured. In selecting appropriate parameters, cross-correlation identified 11 wavebands, reflectance difference identified 16, and reflectance sensitivity identified 22 variables related to PWC. Only three wavebands (485 nm, 670 nm and 885 nm) were identified as significant by at least two methods. The selected wavebands were integrated into the random forest algorithm to predict coffee PWC, trained (n = 36) and then tested on independent data (n = 24). The results showed that the reflectance-sensitivity-selected bands performed best in water stress detection (r = 0.87, RMSE = 4.91% and pBias = 0.9%), compared to the reflectance-difference (r = 0.79, RMSE = 6.19% and pBias = 2.5%) and cross-correlation-selected wavebands (r = 0.75, RMSE = 6.52% and pBias = 1.6%). These results indicate that it is possible to reliably predict PWC using wavebands in the VIS/NIR range that correspond with many of the available multispectral scanners using random forests; further research at field and landscape scale is required to operationalize these findings.
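The three scores used to compare the waveband sets (r, RMSE and percent bias) are easy to restate in code. A minimal sketch follows; the observed and predicted PWC values are hypothetical, chosen only to exercise the formulas.

```python
import math
import statistics

def evaluate(obs, pred):
    """Return (r, RMSE, pBias), the three scores reported in the study."""
    mo, mp = statistics.mean(obs), statistics.mean(pred)
    r = (sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
         / math.sqrt(sum((o - mo) ** 2 for o in obs)
                     * sum((p - mp) ** 2 for p in pred)))
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / len(obs))
    pbias = 100 * sum(p - o for o, p in zip(obs, pred)) / sum(obs)
    return r, rmse, pbias

obs = [55.0, 60.2, 48.7, 52.1, 66.3, 58.9]    # hypothetical PWC values (%)
pred = [54.1, 61.0, 50.2, 51.5, 64.8, 60.0]   # hypothetical model output (%)
r, rmse, pbias = evaluate(obs, pred)
```

Since PWC is expressed in percent, RMSE and pBias here carry the same % units as in the abstract.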
Accuracy of a novel multi-sensor board for measuring physical activity and energy expenditure
Lester, Jonathan; Migotsky, Sean; Goh, Jorming; Higgins, Lisa; Borriello, Gaetano
2011-01-01
The ability to relate physical activity to health depends on accurate measurement. Yet none of the available methods is fully satisfactory, for several reasons. This study examined the accuracy of a multi-sensor board (MSB) that infers activity types (sitting, standing, walking, stair climbing, and running) and estimates energy expenditure in 57 adults (32 females) aged 39.2 ± 13.5 years. In the laboratory, subjects walked and ran on a treadmill over a select range of speeds and grades for 3 min each (six stages in random order) while connected to a stationary calorimeter, preceded and followed by brief sitting and standing. On a different day, subjects completed scripted activities in the field connected to a portable calorimeter. The MSB was attached to a strap at the right hip. Subjects repeated one condition (randomly selected) on the third day. Accuracy of inferred activities compared with recorded activities (correctly identified activities/total activities × 100) was 97 and 84% in the laboratory and field, respectively. Absolute accuracy of energy expenditure [100 − |kilocalories MSB − kilocalories calorimeter| / kilocalories calorimeter × 100] was 89 and 76% in the laboratory and field, the latter being different (P < 0.05) from the calorimeter. Test–retest reliability for energy expenditure was significant in both settings (P < 0.0001; r = 0.97). In general, the MSB provides accurate measures of activity type in laboratory and field settings and of energy expenditure during treadmill walking and running, although the device underestimates energy expenditure in the field. PMID:21249383
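The bracketed accuracy definition works out as follows; the kilocalorie totals below are made up for illustration and merely chosen so the result lands near the laboratory figure quoted above.

```python
def absolute_accuracy(kcal_device, kcal_calorimeter):
    """Accuracy (%) = 100 - |device - calorimeter| / calorimeter * 100."""
    return 100 - abs(kcal_device - kcal_calorimeter) / kcal_calorimeter * 100

# A device total of 267 kcal against a calorimeter total of 300 kcal
# scores about 89% (hypothetical numbers).
acc = absolute_accuracy(267.0, 300.0)
```

Note the measure is symmetric in over- and under-estimation: a 10% error in either direction scores 90%.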
ERIC Educational Resources Information Center
Azar, Ali Sorayyaei; Hashim, Azirah
2014-01-01
The classes, purposes and characteristics associated with the review article in the field of applied linguistics were analyzed. The data were collected from a randomly selected corpus of thirty two review articles from a discipline-related key journal in applied linguistics. The findings revealed that different sub-genres can be identified within…
Lonely Days and Lonely Nights: Completing the Doctoral Dissertation.
ERIC Educational Resources Information Center
Germeroth, Darla
A study examined areas of the doctoral dissertation process that are often problematic for the Ph.D./Ed.D. candidate in the field of communication. Subjects, 250 randomly selected Speech Communication Association members holding a Ph.D. or an Ed.D., were surveyed. Of the 250 surveys mailed, 137 were returned, representing a 54.8% return rate.…
Characteristics and Clinical Practices of Marriage and Family Therapists: A National Survey
ERIC Educational Resources Information Center
Northey, William F., Jr.
2002-01-01
This report presents data from a telephone survey of a randomly selected sample of 292 marriage and family therapists (MFTs) who were Clinical Members of the American Association for Marriage and Family Therapy. The study, designed to better understand the current state of the field of MFT, provides descriptive data on the demographic…
Study Habits and Academic Achievement of Kashmiri & Ladakhi Adolescent Girls: A Comparative Study
ERIC Educational Resources Information Center
Nadeem, N. A.; Puja, Javeed Ahamd; Bhat, Shabir Ahmad
2014-01-01
The present study was conducted to examine the study habits and academic achievement of adolescent girls in Jammu and Kashmir. 400 sample subjects were selected randomly from two ethnic groups, viz. Kashmiri and Ladakhi. The investigators used Palsane & Sharma's study habits inventory (PSSHI) to collect data from the field. Certain statistical…
Why Students Return for a Master's Degree in Sport Management
ERIC Educational Resources Information Center
Lewis, Benjamin A.; Quarterman, Jerome
2006-01-01
The purpose of this study was to determine the relative importance of choice factors that were most important to students who decided to matriculate in the field of sport management for a master's degree. A survey questionnaire was mailed to the program or department chairs of 12 randomly selected universities listed on the NASSM web site during…
Research Agendas and Pedagogical Applications: What "Public Relations Review" Tells Us.
ERIC Educational Resources Information Center
Thomsen, Steven R.
A study explored the research agenda of "Public Relations Review," the oldest scholarly journal in the public relations field. To provide a descriptive and inferential analysis of the content of the journal from 1985 to 1994, four volumes were selected at random (1985, 1987, 1991, and 1993) and all the articles in them were analyzed.…
Candidatus Liberibacter asiaticus (CLas) titer in field HLB-exposed commercial citrus cultivars
USDA-ARS?s Scientific Manuscript database
Eight Indian River groves with four or more diverse scions planted in close proximity were surveyed. Twenty trees of each scion in each grove were randomly selected to avoid bias and edge effects and an HLB diagnostic leaf sample was collected from each. CLas 16S rDNA primers were used in qPCR, a...
Cross-Validation of a PACER Prediction Equation for Assessing Aerobic Capacity in Hungarian Youth
ERIC Educational Resources Information Center
Saint-Maurice, Pedro F.; Welk, Gregory J.; Finn, Kevin J.; Kaj, Mónika
2015-01-01
Purpose: The purpose of this article was to evaluate the validity of the Progressive Aerobic Cardiovascular and Endurance Run (PACER) test in a sample of Hungarian youth. Method: Approximately 500 participants (aged 10-18 years old) were randomly selected across Hungary to complete both laboratory (maximal treadmill protocol) and field assessments…
Quantitative Trait Inheritance in a Forty-Year-Old Longleaf Pine Partial Diallel Test
Michael Stine; Jim Roberds; C. Dana Nelson; David P. Gwaze; Todd Shupe; Les Groom
2002-01-01
A longleaf pine (Pinus palustris Mill.) 13 parent partial diallel field experiment was established at two locations on the Harrison Experimental Forest in 1960. Parent trees were randomly selected from a natural population growing on the Harrison Experimental Forest, near Gulfport, Miss. Distance between trees chosen as parents ranged from 13 to 357...
A dose optimization method for electron radiotherapy using randomized aperture beams
NASA Astrophysics Data System (ADS)
Engel, Konrad; Gauer, Tobias
2009-09-01
The present paper describes the entire optimization process of creating a radiotherapy treatment plan for advanced electron irradiation. Special emphasis is devoted to the selection of beam incidence angles and beam energies as well as to the choice of appropriate subfields generated by a refined version of intensity segmentation and a novel random aperture approach. The algorithms have been implemented in a stand-alone programme using dose calculations from a commercial treatment planning system. For this study, the treatment planning system Pinnacle from Philips has been used and connected to the optimization programme using an ASCII interface. Dose calculations in Pinnacle were performed by Monte Carlo simulations for a remote-controlled electron multileaf collimator (MLC) from Euromechanics. As a result, treatment plans for breast cancer patients could be significantly improved when using randomly generated aperture beams. The combination of beams generated through segmentation and randomization achieved the best results in terms of target coverage and sparing of critical organs. The treatment plans could be further improved by use of a field reduction algorithm. Without a relevant loss in dose distribution, the total number of MLC fields and monitor units could be reduced by up to 20%. In conclusion, using randomized aperture beams is a promising new approach in radiotherapy and exhibits potential for further improvements in dose optimization through a combination of randomized electron and photon aperture beams.
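A randomized aperture in this sense is just a random open interval per MLC leaf pair; the optimizer then weights a pool of such candidate shapes. A minimal sketch follows; the grid size and candidate count are illustrative, not the paper's geometry.

```python
import random

rng = random.Random(0)

def random_aperture(leaf_pairs=20, positions=20):
    """One random aperture: for each opposing leaf pair, a randomly
    chosen open interval [left, right) inside the field."""
    shape = []
    for _ in range(leaf_pairs):
        left, right = sorted(rng.sample(range(positions + 1), 2))
        shape.append((left, right))
    return shape

# A pool of candidate aperture beams for the optimizer to weight;
# segmentation-derived apertures would be added to the same pool.
candidates = [random_aperture() for _ in range(30)]
```

In the paper's workflow, dose from each candidate would be computed by Monte Carlo and the weights optimized, with a field-reduction pass discarding low-weight apertures.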
Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension
NASA Astrophysics Data System (ADS)
Yan, Z.; Luan, X.
2017-12-01
Introduction: Empirical mode decomposition (EMD) is a noise-suppression algorithm based on wave-field separation, which exploits the scale differences between effective signal and noise. However, since the complexity of the real seismic wave field results in serious mode aliasing, denoising with this method alone is neither ideal nor effective. Building on the multi-scale decomposition of the EMD algorithm and combining it with Hausdorff-dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to decompose seismic data adaptively and obtain a series of intrinsic mode functions (IMFs) at different scales. Based on the difference in Hausdorff dimension between effective signal and random noise, we identify the IMF components mixed with random noise. We then use threshold correlation filtering to separate the effective signal from the random noise. Compared with the traditional EMD method, the results show that the new method achieves better suppression of seismic random noise.
Implementation: The EMD algorithm decomposes the seismic signal into a set of IMFs, whose spectra are then analyzed. Since most random noise is high-frequency, the IMFs fall into three categories: effective-wave components at larger scales; noise components at smaller scales; and IMF components containing a mixture of signal and random noise. The third category is processed with the Hausdorff-dimension algorithm: an appropriate time-window size, initial step and increment are selected, and the instantaneous Hausdorff dimension of each component is computed. The dimension of random noise lies between 1.0 and 1.05, while that of the effective wave lies between 1.05 and 2.0. Finally, based on this dimension difference, we extract for each IMF component the sample points whose fractal dimension is at most 1.05 to separate the residual noise. Reconstructing from the dimension-filtered IMF components together with the effective-wave IMF components retained in the first selection yields the denoised result.
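The final filtering-and-reconstruction step described above can be sketched directly: keep the clean IMFs, and in the noise-contaminated IMFs zero out the samples whose dimension falls in the noise band. The Hausdorff dimension estimator itself is out of scope here; the `dims` array stands in for its precomputed output, and all values are illustrative.

```python
def denoise(clean_imfs, mixed_imfs, dims, cutoff=1.05):
    """Zero out samples of the mixed IMFs whose (precomputed) fractal
    dimension is <= cutoff, then sum all retained components."""
    filtered = []
    for imf, d in zip(mixed_imfs, dims):
        filtered.append([v if di > cutoff else 0.0 for v, di in zip(imf, d)])
    n = len(clean_imfs[0])
    return [sum(imf[i] for imf in clean_imfs + filtered) for i in range(n)]

clean = [[1.0, 2.0, 3.0, 4.0]]        # large-scale effective-wave IMF
mixed = [[0.5, -0.4, 0.3, 0.2]]       # IMF contaminated with residual noise
dims = [[1.30, 1.02, 1.40, 1.01]]     # per-sample dimension estimates
print(denoise(clean, mixed, dims))    # → [1.5, 2.0, 3.3, 4.0]
```

Samples with dimension 1.02 and 1.01 fall in the noise band (≤ 1.05) and are dropped; the others survive into the reconstruction.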
The impact of traffic sign deficit on road traffic accidents in Nigeria.
Ezeibe, Christian; Ilo, Chukwudi; Oguonu, Chika; Ali, Alphonsus; Abada, Ifeanyi; Ezeibe, Ezinwanne; Oguonu, Chukwunonso; Abada, Felicia; Izueke, Edwin; Agbo, Humphrey
2018-04-04
This study assesses the impact of traffic sign deficit on road traffic accidents in Nigeria. The participants were 720 commercial vehicle drivers. While simple random sampling was used to select 6 out of 137 federal highways, stratified random sampling was used to select six categories of commercial vehicle drivers. The study used a qual-dominant mixed-methods approach comprising key informant interviews, group interviews, field observation, policy appraisal and secondary literature on traffic signs. Results show that the failure of government to provide and maintain traffic signs to guide road users through the numerous accident black spots on the highways is the major cause of road accidents in Nigeria. The study argues that the provision and maintenance of traffic signs present an opportunity to promote safety on the highways and achieve the sustainable development goals.
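The two-stage design (simple random sampling of highways, then stratified sampling of drivers) can be sketched as follows. The stratum names, pool sizes and equal allocation are illustrative assumptions; the abstract does not list them.

```python
import random

rng = random.Random(2018)

# Stage 1: simple random sample of 6 of the 137 federal highways.
highways = [f"FH-{i:03d}" for i in range(1, 138)]
chosen_roads = rng.sample(highways, 6)

# Stage 2: stratified sample of drivers, equal allocation per stratum
# (6 strata x 120 drivers = the study's 720 participants; the category
# names and pool of 200 per category are made up).
strata = {cat: [f"{cat}-driver-{j}" for j in range(200)]
          for cat in ["bus", "minibus", "taxi", "truck", "tanker", "motorcycle"]}
sample = [d for pool in strata.values() for d in rng.sample(pool, 120)]
```

Stratifying guarantees every driver category is represented, which a single simple random sample of 720 drivers would not.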
Resonant spin tunneling in randomly oriented nanospheres of Mn 12 acetate
Lendínez, S.; Zarzuela, R.; Tejada, J.; ...
2015-01-06
We report measurements and theoretical analysis of resonant spin tunneling in randomly oriented nanospheres of a molecular magnet. Amorphous nanospheres of Mn₁₂ acetate have been fabricated and characterized by chemical, infrared, TEM, X-ray, and magnetic methods. Magnetic measurements have revealed sharp tunneling peaks in the field derivative of the magnetization that occur at the typical resonant field values for the Mn₁₂ acetate crystal in the field parallel to the easy axis. Theoretical analysis is provided that explains these observations. We argue that resonant spin tunneling in a molecular magnet can be established in a powder sample, without the need for a single crystal and without aligning the easy magnetization axes of the molecules. This is confirmed by re-analyzing the old data on a powdered sample of non-oriented micron-size crystals of Mn₁₂ acetate. In conclusion, our findings can greatly simplify the selection of candidates for quantum spin tunneling among newly synthesized molecular magnets.
The students' intentions and satisfaction with the field of study and university
NOORAFSHAN, ALI; POURAHMAD, SAEEDEH; SAGHEB, MOHAMMAD MAHDI; DEHGHANI NAZHVANI, ALI; DEHSHAHRI, ALI; ABDOLLAHI, MANIJEH; MOHEBBI, ZEYNAB; KESHTKARAN, ZAHRA; AHMADI, AFSANEH; KAVOUSIPOUR, SOMAYEH; FARAHMAND, FARIBA; KHORRAMI, HAMID REZA; SOLTANI, ROBABEH; KARBALAY DOUST, SAIED
2014-01-01
Introduction: The present study aimed to find an appropriate method for informing senior high school students so that they can correctly select their academic field of study, and to assess their intentions. Methods: This is a descriptive-analytic, cross-sectional study. A validated questionnaire was given to a total of 2600 students selected by stratified random sampling (the ten colleges and entrance years 1 through 4 were taken as the strata). Students were asked where their present field of study (major) had ranked among their choices in the entrance exam. The questionnaire also asked how students had become familiar with the different fields of study at Shiraz University of Medical Sciences (SUMS), the reasons for their selection, and their motivation and persistence in studying in the same field and university. Data were analyzed using the independent two-sample t-test, analysis of variance (ANOVA) and the chi-square test. Results: The most important references for university field selection were high school teachers, the students' parents and the proximity of the university to one's living place. The results also revealed the good reputation of SUMS among first-year students and a downward trend during the following years. 59.4% of the 1st-year students were satisfied with both their field of study and SUMS; 31.8% were satisfied with the university but not with their field of study; 6.4% were dissatisfied with the university but not with their field of study; and 2% of the students were dissatisfied with both their field of study and the university. Dissatisfaction with SUMS and the field of study increased gradually, so that among the students who had entered the university earliest (in the 4th year of their study) nearly 16.3% were dissatisfied with both the university and the field of study. Conclusion: The methods used to introduce the university should be revised. PMID:25512943
Decision tree modeling using R.
Zhang, Zhongheng
2016-08-01
In the field of machine learning, decision tree learners are powerful and easy to interpret. They employ a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are random sampling and the restricted set of input variables available at each split. Finally, I introduce R functions to perform model-based recursive partitioning, which incorporates recursive partitioning into conventional parametric model building.
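The recursive binary partitioning described above can be sketched in plain Python. This minimal version splits on a single variable using Gini impurity rather than the conditional inference (permutation-test) criterion of R's `ctree`, so it is a structural illustration only:

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """Threshold on a single variable minimizing weighted child impurity."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if best is None or score < best[0]:
            best = (score, t)
    return best  # (weighted impurity, threshold) or None

def grow(xs, ys, min_size=2):
    """Recursively partition until a node is pure or too small (stopping criteria)."""
    if len(set(ys)) == 1 or len(ys) < min_size:
        return max(set(ys), key=ys.count)  # leaf: majority class
    split = best_split(xs, ys)
    if split is None:
        return max(set(ys), key=ys.count)
    _, t = split
    left = [(x, y) for x, y in zip(xs, ys) if x <= t]
    right = [(x, y) for x, y in zip(xs, ys) if x > t]
    return (t, grow(*map(list, zip(*left))), grow(*map(list, zip(*right))))

def predict(tree, x):
    while isinstance(tree, tuple):
        t, left, right = tree
        tree = left if x <= t else right
    return tree

# Two well-separated classes: the first split found is the pure one at x <= 3.
tree = grow([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"])
```

A random forest then repeats this on bootstrap samples with a random subset of variables per split and aggregates the trees by majority vote.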
Nonrandom Distribution of Virulences Within Two Field Collections of Uromyces appendiculatus.
Groth, J V; Ozmon, E A
2002-07-01
ABSTRACT Two collections of urediniospores of Uromyces appendiculatus, each from a different commercial bean field, were characterized for associations of virulence among individuals within each collection. Four bean (Phaseolus vulgaris) lines with distinct, race-specific resistance to which virulence in each population was polymorphic were used to obtain measures of all six possible pairwise virulence associations for each collection. We inoculated one of the lines and collected urediniospores only from the segment of the population that was virulent on that line. This segment, when compared with nonselected collections from susceptible Pinto 111, gave a direct measure of degree of association as the change in frequency of virulence observed. Plants of the second bean line were inoculated in separate sets with both selected and unselected collections. Frequencies of virulence were estimated from the numbers of susceptible-type and resistant-type infections. Reciprocals of each pairing also were made. For collection P21, all virulences were significantly associated, either positively or negatively, except one pair (in one direction of selection only); whereas, for collection M5, all virulences were significantly associated. Virulence association in P21 was shown to be the result of predominance of phenotypes with certain combinations of virulence by inoculation of the four bean lines with 10 randomly chosen single-uredinial individuals. In support of this, a large random-mated F1 population derived from each collection showed much less virulence association, with the majority of pairs of virulences showing nonsignificant changes in virulence frequency after passage through the first line. Random mating also significantly changed virulence frequency from that of the original population in all instances. 
Changes were in both directions, suggesting either that virulences were not all recessive, or that heterozygote frequency was sometimes above and sometimes below the Hardy-Weinberg expectation in the field populations.
Gooding, Lori F; Mori-Inoue, Satoko
2011-01-01
The purpose of this study was to examine the effect of video exposure on music therapy students' perceptions of clinical applications of popular music in the field of music therapy. Fifty-one participants were randomly divided into two groups and exposed to a popular song in either audio-only or music video format. Participants were asked to indicate clinical applications; specifically, participants chose: (a) possible population(s), (b) most appropriate population(s), (c) possible age range(s), (d) most appropriate age ranges, (e) possible goal area(s) and (f) most appropriate goal area. Data for each of these categories were compiled and analyzed, with no significant differences found in the choices made by the audio-only and video groups. Three items, (a) selection of the bereavement population, (b) selection of bereavement as the most appropriate population and (c) selection of the age ranges of pre-teen/mature adult, were additionally selected for further analysis due to their relationship to the video content. Analysis results revealed a significant difference between the video and audio-only groups for the selection of these specific items, with the video group's selections more closely aligned to the video content. Results of this pilot study suggest that music video exposure to popular music can impact how students choose to implement popular songs in the field of music therapy.
Mayer, Paul M; Smith, Levica M; Ford, Robert G; Watterson, Dustin C; McCutchen, Marshall D; Ryan, Mark R
2009-04-01
Predation selects against conspicuous colors in bird eggs and nests, while thermoregulatory constraints select for nest-building behavior that regulates incubation temperatures. We present results that suggest a trade-off between nest crypticity and thermoregulation of eggs based on selection of nest materials by piping plovers (Charadrius melodus), a ground-nesting bird that constructs simple, pebble-lined nests highly vulnerable to predators and exposed to temperature extremes. Piping plovers selected pebbles that were whiter and appeared closer in color to eggs than randomly available pebbles, suggesting a crypsis function. However, nests that were more contrasting in color to surrounding substrates were at greater risk of predation, suggesting an alternate strategy driving selection of white rocks. Near-infrared reflectance of nest pebbles was higher than randomly available pebbles, indicating a direct physical mechanism for heat control through pebble selection. Artificial nests constructed of randomly available pebbles heated more quickly and conferred heat to model eggs, causing eggs to heat more rapidly than in nests constructed from piping plover nest pebbles. Thermal models and field data indicated that temperatures inside nests may remain up to 2-6 degrees C cooler than surrounding substrates. Thermal models indicated that nests heat especially rapidly if not incubated, suggesting that nest construction behavior may serve to keep eggs cooler during the unattended laying period. Thus, pebble selection suggests a potential trade-off between maximizing heat reflectance to improve egg microclimate and minimizing conspicuous contrast of nests with the surrounding substrate to conceal eggs from predators. Nest construction behavior that employs light-colored, thermally reflective materials may represent an evolutionary response by birds and other egg-laying organisms to egg predation and heat stress.
1995-04-13
rhodamine-coupled goat anti-mouse antibody. A rare, fused Cl 1 D giant cell was selected to show (A), while extensive fusion was common throughout the...mouse anti-MHV-A59 antiserum. To quantify the level of susceptibility of cells to MHV infection, ten randomly selected fields for each sample...named Cea10) was discovered and found to be co-expressed with MHVR in the Cl 1 D and F40 lines of mouse fibroblasts. A monoclonal anti-MHVR
Nonvolatile random access memory
NASA Technical Reports Server (NTRS)
Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)
1994-01-01
A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and to enable a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through the resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row; these are not turned on except in the row of the selected cell.
Moore, M A; Katzgraber, Helmut G
2014-10-01
Starting from preferences on N proposed policies obtained via questionnaires from a sample of the electorate, an Ising spin-glass model in a field can be constructed from which a political party could find the subset of the proposed policies which would maximize its appeal, form a coherent choice in the eyes of the electorate, and have maximum overlap with the party's existing policies. We illustrate the application of the procedure by simulations of a spin glass in a random field on scale-free networks.
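A minimal sketch of the spin-glass policy-selection idea, assuming illustrative couplings J (pairwise policy correlations from questionnaires) and fields h (overlap with the party's existing platform); simulated annealing with a greedy finishing pass stands in for the ground-state search:

```python
import math
import random

def energy(J, h, s):
    """Ising energy: E = -sum_{i<j} J[i][j] s_i s_j - sum_i h[i] s_i."""
    n = len(s)
    e = -sum(h[i] * s[i] for i in range(n))
    for i in range(n):
        for j in range(i + 1, n):
            e -= J[i][j] * s[i] * s[j]
    return e

def select_policies(J, h, sweeps=200, seed=1):
    """Anneal spins (+1 = adopt the policy, -1 = drop it), then descend greedily."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    total = sweeps * n
    for step in range(total):
        T = max(0.01, 2.0 * (1 - step / total))  # linear cooling schedule
        i = rng.randrange(n)
        s2 = s[:]
        s2[i] = -s2[i]
        dE = energy(J, h, s2) - energy(J, h, s)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s = s2
    # Greedy pass: flip any spin that still lowers the energy, until none does.
    improved = True
    while improved:
        improved = False
        for i in range(n):
            s2 = s[:]
            s2[i] = -s2[i]
            if energy(J, h, s2) < energy(J, h, s):
                s, improved = s2, True
    return s

# Toy instance: 5 proposed policies with random couplings and fields.
rng = random.Random(42)
n = 5
J = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        J[i][j] = rng.uniform(-1, 1)
h = [rng.uniform(-1, 1) for _ in range(n)]
s = select_policies(J, h)
adopted = [i for i in range(n) if s[i] == 1]  # the coherent policy subset
```

On scale-free networks, as in the paper, J would be sparse with hub nodes; the greedy pass guarantees the returned configuration is at least a local energy minimum.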
Application of Propiconazole and Pseudomonas Cichorii for Control of Oak Wilt in Texas Live Oaks
A. Dan Wilson; D.G. Lester
1995-01-01
The efficacy of two formulations of propiconazole, Banner and Tilt, and the biocontrol agent Pseudomonas cichorii for control of oak wilt was tested in a natural mature stand of live oaks at a location near Yoakum, Texas, with a predominantly sandy soil type. The field plots, established 15 March 85, consisted of five randomly selected plot locations...
ERIC Educational Resources Information Center
Moxley, Linda S.
In October 1975 a questionnaire was sent to 200 members randomly selected from the "Directory of Faculty Members Teaching in the Field of Higher Education" to determine satisfaction with their teaching role. The research was designed to test Herzberg's theory, which states that "hygiene factors" (job context) are related to…
ERIC Educational Resources Information Center
Ferguson, M. A.; Valenti, JoAnn Myer
Using radon (a naturally-occurring radioactive gas linked to lung cancer) as the health risk factor, a study examined which risk-taking tendencies interact with different health-risk message strategies. A phone survey pretested 837 randomly selected homeowners from three Florida counties with the highest levels of radon in the state (706 agreed to…
Vocational Schools: Relation of Curricula in the Fields of Commerce, Banking and Management.
ERIC Educational Resources Information Center
Nizan, Esther
The purpose of this survey was to present the views of the economy regarding the desirable vocational and personal qualifications of those working or preparing to work in office administration. From a list of Israeli businesses employing office workers, 60 were randomly selected and placed into one of four groups depending on the number of…
Five-Year-Old Cottonwood Plantation on a Clay Site: Growth, Yield, and Soil Properties
R. M. Krinard; H. E. Kennedy
1980-01-01
A random sample of Stoneville select cottonwood (Populus deltoides Bartr.) clones planted on recent old-field clay soils at 12- by 12- foot spacing averaged 75-percent survival after five years. The growth and yield was about half that expected from planted cottonwood on medium-textured soils. Soil moisture analysis showed more height growth in years...
Examining the Use of Web-Based Tests for Testing Academic Vocabulary in EAP Instruction
ERIC Educational Resources Information Center
Dashtestani, Reza
2015-01-01
Interest in Web-based and computer-assisted language testing is growing in the field of English for academic purposes (EAP). In this study, four groups of undergraduate EAP students (n = 120), each consisting of 30 students, were randomly selected from four different disciplines, i.e. biology, political sciences, psychology, and law. The four…
Preliminary classification of forest vegetation of the Kenai Peninsula, Alaska.
K.M. Reynolds
1990-01-01
A total of 5,597 photo points was systematically located on 1:60,000-scale high altitude photographs of the Kenai Peninsula, Alaska; photo interpretation was used to classify the vegetation at each grid position. Of the total grid points, 12.3 percent were classified as timberland; 129 photo points within the timberland class were randomly selected for field survey....
ERIC Educational Resources Information Center
Hamid, Malai Hayati Sheikh; Shahrill, Masitah; Matzin, Rohani; Mahalle, Salwa; Mundia, Lawrence
2013-01-01
The cross-sectional field survey examined the roles of mathematics anxiety, self-esteem, proactive coping, and test stress in mathematics achievement among 204 (151 females) randomly selected Year 8-10 Brunei secondary school students. The negative dimensions of mathematics anxiety, self-esteem, and proactive coping correlated negatively with…
Jiao, Shengwu; Guo, Yumin; Huettmann, Falk; Lei, Guangchun
2014-07-01
Avian nest-site selection is an important research and management subject. The hooded crane (Grus monacha) is a vulnerable (VU) species according to the IUCN Red List. Here, we present the first long-term Chinese legacy nest data for this species (1993-2010) with publicly available metadata. Further, we provide the first study that reports findings on multivariate nest habitat preference using such long-term field data for this species. Our work was carried out in Northeastern China, where we found and measured 24 nests and 81 randomly selected control plots and their environmental parameters in a vast landscape. We used machine learning (stochastic boosted regression trees) to quantify nest selection. Our analysis further included varclust (R Hmisc) and TreeNet to address statistical correlations and two-way interactions. We found that from an initial list of 14 measured field variables, water area (+), water depth (+) and shrub coverage (-) were the main explanatory variables that contributed to hooded crane nest-site selection. Agricultural sites played a smaller role in the selection of these nests. Our results are important for the conservation management of cranes all over East Asia and constitute a defensible and quantitative basis for predictive models.
A New Random Walk for Replica Detection in WSNs.
Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram
2016-01-01
Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks, or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND outperformed RAWL by incorporating Network Division with SRW. Both RAND and RAWL use SRW for the random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and higher energy expenditure, with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely the Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness-node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs such as the military and medical domains.
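The difference between a simple random walk and a single-stage-memory (non-backtracking) walk can be illustrated on a toy grid network; the grid topology and step counts are assumptions for illustration, not the WSN protocol itself:

```python
import random

def neighbors(node, size):
    """4-neighborhood on a size x size grid (a stand-in for the sensor network)."""
    x, y = node
    cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(a, b) for a, b in cand if 0 <= a < size and 0 <= b < size]

def simple_random_walk(start, steps, size, rng):
    """SRW: each step picks any neighbor, so it often steps straight back."""
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(neighbors(path[-1], size)))
    return path

def memory_walk(start, steps, size, rng):
    """Single-stage memory: never step straight back to the previous node."""
    path = [start]
    for _ in range(steps):
        options = neighbors(path[-1], size)
        if len(path) > 1:
            # Remove the immediately preceding node; fall back if nothing is left.
            options = [n for n in options if n != path[-2]] or options
        path.append(rng.choice(options))
    return path

rng = random.Random(7)
srw = simple_random_walk((5, 5), 200, 11, rng)
mem = memory_walk((5, 5), 200, 11, rng)
# The memory walk tends to cover more distinct nodes per message sent,
# which is why witness nodes spread farther at the same communication cost.
```

One step of memory is the cheapest constraint that removes immediate backtracking; longer memories would reduce revisits further at a higher per-node storage cost.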
Dobbertin, Matthias; Hug, Christian; Mizoue, Nobuya
2004-11-01
In this study we used photographs of tree crowns to test whether the assessment methods for tree defoliation in Switzerland have changed over time. We randomly selected 24 series of slides of Norway spruce with field assessments made between 1986 and 1995. The slides were randomly arranged and assessed by three experts without prior knowledge of the year when the slide was taken or the tree number. Defoliation was assessed using the Swiss reference photo guide. Although the correlations between the field assessments and slide assessments were high (Spearman's rank correlation coefficient ranged between 0.79 and 0.83), we found significant differences between field and slide assessments (4.3 to 9% underprediction by the slide assessors) and between the slide assessments. However, no significant trends in field assessment methods could be detected. When the mean differences between field and slide assessments were subtracted, in some years, field assessors consistently underpredicted (1990, 1992) or overpredicted defoliation (1987, 1991). Defoliation tended to be overpredicted in slides taken against the light, and underpredicted for trees with more than 25% crown overlap. We conclude that slide series can be used to detect changes in assessment methods. However, potential observer bias calls for more objective methods of assessment.
NASA Astrophysics Data System (ADS)
Flores-Tavizón, Edith; Mokgalaka-Matlala, Ntebogeng S.; Elizalde Galindo, José T.; Castillo-Michelle, Hiram; Peralta-Videa, Jose R.; Gardea-Torresdey, Jorge L.
2012-04-01
Magnetic field is closely related to the cell metabolism of plants [N. A. Belyavskaya, Adv. Space Res. 34, 1566 (2004)]. To examine the effect of a magnetic field on the plant growth, arsenic uptake, and total amylolytic activity of mesquite (Prosopis juliflora x P. velutina) seeds, ten sets of 80 seeds were selected and oriented either with the long axis parallel to an external magnetic field or randomly. The external magnetic field magnitude was 1 T, and the exposure time was t = 30 min. The seeds were then stored for three days in a plastic bag and sown on paper towels in a modified Hoagland's nutrient solution. After three days of germination in the dark and three days in light, seedlings were grown hydroponically in modified Hoagland's nutrient solution (high PO42-) containing 0, 10, or 20 ppm of arsenic as As (III) and (V). The results show that the germination ratios, growth, elongation, arsenic uptake, and total amylolytic activity of the long-axis-oriented mesquite seeds were much higher than those of the randomly oriented seeds. Both sets of exposed seeds also showed higher values than seeds that were not exposed to the external magnetic field.
Metabolite and transcript markers for the prediction of potato drought tolerance.
Sprenger, Heike; Erban, Alexander; Seddig, Sylvia; Rudack, Katharina; Thalhammer, Anja; Le, Mai Q; Walther, Dirk; Zuther, Ellen; Köhl, Karin I; Kopka, Joachim; Hincha, Dirk K
2018-04-01
Potato (Solanum tuberosum L.) is one of the most important food crops worldwide. Current potato varieties are highly susceptible to drought stress. In view of global climate change, selection of cultivars with improved drought tolerance and high yield potential is of paramount importance. Drought tolerance breeding of potato is currently based on direct selection according to yield and phenotypic traits and requires multiple trials under drought conditions. Marker-assisted selection (MAS) is cheaper, faster and reduces classification errors caused by noncontrolled environmental effects. We analysed 31 potato cultivars grown under optimal and reduced water supply in six independent field trials. Drought tolerance was determined as tuber starch yield. Leaf samples from young plants were screened for preselected transcript and nontargeted metabolite abundance using qRT-PCR and GC-MS profiling, respectively. Transcript marker candidates were selected from a published RNA-Seq data set. A Random Forest machine learning approach extracted metabolite and transcript markers for drought tolerance prediction with low error rates of 6% and 9%, respectively. Moreover, by combining transcript and metabolite markers, the prediction error was reduced to 4.3%. Feature selection from Random Forest models allowed model minimization, yielding a minimal combination of only 20 metabolite and transcript markers that were successfully tested for their reproducibility in 16 independent agronomic field trials. We demonstrate that a minimum combination of transcript and metabolite markers sampled at early cultivation stages predicts potato yield stability under drought largely independent of seasonal and regional agronomic conditions. © 2017 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2010 CFR
2010-10-01
47 Telecommunication, Part 1 (2010-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1602 Designation for random selection...
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2011 CFR
2011-10-01
47 Telecommunication, Part 1 (2011-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1602 Designation for random selection...
Applying a weighted random forests method to extract karst sinkholes from LiDAR data
NASA Astrophysics Data System (ADS)
Zhu, Junfeng; Pierskalla, William P.
2016-02-01
Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the locating and delineating of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% on the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
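The core of the weighted random forests idea, upweighting the rare class when drawing each tree's bootstrap sample, can be sketched as follows. The toy class proportions are assumptions and this is not the authors' implementation:

```python
import random

def weighted_bootstrap(records, label_key, rng):
    """Bootstrap sample with selection probability inversely proportional to
    class frequency, so the rare class (sinkholes) is upweighted."""
    counts = {}
    for r in records:
        counts[r[label_key]] = counts.get(r[label_key], 0) + 1
    weights = [1.0 / counts[r[label_key]] for r in records]
    # Each class ends up with equal total weight, so the expected bootstrap
    # composition is roughly balanced between classes.
    return rng.choices(records, weights=weights, k=len(records))

# Toy depression dataset: 5% sinkholes, 95% other depressions.
rng = random.Random(3)
data = [{"label": "sinkhole"} for _ in range(50)] + \
       [{"label": "other"} for _ in range(950)]
boot = weighted_bootstrap(data, "label", rng)
frac = sum(r["label"] == "sinkhole" for r in boot) / len(boot)
# frac is close to 0.5 rather than the raw 0.05, so each tree grown on such a
# bootstrap sees enough sinkhole examples to learn their geometry.
```

A weighted random forest then grows each tree on one such bootstrap; an equivalent alternative is to weight the impurity computation at each split instead of resampling.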
47 CFR 1.1603 - Conduct of random selection.
Code of Federal Regulations, 2010 CFR
2010-10-01
47 Telecommunication, Part 1 (2010-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1603 Conduct of random selection. The...
47 CFR 1.1603 - Conduct of random selection.
Code of Federal Regulations, 2011 CFR
2011-10-01
47 Telecommunication, Part 1 (2011-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1603 Conduct of random selection. The...
Cross-talk free selective reconstruction of individual objects from multiplexed optical field data
NASA Astrophysics Data System (ADS)
Zea, Alejandro Velez; Barrera, John Fredy; Torroba, Roberto
2018-01-01
In this paper we present a data multiplexing method for the simultaneous storage, in a single package, of several optical fields of three-dimensional (3D) objects, and for their individual cross-talk-free retrieval. Optical field data are extracted from off-axis Fourier holograms and then sampled by multiplying them with random binary masks. The resulting sampled optical fields can be used to reconstruct the original objects. Sampling causes a loss of quality that can be controlled by the number of white pixels in the binary masks and by applying a padding procedure to the optical field data. This process can be performed using a different binary mask for each optical field, and the results added to form a multiplexed package. With an adequate choice of sampling and padding, we can achieve a volume reduction in the multiplexed package relative to the addition of all individual optical fields. Moreover, the package can be multiplied by a binary mask to select a specific optical field, and after the reconstruction procedure the corresponding 3D object is recovered without any cross-talk. We demonstrate the effectiveness of our proposal for data compression with a comparison against discrete cosine transform filtering. Experimental results confirm the validity of our proposal.
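A toy illustration of mask-based multiplexing and cross-talk-free retrieval, using real-valued 1D "fields" and disjoint binary masks. Disjointness (each pixel assigned to exactly one field) is a simplifying assumption made here so that retrieval is exact; the paper works with 2D complex optical field data:

```python
import random

rng = random.Random(0)
N = 16  # samples per optical field

# Two toy "optical fields" (real-valued stand-ins for complex field data).
field_a = [rng.uniform(-1, 1) for _ in range(N)]
field_b = [rng.uniform(-1, 1) for _ in range(N)]

# Random but disjoint binary masks: each pixel belongs to exactly one field.
assign = [rng.choice("ab") for _ in range(N)]
mask_a = [1 if c == "a" else 0 for c in assign]
mask_b = [1 - m for m in mask_a]

# Sample each field with its mask, then add into a single multiplexed package.
package = [ma * fa + mb * fb
           for ma, fa, mb, fb in zip(mask_a, field_a, mask_b, field_b)]

# Selective reconstruction: multiplying the package by one mask recovers that
# field's samples with zero contribution from the other field (no cross-talk).
recovered_a = [m * p for m, p in zip(mask_a, package)]
```

The quality/size trade-off in the paper corresponds to how many pixels each mask keeps: fewer white pixels per mask means a smaller package but a sparser sampled field to reconstruct from.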
Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M
2017-04-01
Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives in more detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.
Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on an iterative perturbation that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store, and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements and having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
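The iterative perturbation idea, reusing the factorized unperturbed stiffness K0 as a preconditioner, can be sketched on a 2x2 toy system: solve (K0 + dK) u = f by iterating u <- K0^{-1}(f - dK u). The matrices below are illustrative assumptions, and a direct 2x2 solve stands in for the factorized K0:

```python
def solve2(K, f):
    """Direct 2x2 solve via Cramer's rule (stands in for the factorized K0)."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(f[0] * K[1][1] - f[1] * K[0][1]) / det,
            (K[0][0] * f[1] - K[1][0] * f[0]) / det]

def perturbation_solve(K0, dK, f, iters=50):
    """Fixed-point iteration u <- K0^{-1} (f - dK u).

    Converges when K0^{-1} dK is a contraction, i.e. the perturbation is
    small relative to the unperturbed stiffness. No derivatives of the
    element matrices are ever formed, only repeated back-substitutions.
    """
    u = solve2(K0, f)  # initial guess: the unperturbed solution
    for _ in range(iters):
        rhs = [f[i] - sum(dK[i][j] * u[j] for j in range(2)) for i in range(2)]
        u = solve2(K0, rhs)
    return u

K0 = [[4.0, 1.0], [1.0, 3.0]]   # unperturbed stiffness (factorized once)
dK = [[0.2, 0.0], [0.0, -0.1]]  # one realization of the random perturbation
f = [1.0, 2.0]

u = perturbation_solve(K0, dK, f)
u_direct = solve2([[K0[i][j] + dK[i][j] for j in range(2)] for i in range(2)], f)
# u agrees with the direct solve of the perturbed system to machine precision.
```

In a probabilistic analysis this iteration is repeated for many sampled dK realizations, amortizing the single factorization of K0 across all of them.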
Rahbar, Mohammad H.; Wyatt, Gwen; Sikorskii, Alla; Victorson, David; Ardjomand-Hessabi, Manouchehr
2011-01-01
Background: Multisite randomized clinical trials allow for increased research collaboration among investigators and expedite data collection efforts. As a result, government funding agencies typically look favorably upon this approach. As the field of complementary and alternative medicine (CAM) continues to evolve, so do calls for the use of more rigorous study designs and trial methodologies, which can present challenges for investigators. Purpose: To describe the processes involved in the coordination and management of a multisite randomized clinical trial of a CAM intervention. Methods: Key aspects related to the coordination and management of a multisite CAM randomized clinical trial are presented, including organizational and site selection considerations, recruitment concerns, and issues related to data collection and randomization to treatment groups. Management and monitoring of data, as well as quality assurance procedures, are described. Finally, a real-world perspective is shared from a recently conducted multisite randomized clinical trial of reflexology for women diagnosed with advanced breast cancer. Results: The use of multiple sites in the conduct of CAM-based randomized clinical trials can provide an efficient, collaborative, and robust approach to study coordination and data collection that maximizes efficiency and ensures the quality of results. Conclusions: Multisite randomized clinical trial designs can offer the field of CAM research a more standardized and efficient approach to examining the effectiveness of novel therapies and treatments. Special attention must be given to intervention fidelity, consistent data collection, and ensuring data quality. Assessment and reporting of quantitative indicators of data quality should be required. PMID:21664296
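Randomization to treatment groups in a multisite trial is often implemented with permuted blocks per site, so that arms stay balanced within each site as enrollment proceeds. A minimal sketch; the block size, arm names, and seeding scheme are assumptions, not the trial's actual procedure:

```python
import random

def block_randomize(n_patients, site, block_size=4, seed=2024):
    """Permuted-block randomization: every consecutive block of assignments
    contains an equal number of 'intervention' and 'control' slots, so the
    arms can never differ by more than half a block at any point."""
    rng = random.Random(f"{seed}-{site}")  # independent sequence per site
    assignments = []
    while len(assignments) < n_patients:
        block = (["intervention"] * (block_size // 2) +
                 ["control"] * (block_size // 2))
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_patients]

# Each site generates its own sequence; the coordinating center only needs to
# distribute the seed and block size to reproduce any site's list.
seq = block_randomize(20, site="site_A")
```

In practice the sequence would be generated centrally and concealed from recruiters; block size is sometimes varied at random to make the pattern harder to guess.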
NASA Astrophysics Data System (ADS)
Elbakary, M. I.; Alam, M. S.; Aslan, M. S.
2008-03-01
In a FLIR image sequence, a target may disappear permanently or may reappear after some frames, and crucial information such as the direction, position, and size of the target is lost. If the target reappears in a later frame, it may not be tracked again because the 3D orientation, size, and location of the target might have changed. To obtain information about the target before it disappears and to detect the target after it reappears, the distance classifier correlation filter (DCCF) is trained manually by selecting a number of chips at random. This paper introduces a novel idea that eliminates the manual intervention in the training phase of the DCCF. Instead of selecting the training chips manually and choosing their number at random, we adopt the K-means algorithm to cluster the training frames and, based on the number of clusters, select one training chip per cluster. To detect and track the target after it reappears in the field of view, TBF and DCCF are employed. The conducted experiments using real FLIR sequences show results similar to the traditional algorithm, but eliminating the manual intervention is the advantage of the proposed algorithm.
Predicting the accuracy of ligand overlay methods with Random Forest models.
Nandigam, Ravi K; Evans, David A; Erickson, Jon A; Kim, Sangtae; Sutherland, Jeffrey J
2008-12-01
The accuracy of binding mode prediction using standard molecular overlay methods (ROCS, FlexS, Phase, and FieldCompare) is studied. Previous work has shown that simple decision tree modeling can be used to improve accuracy by selecting the best overlay template. This concept is extended to the use of Random Forest (RF) modeling for template and algorithm selection. An extensive data set of 815 ligand-bound X-ray structures representing 5 gene families was used to generate ca. 70,000 overlays with the four programs. RF models, trained using standard measures of ligand and protein similarity and Lipinski-related descriptors, are used to automatically select the reference ligand and overlay method that maximize the probability of reproducing the overlay deduced from X-ray structures (i.e., using rmsd ≤ 2 Å as the criterion for success). RF model scores are highly predictive of overlay accuracy, and their use in template and method selection produces correct overlays in 57% of cases for 349 overlay ligands not used for training the RF models. The inclusion in the models of protein sequence similarity enables the use of templates bound to related protein structures, yielding useful results even for proteins having no available X-ray structures.
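The selection logic can be sketched in a few lines. This is a hedged illustration of how the RF scores are consumed, not the paper's code: `rf_score` stands for the model's predicted probability of a correct overlay (however it was fitted), and the candidate triples are hypothetical.

```python
RMSD_SUCCESS = 2.0  # Angstroms: an overlay counts as correct if rmsd <= 2

def label_success(rmsd_angstrom):
    # binary training label used when fitting the RF models
    return rmsd_angstrom <= RMSD_SUCCESS

def pick_overlay(candidates):
    # candidates: (template_id, method_name, rf_score) triples, where
    # rf_score is the model's predicted probability of a correct overlay;
    # pick the template/method pair the model scores highest
    template, method, _ = max(candidates, key=lambda c: c[2])
    return template, method
```

Given scored candidates such as `[("t1", "ROCS", 0.31), ("t2", "FlexS", 0.82), ("t3", "Phase", 0.55)]`, `pick_overlay` returns `("t2", "FlexS")`.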
JoAnn M. Hanowski; Gerald J. Niemi
1995-01-01
We established bird monitoring programs in two regions of Minnesota: the Chippewa National Forest and the Superior National Forest. The experimental design defined forest cover types as strata in which samples of forest stands were randomly selected. Subsamples (3 point counts) were placed in each stand to maximize field effort and to assess within-stand and between-...
Carbaryl applied at reduced dosage rates for control of western spruce budworm
George P. Markin; David R. Johnson
1983-01-01
Carbaryl is registered for control of the western spruce budworm (Choristoneura occidentalis Freeman), at the dosage rate of 1.12 kg per hectare. That rate and two lower ones were field tested in western Montana in July 1979 to determine if a lower rate would be as effective as the registered dosage. Each dosage was applied to five randomly selected...
1982-03-01
Most of the KEPONE was exported for use in the Caribbean and Central American banana fields, and in other countries for the control of potato beetles...quadrat: species diversity, stem density (by species), height (mean height of 10 randomly selected stems), flowering phenology (number of flowering stems
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Div. of Human Resources.
Questionnaires gathered opinions of all Occupational Safety and Health Administration (OSHA) field supervisors and a randomly selected sample of one-third of the compliance officers about OSHA's approach to improving workplace safety and health. Major topics addressed were enforcement, safety and health standards, education and training, employer…
Xiaoqian Sun; Zhuoqiong He; John Kabrick
2008-01-01
This paper presents a Bayesian spatial method for analysing the site index data from the Missouri Ozark Forest Ecosystem Project (MOFEP). Based on ecological background and availability, we select three variables, the aspect class, the soil depth and the land type association as covariates for analysis. To allow great flexibility of the smoothness of the random field,...
Afshari, Daryoush; Moradian, Nasrin; Khalili, Majid; Razazian, Nazanin; Bostani, Arash; Hoseini, Jamal; Moradian, Mohamad; Ghiasian, Masoud
2016-10-01
Evidence is mounting that magnet therapy could alleviate the symptoms of multiple sclerosis (MS). This study was performed to test the effects of pulsed magnetic fields on paresthesia in MS patients. It was conducted as a randomized, double-blind, parallel-group clinical trial from April 2012 to October 2013. The subjects were selected among patients referred to the MS clinic of Imam Reza Hospital, affiliated with Kermanshah University of Medical Sciences, Iran. Sixty-three patients with MS were included in the study and randomly divided into two groups: 35 patients were exposed to a pulsed magnetic field of 4 mT intensity and 15 Hz frequency (sinusoidal wave) for 20 min per session, 2 times per week, over a period of 2 months (16 sessions), and 28 patients were exposed to a magnetically inactive field (placebo) on the same schedule. The severity of paresthesia was measured by the numerical rating scale (NRS) at 30 and 60 days. The primary end point was the NRS change between baseline and 60 days; the secondary outcome was the NRS change between baseline and 30 days. Patients exposed to the magnetic field showed significant paresthesia improvement compared with patients exposed to placebo. According to our results, pulsed magnetic therapy could alleviate paresthesia in MS patients, but trials with more patients and longer duration are needed to describe long-term effects. Copyright © 2016 Elsevier B.V. All rights reserved.
Comparing two-zone models of dust exposure.
Jones, Rachael M; Simmons, Catherine E; Boelter, Fred W
2011-09-01
The selection and application of mathematical models to work tasks is challenging. Previously, we developed and evaluated a semi-empirical two-zone model that predicts time-weighted average (TWA) concentrations (Ctwa) of dust emitted during the sanding of drywall joint compound. Here, we fit the emission rate and random air speed variables of a mechanistic two-zone model to testing event data and apply and evaluate the model using data from two field studies. We found that the fitted random air speed values and emission rate were sensitive to (i) the size of the near-field and (ii) the objective function used for fitting, but this did not substantially impact predicted dust Ctwa. The mechanistic model predictions were lower than the semi-empirical model predictions and measured respirable dust Ctwa at Site A but were within an acceptable range. At Site B, a 10.5 m3 room, the mechanistic model did not capture the observed difference between PBZ and area Ctwa. The model predicted uniform mixing and predicted dust Ctwa up to an order of magnitude greater than was measured. We suggest that applications of the mechanistic model be limited to contexts where the near-field volume is very small relative to the far-field volume.
Optically-Induced Cell Fusion on Cell Pairing Microstructures
NASA Astrophysics Data System (ADS)
Yang, Po-Fu; Wang, Chih-Hung; Lee, Gwo-Bin
2016-02-01
Cell fusion is a critical operation for numerous biomedical applications including cell reprogramming, hybridoma formation, cancer immunotherapy, and tissue regeneration. However, unstable cell contact and random cell pairings have limited the efficiency and yields of traditional methods. Furthermore, it is challenging to selectively perform cell fusion within a group of cells. This study reports a new approach called optically-induced cell fusion (OICF), which integrates cell-pairing microstructures with an optically-induced, localized electrical field. By projecting light patterns onto a photoconductive film (hydrogen-rich, amorphous silicon) coated on an indium-tin-oxide (ITO) glass while an alternating-current electrical field was applied between two such ITO glass slides, “virtual” electrodes could be generated that selectively fuse paired cells. At 10 kHz, a 57% cell pairing rate and an 87% fusion efficiency were achieved at a driving voltage of 20 Vpp, suggesting that this new technology could be promising for selective cell fusion within a group of cells.
Neutrality and evolvability of designed protein sequences
NASA Astrophysics Data System (ADS)
Bhattacherjee, Arnab; Biswas, Parbati
2010-07-01
The effect of foldability on a protein’s evolvability is analyzed by a two-prong approach consisting of a self-consistent mean-field theory and Monte Carlo simulations. Theory and simulation models representing protein sequences with binary patterning of amino acid residues compatible with a particular foldability criterion are used. This generalized foldability criterion is derived using the high-temperature cumulant expansion approximating the free energy of folding. The effect of cumulative point mutations on these designed proteins is studied under neutral conditions. Robustness, a protein’s ability to tolerate random point mutations, is determined under a selective pressure of stability (ΔΔG) for the theory-designed sequences, which are found to be more robust than Monte Carlo and mean-field-biased Monte Carlo generated sequences. The results show that this foldability criterion selects viable protein sequences more effectively than the Monte Carlo method, which has a marked effect on how the selective pressure shapes the evolutionary sequence space. These observations may impact de novo sequence design and its applications in protein engineering.
Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foye, Kevin C.; Soong, Te-Yang
2012-07-01
The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum.
For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended to include a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or set of parameter ranges, can be selected such that it results in an acceptable long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify any problematic facilities so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice. (authors)
Clustering of galaxies near damped Lyman-alpha systems with (z) = 2.6
NASA Technical Reports Server (NTRS)
Wolfe, A. M
1993-01-01
The galaxy two-point correlation function, xi, at (z) = 2.6 is determined by comparing the number of Ly-alpha-emitting galaxies in narrowband CCD fields selected for the presence of damped L-alpha absorption to their number in randomly selected control fields. Comparisons between the presented determination of (xi), a density-weighted volume average of xi, and model predictions for (xi) at large redshifts show that models in which the clustering pattern is fixed in proper coordinates are highly unlikely, while better agreement is obtained if the clustering pattern is fixed in comoving coordinates. Therefore, clustering of Ly-alpha-emitting galaxies around damped Ly-alpha systems at large redshifts is strong. It is concluded that the faint blue galaxies are drawn from a parent population different from normal galaxies, the presumed offspring of damped Ly-alpha systems.
NASA Astrophysics Data System (ADS)
Kim, Sang-Koog; Lee, Ki-Suk; Yu, Young-Sang; Choi, Youn-Seok
2008-01-01
The authors investigated the technological utility of counterclockwise (CCW) and clockwise (CW) circular-rotating fields (HCCW and HCW) and spin-polarized currents with an angular frequency ωH close to the vortex eigenfrequency ωD, for the reliable, low-power, and selective switching of the bistate magnetization (M) orientations of a vortex core (VC) in an array of soft magnetic nanoelements. CCW and CW circular gyrotropic motions in response to HCCW and HCW, respectively, show remarkably contrasting resonant behaviors (i.e., extremely large-amplitude resonance versus small-amplitude nonresonance), depending on the M orientation of a given VC. Owing to these asymmetric resonance characteristics, the HCCW (HCW) with ωH˜ωD can be used to selectively switch only the up (down) core to its downward (upward) M orientation, with a sufficiently low field (~10 Oe) and current density (~10^7 A/cm^2). This work provides a reliable, low-power, effective means of information storage, recording, and readout in vortex-based random access memory, simply called VRAM.
A software system for the simulation of chest lesions
NASA Astrophysics Data System (ADS)
Ryan, John T.; McEntee, Mark; Barrett, Saoirse; Evanoff, Michael; Manning, David; Brennan, Patrick
2007-03-01
We report on the development of a novel software tool for the simulation of chest lesions. This software tool was developed for use in our study to attain optimal ambient lighting conditions for chest radiology. This study involved 61 consultant radiologists from the American Board of Radiology. Because of its success, we intend to use the same tool for future studies. The software has two main functions: the simulation of lesions and retrieval of information for ROC (Receiver Operating Characteristic) and JAFROC (Jack-Knife Free Response ROC) analysis. The simulation layer operates by randomly selecting an image from a bank of reportedly normal chest x-rays. A random location is then generated for each lesion, which is checked against a reference lung-map. If the location is within the lung fields, as derived from the lung-map, a lesion is superimposed. Lesions are also randomly selected from a bank of manually created chest lesion images. A blending algorithm determines which are the best intensity levels for the lesion to sit naturally within the chest x-ray. The same software was used to run a study for all 61 radiologists. A sequence of images is displayed in random order. Half of these images had simulated lesions, ranging from subtle to obvious, and half of the images were normal. The operator then selects locations where he/she thinks lesions exist and grades the lesion accordingly. We have found that this software was very effective in this study and intend to use the same principles for future studies.
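The lesion-placement loop described above (random location, checked against a reference lung map) can be sketched as a simple rejection sampler. This is a minimal stdlib illustration under stated assumptions, not the authors' tool: `lung_map` is a hypothetical boolean mask standing in for the reference lung map, and blending is omitted.

```python
import random

def place_lesions(lung_map, n_lesions, rng):
    # lung_map: 2-D list of booleans, True = inside the lung fields.
    # Rejection-sample random locations, keeping only in-lung ones,
    # mirroring the "generate location, check against lung map" step.
    h, w = len(lung_map), len(lung_map[0])
    placed = []
    while len(placed) < n_lesions:
        x, y = rng.randrange(w), rng.randrange(h)
        if lung_map[y][x]:
            placed.append((x, y))
    return placed

# toy 4x4 map whose central 2x2 block counts as "lung"
lung = [[False] * 4 for _ in range(4)]
for y in (1, 2):
    for x in (1, 2):
        lung[y][x] = True
spots = place_lesions(lung, 3, random.Random(7))
```

Every returned coordinate is guaranteed to lie inside the lung fields; in the real tool the accepted location is where a lesion image is then blended into the radiograph.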
Kirsten Gallo; Steven H. Lanigan; Peter Eldred; Sean N. Gordon; Chris Moyer
2005-01-01
We aggregated road, vegetation, and inchannel data to assess the condition of sixth-field watersheds and describe the distribution of the condition of watersheds in the Northwest Forest Plan (the Plan) area. The assessment is based on 250 watersheds selected at random within the Plan area. The distributions of conditions are presented for watersheds and for many of the...
IMBLMS phase B4, additional tasks 5.0. Microbial identification system
NASA Technical Reports Server (NTRS)
1971-01-01
A laboratory study was undertaken to provide simplified procedures leading to the presumptive identification (I/D) of defined microorganisms on board an orbiting spacecraft. Identifications were to be initiated by nonprofessional bacteriologists (crew members) on a contingency basis only. Key objectives/constraints for this investigation were as follows: (1) I/D procedures based on limited, defined diagnostic tests; (2) testing oriented about ten selected microorganisms; (3) provision of a definitive I/D key and procedures per selected organism; (4) definition of possible occurrences of false positives for the resulting I/D key by search of the appropriate literature; and (5) evaluation of the I/D key and procedure through a limited field trial on randomly selected subjects using the I/D key.
An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang
2016-06-29
The aim was to optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. A 50 m × 50 m quadrat in Chayegang marshland near Henghu farm in the Poyang Lake region was selected as the experimental field, and a whole-coverage method was adopted to survey the snails. The simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes for the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.221 7, 0.302 4 and 0.047 8, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach of lower cost and higher precision for the snail survey.
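The three sampling designs compared above can be sketched as follows. This is a generic stdlib illustration of the designs themselves, not the survey's code; `units` stands for the enumerated survey cells and the stratum labels (e.g. altitude bands) are hypothetical.

```python
import random

def simple_random(units, n, rng):
    # draw n units uniformly without replacement
    return rng.sample(units, n)

def systematic(units, n, rng):
    # every k-th unit from a random start within the first interval
    k = len(units) // n
    start = rng.randrange(k)
    return units[start::k][:n]

def stratified(units, strata, n_per_stratum, rng):
    # strata: stratum label -> list of indices into units (e.g. by altitude);
    # draw a simple random sample independently within each stratum
    sample = []
    for label in sorted(strata):
        sample += [units[i] for i in rng.sample(strata[label], n_per_stratum)]
    return sample
```

Stratifying by a variable correlated with snail density (here, altitude) is what lets the stratified design reach the stated precision with a smaller minimum sample size (225 vs. 300).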
NASA Astrophysics Data System (ADS)
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-induced structural response have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
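For orientation, the baseline OSR formula that the paper's random-function constraint compresses can be sketched as below. This is the classic spectral representation with independent random phases, shown under stated assumptions as a stdlib sketch; the paper's two-variable reduction itself is not reproduced here.

```python
import math, random

def osr_sample(S, omega_max, N, t_grid, rng):
    # One realization of a zero-mean stationary process from its one-sided
    # power spectral density S(w), via the original spectral representation:
    #   X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k)
    # with N independent uniform phases phi_k -- the high-dimensional
    # randomness that the random-function constraint reduces to two
    # elementary random variables.
    dw = omega_max / N
    terms = []
    for k in range(N):
        wk = (k + 0.5) * dw
        terms.append((math.sqrt(2.0 * S(wk) * dw), wk,
                      rng.uniform(0.0, 2.0 * math.pi)))
    return [sum(a * math.cos(w * t + p) for a, w, p in terms) for t in t_grid]
```

Each call draws N phases; the paper's point is that expressing those phases as deterministic functions of two elementary random variables lets a few hundred representative samples carry the full probability information for PDEM.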
NASA Astrophysics Data System (ADS)
Bakhtiar, Nurizatul Syarfinas Ahmad; Abdullah, Farah Aini; Hasan, Yahya Abu
2017-08-01
In this paper, we consider the effect of a random field on the pulsating and snaking solitons in a dissipative system described by the one-dimensional cubic-quintic complex Ginzburg-Landau equation (cqCGLE). The dynamical behaviour was simulated by adding a random field to the initial pulse. We then solve the equation numerically, fixing the initial amplitude profile for the pulsating and snaking solitons without losing any generality. To create the random field, we choose 0 ≤ ɛ ≤ 1.0. As a result, multiple soliton trains are formed when the random field is applied to a pulse-like initial profile for the parameters of the pulsating and snaking solitons. The results also show the effects of varying the random field on the transient energy peaks of pulsating and snaking solitons.
Das, Banibrata
2016-07-03
The brick manufacturing process releases large amounts of silica dust into the work environment due to the use of silica-containing materials. The main aim of the study was to investigate the impairment of lung function and the prevalence of respiratory symptoms among different groups of brick field workers in comparison with control subjects. A total of 250 brick field workers and 130 unexposed control subjects were randomly selected, and their demographic characteristics, respiratory symptoms, and lung function values were recorded. The results showed significantly lower lung function values and a higher prevalence of respiratory symptoms (p < .001) among brick field workers compared with the control group. The prevalence of respiratory symptoms was dyspnea (46.8%), phlegm (39.2%), and chest tightness (27.6%). Dust exposure in the working environment reduced lung function values and increased respiratory symptoms among the brick field workers.
ERIC Educational Resources Information Center
Köroglu, Zehra; Tüm, Gülden
2017-01-01
This study has been conducted to evaluate the TM usage in the MA theses written by the native speakers (NSs) of English and the Turkish speakers (TSs) of English. The purpose is to compare the TM usage in the introduction, results and discussion, and conclusion sections by both groups' randomly selected MA theses in the field of ELT between the…
2013-10-01
thrombin inhibition, leading to coagulopathy. Using intravital microscopy, we have obtained direct in vivo data showing glycocalyx thickness reduction...collected in 3.2% citrate for coagulation assays (ROTEM, TEM Innovations GmbH, Munich, Germany). Intravital Microscopy The system described in detail...microscopic fields containing venules were randomly selected. The first dye (TR-Dx70) was injected 5 min before baseline. Image sequences of
Tucker, Jalie A.; Reed, Geoffrey M.
2008-01-01
This paper examines the utility of evidentiary pluralism, a research strategy that selects methods in service of content questions, in the context of rehabilitation psychology. Hierarchical views that favor randomized controlled clinical trials (RCTs) over other evidence are discussed, and RCTs are considered as they intersect with issues in the field. RCTs are vital for establishing treatment efficacy, but whether they are uniformly the best evidence to inform practice is critically evaluated. We argue that because treatment is only one of several variables that influence functioning, disability, and participation over time, an expanded set of conceptual and data analytic approaches should be selected in an informed way to support an expanded research agenda that investigates therapeutic and extra-therapeutic influences on rehabilitation processes and outcomes. The benefits of evidentiary pluralism are considered, including helping close the gap between the narrower clinical rehabilitation model and a public health disability model. KEY WORDS: evidence-based practice, evidentiary pluralism, rehabilitation psychology, randomized controlled trials PMID:19649150
Columnar organization of orientation domains in V1
NASA Astrophysics Data System (ADS)
Liedtke, Joscha; Wolf, Fred
In the primary visual cortex (V1) of primates and carnivores, the functional architecture of basic stimulus selectivities appears similar across cortical layers (Hubel & Wiesel, 1962) justifying the use of two-dimensional cortical models and disregarding organization in the third dimension. Here we show theoretically that already small deviations from an exact columnar organization lead to non-trivial three-dimensional functional structures. We extend two-dimensional random field models (Schnabel et al., 2007) to a three-dimensional cortex by keeping a typical scale in each layer and introducing a correlation length in the third, columnar dimension. We examine in detail the three-dimensional functional architecture for different cortical geometries with different columnar correlation lengths. We find that (i) topological defect lines are generally curved and (ii) for large cortical curvatures closed loops and reconnecting topological defect lines appear. This theory extends the class of random field models by introducing a columnar dimension and provides a systematic statistical assessment of the three-dimensional functional architecture of V1 (see also (Tanaka et al., 2011)).
A multiscale Markov random field model in wavelet domain for image segmentation
NASA Astrophysics Data System (ADS)
Dai, Peng; Cheng, Yu; Wang, Shengchun; Du, Xinyu; Wu, Dan
2017-07-01
The human vision system has abilities for feature detection, learning and selective attention with some properties of hierarchy and bidirectional connection in the form of neural population. In this paper, a multiscale Markov random field model in the wavelet domain is proposed by mimicking some image processing functions of vision system. For an input scene, our model provides its sparse representations using wavelet transforms and extracts its topological organization using MRF. In addition, the hierarchy property of vision system is simulated using a pyramid framework in our model. There are two information flows in our model, i.e., a bottom-up procedure to extract input features and a top-down procedure to provide feedback controls. The two procedures are controlled simply by two pyramidal parameters, and some Gestalt laws are also integrated implicitly. Equipped with such biological inspired properties, our model can be used to accomplish different image segmentation tasks, such as edge detection and region segmentation.
Kouritzin, Michael A; Newton, Fraser; Wu, Biao
2013-04-01
Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
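The "re-simulate portions of the field from conditional probabilities" idea can be sketched with a generic Gibbs sweep over a binary random field. This is a hedged, minimal stdlib analogue (an Ising-style conditional with a coupling parameter `beta`), not the paper's CAPTCHA-specific conditionals built from marginals and covariances.

```python
import math, random

def gibbs_sweep(field, beta, rng):
    # One Gibbs sweep over a binary (+1/-1) field: each site is re-drawn
    # from its conditional distribution given its four neighbours
    # (periodic boundary), analogous to re-simulating portions of the
    # CAPTCHA field from conditional probabilities until it is readable.
    h, w = len(field), len(field[0])
    for y in range(h):
        for x in range(w):
            s = sum(field[(y + dy) % h][(x + dx) % w]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            field[y][x] = 1 if rng.random() < p_up else -1
    return field

# start from an initial random field (the "scattered pieces") and resample
rng = random.Random(42)
field = [[rng.choice((-1, 1)) for _ in range(8)] for _ in range(8)]
for _ in range(5):
    field = gibbs_sweep(field, beta=1.5, rng=rng)
```

Repeated sweeps smooth the initial noise toward configurations favored by the conditionals, which is the mechanism the CAPTCHA scheme exploits; the residual randomness of the start state is what resists attack.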
Evolution of basic equations for nearshore wave field
ISOBE, Masahiko
2013-01-01
In this paper, a systematic, overall view of theories for periodic waves of permanent form, such as Stokes and cnoidal waves, is described first with their validity ranges. To deal with random waves, a method for estimating directional spectra is given. Then, various wave equations are introduced according to the assumptions included in their derivations. The mild-slope equation is derived for combined refraction and diffraction of linear periodic waves. Various parabolic approximations and time-dependent forms are proposed to include randomness and nonlinearity of waves as well as to simplify numerical calculation. Boussinesq equations are the equations developed for calculating nonlinear wave transformations in shallow water. Nonlinear mild-slope equations are derived as a set of wave equations to predict transformation of nonlinear random waves in the nearshore region. Finally, wave equations are classified systematically for a clear theoretical understanding and appropriate selection for specific applications. PMID:23318680
Influence of the operatory field isolation technique on tooth-colored direct dental restorations.
Cajazeira, Marlus Roberto Rodrigues; De Sabóia, Ticiana Medeiros; Maia, Lucianne Cople
2014-06-01
To evaluate, through a systematic review, the influence of the operatory field isolation technique on the longevity of dental restorations performed with tooth-colored materials. An electronic search of the scientific databases (MEDLINE, SCIRUS, VHL and SIGLE) and reference lists of the selected articles was conducted to identify randomized controlled clinical trials with a follow-up period of at least 12 months. The selected articles evaluated the effects of the operatory field isolation techniques (rubber dam or cotton rolls/saliva ejector) on the longevity of direct restorations performed with tooth-colored materials (e.g. resin composites, compomers and glass-ionomer cements) in primary or permanent posterior teeth. The selected studies were analyzed and categorized using a checklist proposed by the National Institute for Health and Clinical Excellence of the United Kingdom. 484 studies were identified on the scientific databases. After applying the exclusion criteria and removal of duplicates, a total of nine studies were considered as potentially eligible. From these, five studies were included in the final analysis by two evaluators. In four studies analyzed, the use of rubber dam did not influence the longevity of restorations in comparison to cotton rolls/saliva ejector. Only two studies were considered as low risk of bias.
Evaluation of a Teleform-based data collection system: a multi-center obesity research case study.
Jenkins, Todd M; Wilson Boyce, Tawny; Akers, Rachel; Andringa, Jennifer; Liu, Yanhong; Miller, Rosemary; Powers, Carolyn; Ralph Buncher, C
2014-06-01
Utilizing electronic data capture (EDC) systems in data collection and management allows automated validation programs to preemptively identify and correct data errors. For our multi-center, prospective study we chose to use TeleForm, a paper-based data capture software that uses recognition technology to create case report forms (CRFs) with similar functionality to EDC, including custom scripts to identify entry errors. We quantified the accuracy of the optimized system through a data audit of CRFs and the study database, examining selected critical variables for all subjects in the study, as well as an audit of all variables for 25 randomly selected subjects. Overall we found 6.7 errors per 10,000 fields, with similar estimates for critical (6.9/10,000) and non-critical (6.5/10,000) variables, values that fall below the acceptable quality threshold of 50 errors per 10,000 established by the Society for Clinical Data Management. However, error rates were found to vary widely by type of data field, with the highest rate observed for open text fields. Copyright © 2014 Elsevier Ltd. All rights reserved.
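The quality metric used above is simple enough to state in code. The counts below are hypothetical (a 6.7/10,000 rate corresponds, for example, to 67 errors in 100,000 audited fields); only the 50-per-10,000 threshold comes from the abstract.

```python
SCDM_THRESHOLD = 50.0  # acceptable errors per 10,000 fields (SCDM)

def errors_per_10k(n_errors, n_fields):
    # audit error rate normalized to a per-10,000-fields basis
    return 10000.0 * n_errors / n_fields
```

For instance, `errors_per_10k(67, 100000)` gives 6.7, well under the 50-per-10,000 acceptability threshold.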
Kernel-Correlated Lévy Field Driven Forward Rate and Application to Derivative Pricing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bo Lijun; Wang Yongjin; Yang Xuewei, E-mail: xwyangnk@yahoo.com.cn
2013-08-01
We propose a term structure of forward rates driven by a kernel-correlated Lévy random field under the HJM framework. The kernel-correlated Lévy random field is composed of a kernel-correlated Gaussian random field and a centered Poisson random measure. We give a criterion that precludes arbitrage under the risk-neutral pricing measure. As an application, an interest-rate derivative with a general payoff functional is priced under this measure.
Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design
ERIC Educational Resources Information Center
Wagler, Amy; Wagler, Ron
2014-01-01
Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
Hierarchical Bayesian spatial models for alcohol availability, drug "hot spots" and violent crime.
Zhu, Li; Gorman, Dennis M; Horel, Scott
2006-12-07
Ecologic studies have shown a relationship between alcohol outlet densities, illicit drug use and violence. The present study examined this relationship in the City of Houston, Texas, using a sample of 439 census tracts. Neighborhood sociostructural covariates, alcohol outlet density, drug crime density and violent crime data were collected for the year 2000, and analyzed using hierarchical Bayesian models. Model selection was accomplished by applying the Deviance Information Criterion. The counts of violent crime in each census tract were modelled as having a conditional Poisson distribution. Four neighbourhood explanatory variables were identified using principal component analysis. The best-fitting model was the one including both unstructured and spatial dependence random effects. The results showed that drug-law violation explained a greater amount of variance in violent crime rates than alcohol outlet densities. The relative risk for drug-law violation was 2.49 and that for alcohol outlet density was 1.16. Of the neighbourhood sociostructural covariates, males of age 15 to 24 showed an effect on violence, with a 16% decrease in relative risk for each one-standard-deviation increase. Both the unstructured heterogeneity random effect and spatial dependence needed to be included in the model. The analysis presented suggests that activity around illicit drug markets is more strongly associated with violent crime than is alcohol outlet density. Unique among the ecological studies in this field, the present study not only shows the direction and magnitude of impact of neighbourhood sociostructural covariates as well as alcohol and illicit drug activities in a neighbourhood, it also reveals the importance of applying hierarchical Bayesian models in this research field, as both spatial dependence and heterogeneity random effects need to be considered simultaneously.
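The generative structure of such a model can be sketched in a few lines. This is a toy simulation, not the paper's fitted model: the grid size, variances, and baseline rate are invented, the spatially structured effect is mimicked by neighbour smoothing rather than a proper CAR prior, and only the slope is tied to the reported relative risk of 1.16 per standard-deviation increase:

```python
import numpy as np

rng = np.random.default_rng(0)

# Model structure: y_i ~ Poisson(lambda_i),
#   log lambda_i = alpha + beta * x_i + u_i + s_i
# with u_i an unstructured (iid) random effect and s_i a spatially
# structured effect (here: iid noise smoothed over lattice neighbours).
n = 20                                   # 20 x 20 toy grid of "tracts"
x = rng.normal(size=(n, n))              # one standardized covariate
beta = np.log(1.16)                      # reported RR of 1.16 per SD increase
u = rng.normal(scale=0.2, size=(n, n))   # unstructured heterogeneity
raw = rng.normal(scale=0.5, size=(n, n))
s = (raw + np.roll(raw, 1, 0) + np.roll(raw, -1, 0)
         + np.roll(raw, 1, 1) + np.roll(raw, -1, 1)) / 5.0  # spatial smoothing

log_lam = np.log(5.0) + beta * x + u + s   # invented baseline ~5 events/tract
y = rng.poisson(np.exp(log_lam))           # simulated violent-crime counts
print(y.shape, float(np.exp(beta)))        # relative risk is exp(beta)
```

The point of the sketch is the decomposition of the log-rate into covariate, unstructured, and spatial terms; fitting it properly requires MCMC with a CAR prior on s.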
Is the Non-Dipole Magnetic Field Random?
NASA Technical Reports Server (NTRS)
Walker, Andrew D.; Backus, George E.
1996-01-01
Statistical modelling of the Earth's magnetic field B has a long history. In particular, the spherical harmonic coefficients of scalar fields derived from B can be treated as Gaussian random variables. In this paper, we give examples of highly organized fields whose spherical harmonic coefficients pass tests for independent Gaussian random variables. The fact that coefficients at some depth may be usefully summarized as independent samples from a normal distribution need not imply that there really is some physical, random process at that depth. In fact, the field can be extremely structured and still be regarded for some purposes as random. In this paper, we examined the radial magnetic field B(sub r) produced by the core, but the results apply to any scalar field on the core-mantle boundary (CMB) which determines B outside the CMB.
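For reference, the radial field discussed here has the standard geomagnetic spherical harmonic expansion (conventional notation with Gauss coefficients $g_l^m$, $h_l^m$, reference radius $a$, and associated Legendre functions $P_l^m$; this is textbook material, not a formula quoted from the paper):

```latex
B_r(r,\theta,\phi) = \sum_{l=1}^{\infty} \sum_{m=0}^{l} (l+1)\left(\frac{a}{r}\right)^{l+2}
\left(g_l^m \cos m\phi + h_l^m \sin m\phi\right) P_l^m(\cos\theta)
```

Downward continuation to the CMB multiplies each degree-$l$ term by $(a/r)^{l+2}$, which is why the statistical behavior of the coefficients "at some depth" is the natural object of study.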
Application of Methods of Numerical Analysis to Physical and Engineering Data.
1980-10-15
directed algorithm would seem to be called for. However, 1(0) is itself a random process, making its gradient too unreliable for such a sensitive algorithm...radiation energy on the detector. Active laser systems, on the other hand, have now created the possibility of extremely narrow pass-band systems...emitted by the earth and its atmosphere. The broad spectral range was selected so that the field of view of the detector could be narrowed to obtain
Efficient robust conditional random fields.
Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A
2015-10-01
Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features or suppressing noise from noisy original features. Moreover, conventional optimization methods often converge slowly when training CRFs, and degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) that simultaneously select relevant features and suppress noise. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, thereby enabling discovery of the relevant unary and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient and the historical gradients, and the Lipschitz constant is leveraged to set a proper step size. We show that OGM can handle RCRF training very efficiently, achieving the optimal convergence rate O(1/k^2) (where k is the number of iterations). This convergence rate is theoretically superior to the O(1/k) rate of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
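The abstract does not spell out OGM, but the ingredients it names (history-dependent momentum, a Lipschitz-constant step size, an l1 penalty, an O(1/k^2) rate) are shared by the well-known FISTA-style accelerated proximal gradient method. The sketch below applies that standard method to a generic l1-regularized least-squares objective, purely to illustrate the mechanism; it is not the authors' CRF objective or their exact algorithm:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1; zeroes out small (irrelevant) weights."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_l1(A, b, lam, iters=200):
    """FISTA-style accelerated proximal gradient for
    min_w 0.5*||A w - b||^2 + lam*||w||_1.
    The momentum step combines the current gradient with history, and the
    step size uses the Lipschitz constant, giving an O(1/k^2) rate."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    w = np.zeros(A.shape[1])
    z = w.copy()
    t = 1.0
    for _ in range(iters):
        w_new = soft_threshold(z - A.T @ (A @ z - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = w_new + (t - 1) / t_new * (w_new - w)   # momentum from history
        w, t = w_new, t_new
    return w

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.0, 0.5]            # only 3 of 10 features are relevant
b = A @ w_true
w_hat = accelerated_l1(A, b, lam=0.1)
print(w_hat.round(2))                    # l1 penalty drives irrelevant weights toward 0
```

The same l1 mechanism is what lets RCRFs discover relevant unary and pairwise features: weights on uninformative features are thresholded to (near) zero.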
Seeking mathematics success for college students: a randomized field trial of an adapted approach
NASA Astrophysics Data System (ADS)
Gula, Taras; Hoessler, Carolyn; Maciejewski, Wes
2015-11-01
Many students enter the Canadian college system with insufficient mathematical ability and leave the system with little improvement. Those students who enter with poor mathematics ability typically take a developmental mathematics course as their first and possibly only mathematics course. The educational experiences that comprise a developmental mathematics course vary widely and are, too often, ineffective at improving students' ability. This trend is concerning, since low mathematics ability is known to be related to lower rates of success in subsequent courses. To date, little attention has been paid to the selection of an instructional approach to consistently apply across developmental mathematics courses. Prior research suggests that an appropriate instructional method would involve explicit instruction and practising mathematical procedures linked to a mathematical concept. This study reports on a randomized field trial of a developmental mathematics approach at a college in Ontario, Canada. The new approach is an adaptation of the JUMP Math program, an explicit instruction method designed for primary and secondary school curricula, to the college learning environment. In this study, a subset of courses was assigned to JUMP Math and the remainder was taught in the same style as in previous years. We found consistent, modest improvement in the JUMP Math sections compared to the non-JUMP sections, after accounting for potential covariates. The findings from this randomized field trial, along with prior research on effective education for developmental mathematics students, suggest that JUMP Math is a promising way to improve college student outcomes.
A Review of Clinical Trials in Spinal Cord Injury including Biomarkers.
Badhiwala, Jetan H; Wilson, Jefferson R; Kwon, Brian K; Casha, Steve; Fehlings, Michael G
2018-06-11
Acute traumatic spinal cord injury (SCI) entered the arena of prospective randomized clinical trials almost 40 years ago, with the undertaking of the National Acute Spinal Cord Study (NASCIS) I trial. Since then, a number of clinical trials have been conducted in the field, spurred by the devastating physical, social, and economic consequences of acute SCI for patients, families, and society at large. Many of these have been controversial and attracted criticism. The current review provides a critical summary of select past and current clinical trials in SCI, focusing in particular on the findings of prospective randomized controlled trials (RCTs), the challenges and barriers encountered, and the valuable lessons learned that can be applied to future trials.
NASA Astrophysics Data System (ADS)
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
Surface plasmon enhanced cell microscopy with blocked random spatial activation
NASA Astrophysics Data System (ADS)
Son, Taehwang; Oh, Youngjin; Lee, Wonju; Yang, Heejin; Kim, Donghyun
2016-03-01
We present surface plasmon enhanced fluorescence microscopy with random spatial sampling using a patterned block of silver nanoislands. Rigorous coupled wave analysis was performed to confirm near-field localization on the nanoislands. Random nanoislands were fabricated in silver by temperature annealing. By analyzing the random near-field distribution, the average size of the localized fields was found to be on the order of 135 nm. Randomly localized near-fields were used to spatially sample F-actin of J774 cells (mouse macrophage cell line). An image deconvolution algorithm based on linear imaging theory was established for stochastic estimation of the fluorescent molecular distribution. The alignment between the near-field distribution and the raw image was performed using the patterned block. The achieved resolution depends on factors including the size of the localized fields and is estimated to be 100-150 nm.
Application of random effects to the study of resource selection by animals
Gillies, C.S.; Hebblewhite, M.; Nielsen, S.E.; Krawchuk, M.A.; Aldridge, Cameron L.; Frair, J.L.; Saher, D.J.; Stevens, C.E.; Jerde, C.L.
2006-01-01
1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence.2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability.3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed.4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects.5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection.6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. 
Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.
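Simulation situation (a) above, unbalanced sample sizes among individuals, can be illustrated with a toy numpy sketch (all sample sizes and coefficients are invented; the per-animal "slope" is a crude covariance-ratio stand-in for a fitted selection coefficient, not the paper's logistic mixed model):

```python
import numpy as np

rng = np.random.default_rng(7)

# One heavily sampled animal with weak selection plus four lightly
# sampled animals with strong selection (hypothetical values).
n_fixes = np.array([500, 50, 50, 50, 50])
slopes  = np.array([0.0, 1.2, 1.2, 1.2, 1.2])   # per-animal selection strength

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

xs, ys, per_animal = [], [], []
for n, beta in zip(n_fixes, slopes):
    x = rng.normal(size=n)                                  # resource covariate
    y = (rng.random(n) < sigmoid(beta * x)).astype(float)   # used vs. available
    xs.append(x)
    ys.append(y)
    # Crude per-animal slope estimate (covariance ratio), illustration only.
    per_animal.append(np.cov(x, y)[0, 1] / np.var(x))

x_all, y_all = np.concatenate(xs), np.concatenate(ys)
pooled = np.cov(x_all, y_all)[0, 1] / np.var(x_all)

# Pooling lets the over-sampled, weakly selecting animal dominate:
print(f"pooled {pooled:.2f} vs mean of individual estimates {np.mean(per_animal):.2f}")
```

The pooled estimate is dragged toward the over-represented individual, which is exactly the imbalance a random intercept (and, for varying selection, random coefficients) corrects.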
Murthy, Aditya; Ray, Supriya; Shorter, Stephanie M; Schall, Jeffrey D; Thompson, Kirk G
2009-05-01
The dynamics of visual selection and saccade preparation by the frontal eye field were investigated in macaque monkeys performing a search-step task combining the classic double-step saccade task with visual search. Reward was earned for producing a saccade to a color singleton. On random trials the target and one distractor swapped locations before the saccade, and monkeys were rewarded for shifting gaze to the new singleton location. A race model accounts for the probabilities and latencies of saccades to the initial and final singleton locations and provides a measure of the duration of a covert compensation process: target-step reaction time. When the target stepped out of a movement field, noncompensated saccades to the original location were produced when movement-related activity grew rapidly to a threshold. Compensated saccades to the final location were produced when the growth of the original movement-related activity was interrupted within the target-step reaction time and was replaced by activation of other neurons producing the compensated saccade. When the target stepped into a receptive field, visual neurons selected the new target location regardless of the monkeys' response. When the target stepped out of a receptive field, most visual neurons maintained the representation of the original target location, but a minority of visual neurons showed reduced activity. Chronometric analyses of the neural responses to the target step revealed that the modulation of visually responsive neurons and movement-related neurons occurred early enough to shift attention and saccade preparation from the old to the new target location. These findings indicate that visual activity in the frontal eye field signals the location of targets for orienting, whereas movement-related activity instantiates saccade preparation.
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-01-01
The objective of this study was to assess the risk of bias of randomized controlled trials (RCTs) published in prosthodontic and implant dentistry journals. The last 30 issues of 9 journals in the field of prosthodontic and implant dentistry (Clinical Implant Dentistry and Related Research, Clinical Oral Implants Research, Implant Dentistry, International Journal of Oral & Maxillofacial Implants, International Journal of Periodontics and Restorative Dentistry, International Journal of Prosthodontics, Journal of Dentistry, Journal of Oral Rehabilitation, and Journal of Prosthetic Dentistry) were hand-searched for RCTs. Risk of bias was assessed using the Cochrane Collaboration's risk of bias tool and analyzed descriptively. From the 3,667 articles screened, a total of 147 RCTs were identified and included. The number of published RCTs increased with time. The proportion of high-risk-of-bias assessments varied across the domains of the Cochrane risk of bias tool: 8% for random sequence generation, 18% for allocation concealment, 41% for masking, 47% for blinding of outcome assessment, 7% for incomplete outcome data, 12% for selective reporting, and 41% for other biases. The distribution of high risk of bias for RCTs published in the selected prosthodontic and implant dentistry journals varied among journals and ranged from 8% to 47%, which can be considered substantial.
Petersen, James H.; DeAngelis, Donald L.
1992-01-01
The behavior of individual northern squawfish (Ptychocheilus oregonensis) preying on juvenile salmonids was modeled to address questions about capture rate and the timing of prey captures (random versus contagious). Prey density, predator weight, prey weight, temperature, and diel feeding pattern were first incorporated into predation equations analogous to Holling Type 2 and Type 3 functional response models. Type 2 and Type 3 equations fit field data from the Columbia River equally well, and both models predicted predation rates on five of seven independent dates. Selecting a functional response type may be complicated by variable predation rates, analytical methods, and assumptions of the model equations. Using the Type 2 functional response, random versus contagious timing of prey capture was tested using two related models. In the simpler model, salmon captures were assumed to be controlled by a Poisson renewal process; in the second model, several salmon captures were assumed to occur during brief "feeding bouts", modeled with a compound Poisson process. Salmon captures by individual northern squawfish were clustered through time, rather than random, based on comparison of model simulations and field data. The contagious-feeding result suggests that salmonids may be encountered as patches or schools in the river.
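The two model families named above have compact standard forms. The functional-response equations and the bout structure below use the textbook formulations with invented parameter values (attack rate a, handling time h, bout rates), not the study's fitted coefficients:

```python
import numpy as np

def holling_type2(N, a, h):
    """Type 2: capture rate saturates at 1/h as handling time h limits intake."""
    return a * N / (1 + a * h * N)

def holling_type3(N, a, h):
    """Type 3: sigmoidal; attack success accelerates with prey density N."""
    return a * N**2 / (1 + a * h * N**2)

N = np.array([0.5, 10.0, 100.0])
print(holling_type2(N, a=0.5, h=0.1))    # approaches 1/h = 10 at high density
print(holling_type3(N, a=0.5, h=0.1))

# Contagious timing: a compound Poisson "feeding bout" model, where a
# Poisson number of bouts each yields one-plus-Poisson captures, is
# overdispersed relative to a simple Poisson renewal process.
rng = np.random.default_rng(3)
bouts = rng.poisson(2.0, size=1000)           # bouts per observation interval
captures = bouts + rng.poisson(1.5 * bouts)   # >= 1 capture per bout
print(captures.var() > captures.mean())       # overdispersion signature
```

Comparing the variance-to-mean ratio of observed capture counts against the Poisson value of 1 is one simple way to detect the clustering the study reports.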
NASA Astrophysics Data System (ADS)
Andresen, Juan Carlos; Katzgraber, Helmut G.; Schechter, Moshe
2017-12-01
Random fields disorder Ising ferromagnets by aligning single spins in the direction of the random field in three space dimensions, or by flipping large ferromagnetic domains at dimensions two and below. While the former requires random fields of typical magnitude similar to the interaction strength, the latter Imry-Ma mechanism only requires infinitesimal random fields. Recently, it has been shown that for dilute anisotropic dipolar systems a third mechanism exists, where the ferromagnetic phase is disordered by finite-size glassy domains at a random field of finite magnitude that is considerably smaller than the typical interaction strength. Using large-scale Monte Carlo simulations and zero-temperature numerical approaches, we show that this mechanism applies to disordered ferromagnets with competing short-range ferromagnetic and antiferromagnetic interactions, suggesting its generality in ferromagnetic systems with competing interactions and an underlying spin-glass phase. A finite-size-scaling analysis of the magnetization distribution suggests that the transition might be first order.
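The Imry-Ma balance invoked above can be stated compactly (standard argument, schematic constants): flipping a ferromagnetic domain of linear size $L$ in dimension $d$ changes the energy by roughly

```latex
\Delta E(L) \sim J\,L^{d-1} - h\,L^{d/2}
```

with a domain-wall cost $J L^{d-1}$ and a typical random-field energy gain $h L^{d/2}$. For $d < 2$ the gain dominates at large $L$ for arbitrarily small $h$, while at $d = 3$ a field of magnitude comparable to $J$ is required; the glassy-domain mechanism discussed in the abstract is a third route, disordering the ferromagnet at a finite but much smaller field.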
Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina
2017-01-01
A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.
Mate choice theory and the mode of selection in sexual populations.
Carson, Hampton L
2003-05-27
New indirect data imply that mate and/or gamete choice is a major selective force driving genetic change in sexual populations. The system dictates nonrandom mating, an evolutionary process requiring both revised genetic theory and new data on the heritability of characters underlying Darwinian fitness. Successfully reproducing individuals represent rare selections from among vigorous, competing survivors of preadult natural selection. Nonrandom mating has correlated demographic effects: reduced effective population size, inbreeding, low gene flow, and emphasis on deme structure. Characters involved in choice behavior at reproduction appear to be based on quantitative trait loci. This variability serves selection for fitness within the population, having only an incidental relationship to the origin of genetically based reproductive isolation between populations. The claim that extensive hybridization experiments with Drosophila indicate that selection favors a gradual progression of "isolating mechanisms" is flawed, because intra-group random mating is assumed. Over deep time, local sexual populations are strong, independent genetic systems that use rich fields of variable polygenic components of fitness. The sexual reproduction system thus particularizes, in small subspecific populations, the genetic basis of the grand adaptive sweep of selective evolutionary change, much as Darwin proposed.
Some Mixotrophic Flagellate Species Selectively Graze on Archaea
Ballen-Segura, Miguel; Catalan, Jordi
2016-01-01
Many phototrophic flagellates ingest prokaryotes. This mixotrophic trait becomes a critical aspect of the microbial loop in planktonic food webs because of the typical high abundance of these flagellates. Our knowledge of their selective feeding upon different groups of prokaryotes, particularly under field conditions, is still quite limited. In this study, we investigated the feeding behavior of three species (Rhodomonas sp., Cryptomonas ovata, and Dinobryon cylindricum) via their food vacuole content in field populations of a high mountain lake. We used the catalyzed reporter deposition-fluorescence in situ hybridization (CARD-FISH) protocol with probes specific for the domain Archaea and three groups of Eubacteria: Betaproteobacteria, Actinobacteria, and Cytophaga-Flavobacteria of Bacteroidetes. Our results provide field evidence that contrasting selective feeding exists between coexisting mixotrophic flagellates under the same environmental conditions and that some prokaryotic groups may be preferentially impacted by phagotrophic pressure in aquatic microbial food webs. In our study, Archaea were the preferred prey, chiefly in the case of Rhodomonas sp., which rarely fed on any other prokaryotic group. In general, prey selection did not relate to prey size among the grazed groups. However, Actinobacteria, which were clearly avoided, mostly showed a size of <0.5 μm, markedly smaller than cells from the other groups. IMPORTANCE That mixotrophic flagellates are not randomly feeding in the main prokaryotic groups under field conditions is a pioneer finding in species-specific behavior that paves the way for future studies according to this new paradigm. The particular case that Archaea were preferentially affected in the situation studied shows that phagotrophic pressure cannot be disregarded when considering the distribution of this group in freshwater oligotrophic systems. PMID:27815273
2011-10-01
lung tissue. We were not able to detect sufficient numbers of cells in this manner. We tried a different procedure for fixing the lungs after they... added after 24 hours. The films were fixed and evaluated microscopically. In four trials, 10 random microscopic fields were selected and... dosing by oral gavage once daily with 1.3 mg/kg L-valine-L-boroproline called talabostat (extracellular & intracellular DASH), 13.3 mg/kg L-glutamyl
Winter Simulation Conference, Miami Beach, Fla., December 4-6, 1978, Proceedings. Volumes 1 & 2
NASA Technical Reports Server (NTRS)
Highland, H. J. (Editor); Nielsen, N. R.; Hull, L. G.
1978-01-01
The papers report on the various aspects of simulation such as random variate generation, simulation optimization, ranking and selection of alternatives, model management, documentation, data bases, and instructional methods. Simulation studies in a wide variety of fields are described, including system design and scheduling, government and social systems, agriculture, computer systems, the military, transportation, corporate planning, ecosystems, health care, manufacturing and industrial systems, computer networks, education, energy, production planning and control, financial models, behavioral models, information systems, and inventory control.
New Instrumentation for Phase Partitioning
NASA Technical Reports Server (NTRS)
Harris, J. M.
1985-01-01
Cells and molecules can be purified by partitioning between the two immiscible liquid phases formed by aqueous solutions of poly(ethylene glycol) and dextran. Such purification can be more selective, higher yielding, and less destructive to sensitive biological materials than other available techniques. Earth's gravitational field is a hindering factor, as it causes sedimentation of the particles to be purified and shear-induced particle randomization. The present proposal is directed toward developing new instrumentation for performing phase partitioning both on Earth and in microgravity.
On Pfaffian Random Point Fields
NASA Astrophysics Data System (ADS)
Kargin, V.
2014-02-01
We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.
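For orientation, a Pfaffian point field is conventionally defined by correlation functions of Pfaffian form (this is the standard definition, not the paper's specific quaternion formulation):

```latex
\rho_n(x_1,\ldots,x_n) = \operatorname{Pf}\bigl[K(x_i,x_j)\bigr]_{i,j=1}^{n}
```

where $K$ is an antisymmetric $2\times 2$-matrix-valued kernel, so the right-hand side is the Pfaffian of a $2n \times 2n$ antisymmetric matrix. Determinantal point fields are the special case in which this reduces to $\rho_n = \det[K(x_i,x_j)]$ for a scalar kernel.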
Tortuosity of lightning return stroke channels
NASA Technical Reports Server (NTRS)
Levine, D. M.; Gilson, B.
1984-01-01
Data obtained from photographs of lightning are presented on the tortuosity of return stroke channels. The data were obtained by making piecewise linear fits to the channels and recording the cartesian coordinates of the ends of each linear segment. The mean change between ends of the segments was nearly zero in the horizontal direction and about eight meters in the vertical direction. Histograms of these changes are presented. These data were used to create model lightning channels and to predict the electric fields radiated during return strokes. This was done using a computer-generated random walk in which linear segments were placed end-to-end to form a piecewise linear representation of the channel. The computer selected random numbers for the ends of the segments assuming a normal distribution with the measured statistics. Once the channels were simulated, the electric fields radiated during a return stroke were predicted using a transmission line model on each segment. It was found that realistic channels are obtained with this procedure, but only if the model includes two scales of tortuosity: fine-scale irregularities corresponding to the local channel tortuosity superimposed on large-scale horizontal drifts. The two scales of tortuosity are also necessary to obtain agreement between the electric fields computed mathematically from the simulated channels and the electric fields radiated from real return strokes. Without large-scale drifts, the computed electric fields do not have the undulations characteristic of the data.
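The two-scale channel construction can be sketched directly. The ~8 m mean vertical step is taken from the abstract; the segment count, step spreads, and drift scale below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Piecewise-linear channel: per-segment offsets drawn from a normal
# distribution (mean ~0 horizontally, ~8 m vertically), plus a slowly
# varying large-scale horizontal drift that changes every 20 segments.
n_seg = 400
fine_dx = rng.normal(0.0, 5.0, n_seg)        # fine-scale horizontal wiggle
dz = rng.normal(8.0, 3.0, n_seg)             # vertical steps, mean ~8 m
drift = np.cumsum(rng.normal(0.0, 1.0, n_seg // 20)).repeat(20)  # coarse drift
x = np.cumsum(fine_dx + drift)               # horizontal channel coordinate
z = np.cumsum(dz)                            # channel altitude
print(f"simulated channel reaches ~{z[-1]:.0f} m over {n_seg} segments")
```

Dropping the `drift` term reproduces the abstract's negative result: a fine-scale-only walk stays too straight at large scales and its radiated field lacks the observed undulations.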
What to expect from an evolutionary hypothesis for a human disease: The case of type 2 diabetes.
Watve, Milind; Diwekar-Joshi, Manawa
2016-10-01
Evolutionary medicine promises to bring a conceptual revolution to medicine. However, as yet the field does not have the same theoretical rigour as many other fields in evolutionary studies. We discuss here, with reference to type 2 diabetes mellitus (T2DM), what role an evolutionary hypothesis should play in the development of thinking in medicine. Starting with the thrifty gene hypothesis, evolutionary thinking in T2DM has undergone several transitions, modifications and refinements of the thrift family of hypotheses. In addition, alternative hypotheses independent of thrift have also been suggested. However, most hypotheses look at partial pictures and make selective use of supportive data while ignoring inconvenient truths. Most hypotheses look at a superficial picture and avoid getting into the intricacies of the underlying molecular, neuronal and physiological processes. Very few hypotheses have suggested clinical implications, and none of them have been tested with randomized clinical trials. Meanwhile, the concepts in the pathophysiology of T2DM are undergoing radical changes, and evolutionary hypotheses need to take them into account. We suggest an approach and a set of criteria to evaluate the relative merits of the alternative hypotheses. A number of hypotheses are likely to fail when critically evaluated against these criteria. It is possible that more than one selective process is at work in the evolution of propensity to T2DM, but the intercompatibility of the alternative selective forces and their relative contributions need to be examined. The approach we describe could potentially lead to a sound evolutionary theory that is clinically useful and testable by randomized controlled clinical trials. Copyright © 2016 Elsevier GmbH. All rights reserved.
Topology in two dimensions. II - The Abell and ACO cluster catalogues
NASA Astrophysics Data System (ADS)
Plionis, Manolis; Valdarnini, Riccardo; Coles, Peter
1992-09-01
We apply a method for quantifying the topology of projected galaxy clustering to the Abell and ACO catalogues of rich clusters. We use numerical simulations to quantify the statistical bias involved in using high peaks to define the large-scale structure, and we use the results obtained to correct our observational determinations for this known selection effect and also for possible errors introduced by boundary effects. We find that the Abell cluster sample is consistent with clusters being identified with high peaks of a Gaussian random field, but that the ACO sample shows a slight meatball shift away from Gaussian behavior over and above that expected purely from the high-peak selection. The most conservative explanation of this effect is that it is caused by some artefact of the procedure used to select the clusters in the two samples.
NASA Astrophysics Data System (ADS)
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2018-06-01
In this study, we present a method for improving the quality of automatic single fallen tree stem segmentation in ALS data by applying a specialized constrained conditional random field (CRF). The entire processing pipeline is composed of two steps. First, short stem segments of equal length are detected and a subset of them is selected for further processing, while in the second step the chosen segments are merged to form entire trees. The first step is accomplished using the specialized CRF defined on the space of segment labelings, capable of finding segment candidates which are easier to merge subsequently. To achieve this, the CRF considers not only the features of every candidate individually, but incorporates pairwise spatial interactions between adjacent segments into the model. In particular, pairwise interactions include a collinearity/angular deviation probability which is learned from training data as well as the ratio of spatial overlap, whereas unary potentials encode a learned probabilistic model of the laser point distribution around each segment. Each of these components enters the CRF energy with its own balance factor. To process previously unseen data, we first calculate the subset of segments for merging on a grid of balance factors by minimizing the CRF energy. Then, we perform the merging and rank the balance configurations according to the quality of their resulting merged trees, obtained from a learned tree appearance model. The final result is derived from the top-ranked configuration. We tested our approach on 5 plots from the Bavarian Forest National Park using reference data acquired in a field inventory. Compared to our previous segment selection method without pairwise interactions, an increase in detection correctness and completeness of up to 7 and 9 percentage points, respectively, was observed.
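The core idea of the segment-selection CRF, unary stem-probability terms balanced against pairwise collinearity/overlap terms and swept over a grid of balance factors, can be illustrated with a toy model. The segment scores, pair affinities, and weights below are invented for illustration; in the paper the potentials are learned from training data:

```python
import itertools
import numpy as np

# toy data: 4 candidate stem segments
unary = np.array([0.9, 0.2, 0.8, 0.7])        # P(segment is a stem)
# pairwise affinity between adjacent segments (collinearity * overlap)
pairs = {(0, 2): 0.8, (2, 3): 0.6, (0, 1): 0.1}

def energy(labels, w_unary, w_pair):
    """CRF energy: low when selected segments are probable stems
    and mutually collinear; labels[i] in {0, 1}."""
    e = -w_unary * sum(np.log(unary[i] if l else 1 - unary[i])
                       for i, l in enumerate(labels))
    # pairwise term rewards selecting both ends of a high-affinity pair
    e -= w_pair * sum(a for (i, j), a in pairs.items()
                      if labels[i] and labels[j])
    return e

def best_labeling(w_unary=1.0, w_pair=1.0):
    # brute-force minimization is feasible for a handful of segments
    return min(itertools.product([0, 1], repeat=4),
               key=lambda l: energy(l, w_unary, w_pair))

# sweep the balance factor, analogous to the paper's grid search
for w in (0.1, 1.0, 5.0):
    print(w, best_labeling(w_pair=w))
```

Real instances have far too many segments for enumeration, which is why dedicated CRF energy minimizers are used; the toy still shows how the balance factors trade individual evidence against spatial interactions.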
Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heasler, Patrick G.
This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.
Tensor Minkowski Functionals for random fields on the sphere
NASA Astrophysics Data System (ADS)
Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom
2017-12-01
We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.
Demaerschalk, Bart M; Brown, Robert D; Roubin, Gary S; Howard, Virginia J; Cesko, Eldina; Barrett, Kevin M; Longbottom, Mary E; Voeks, Jenifer H; Chaturvedi, Seemant; Brott, Thomas G; Lal, Brajesh K; Meschia, James F; Howard, George
2017-09-01
Multicenter clinical trials attempt to select sites that can move rapidly to randomization and enroll sufficient numbers of patients. However, there are few assessments of the success of site selection. In CREST-2 (Carotid Revascularization and Medical Management for Asymptomatic Carotid Stenosis Trials), we assess factors associated with the time between site selection and authorization to randomize, the time between authorization to randomize and the first randomization, and the average number of randomizations per site per month. Potential factors included characteristics of the site, specialty of the principal investigator, and site type. For 147 sites, the median time from site selection to authorization to randomize was 9.9 months (interquartile range, 7.7-12.4), and factors associated with early site activation were not identified. The median time from authorization to randomize to the first randomization was 4.6 months (interquartile range, 2.6-10.5). Sites with authorization to randomize in only the carotid endarterectomy study were slower to randomize, and the other factors examined were not significantly associated with time-to-randomization. The recruitment rate was 0.26 (95% confidence interval, 0.23-0.28) patients per site per month. By univariate analysis, factors associated with faster recruitment were authorization to randomize in both trials, principal investigator specialties of interventional radiology and cardiology, pre-trial reported performance of >50 carotid angioplasty and stenting procedures per year, status in the top half of recruitment in the CREST trial, and classification as a private health facility. Participation in StrokeNet was associated with slower recruitment as compared with the non-StrokeNet sites. Overall, selection of sites with high enrollment rates will likely require customization to align the sites selected to the factor under study in the trial. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02089217.
© 2017 American Heart Association, Inc.
NASA Technical Reports Server (NTRS)
Welch, R. M.; Sengupta, S. K.; Chen, D. W.
1990-01-01
Stratocumulus cloud fields in the FIRE IFO region are analyzed using LANDSAT Thematic Mapper imagery. Structural properties such as cloud cell size distribution, cell horizontal aspect ratio, fractional coverage and fractal dimension are determined. It is found that stratocumulus cloud number densities are represented by a power law. Cell horizontal aspect ratio has a tendency to increase at large cell sizes, and cells are bi-fractal in nature. Using LANDSAT Multispectral Scanner imagery for twelve selected stratocumulus scenes acquired during previous years, similar structural characteristics are obtained. Cloud field spatial organization also is analyzed. Nearest-neighbor spacings are fit with a number of functions, with Weibull and Gamma distributions providing the best fits. Poisson tests show that the spatial separations are not random. Second order statistics are used to examine clustering.
NASA Astrophysics Data System (ADS)
Löw, Fabian; Schorcht, Gunther; Michel, Ulrich; Dech, Stefan; Conrad, Christopher
2012-10-01
Accurate crop identification and crop area estimation are important for studies on irrigated agricultural systems, yield and water demand modeling, and agrarian policy development. In this study a novel combination of Random Forest (RF) and Support Vector Machine (SVM) classifiers is presented that (i) enhances crop classification accuracy and (ii) provides spatial information on map uncertainty. The methodology was implemented over four distinct irrigated sites in Middle Asia using RapidEye time series data. The RF feature importance statistic was used as a feature-selection strategy for the SVM to assess possible negative effects on classification accuracy caused by an oversized feature space. The results of the individual RF and SVM classifications were combined with rules based on posterior classification probability and estimates of classification probability entropy. SVM classification performance was increased by feature selection through RF. Further experimental results indicate that the hybrid classifier improves overall classification accuracy in comparison to the single classifiers, as well as user's and producer's accuracy.
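The RF-importance-then-SVM feature-selection step can be sketched with scikit-learn on synthetic data standing in for the RapidEye time series; the sample sizes and the cutoff of 15 features are arbitrary choices for illustration, not the paper's configuration:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# synthetic stand-in for a multitemporal feature stack
X, y = make_classification(n_samples=500, n_features=60, n_informative=8,
                           n_redundant=10, random_state=0)

# 1) rank features by Random Forest importance
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:15]

# 2) compare SVM accuracy on the full vs the reduced feature space
full = cross_val_score(SVC(), X, y, cv=5).mean()
reduced = cross_val_score(SVC(), X[:, top], y, cv=5).mean()
print(f"SVM accuracy, all 60 features: {full:.3f}; top 15 by RF: {reduced:.3f}")
```

Pruning uninformative bands in this way is one standard remedy for the oversized-feature-space effect the authors investigate.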
Workshop on Incomplete Network Data Held at Sandia National Labs – Livermore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soundarajan, Sucheta; Wendt, Jeremy D.
2016-06-01
While network analysis is applied in a broad variety of scientific fields (including physics, computer science, biology, and the social sciences), how networks are constructed and the resulting bias and incompleteness have drawn more limited attention. For example, in biology, gene networks are typically developed via experiment; many actual interactions are likely yet to be discovered. In addition to this incompleteness, the data-collection processes can introduce significant bias into the observed network datasets. For instance, if you observe part of the World Wide Web network through a classic random walk, then high-degree nodes are more likely to be found than if you had selected nodes at random. Unfortunately, such incomplete and biasing data-collection methods must often be used.
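The degree bias of random-walk sampling mentioned above is easy to demonstrate. On a toy hub-and-ring graph (invented here for illustration), a classic random walk visits the hub in proportion to its degree, far more often than uniform node selection would:

```python
import random
random.seed(1)

# toy graph: node 0 is a hub linked to all others, which also form a ring
n = 50
adj = {i: set() for i in range(n)}
for i in range(1, n):
    adj[0].add(i); adj[i].add(0)
    j = 1 + (i % (n - 1))          # ring neighbour among nodes 1..n-1
    adj[i].add(j); adj[j].add(i)

def walk_fraction(steps=20000):
    """Fraction of classic-random-walk steps spent at the hub.
    Stationary visit frequency is proportional to node degree."""
    node, hits = 1, 0
    for _ in range(steps):
        node = random.choice(sorted(adj[node]))
        hits += (node == 0)
    return hits / steps

def uniform_fraction(draws=20000):
    """Fraction of uniform node draws that pick the hub."""
    return sum(random.randrange(n) == 0 for _ in range(draws)) / draws

wfrac, ufrac = walk_fraction(), uniform_fraction()
print(f"hub sampled: random walk {wfrac:.2f} vs uniform {ufrac:.2f}")
```

The walk reaches the hub roughly an order of magnitude more often than uniform sampling (stationary visit probability is degree/(2·edges), about 0.25 here versus 0.02 uniformly), which is exactly the observation bias the workshop addresses.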
Global mean-field phase diagram of the spin-1 Ising ferromagnet in a random crystal field
NASA Astrophysics Data System (ADS)
Borelli, M. E. S.; Carneiro, C. E. I.
1996-02-01
We study the phase diagram of the mean-field spin-1 Ising ferromagnet in a uniform magnetic field H and a random crystal field Δi, with probability distribution P(Δi) = pδ(Δi − Δ) + (1 − p)δ(Δi). We analyse the effects of randomness on the first-order surfaces of the Δ-T-H phase diagram for different values of the concentration p and show how these surfaces are affected by the dilution of the crystal field.
Effects of Peripheral Visual Field Loss on Eye Movements During Visual Search
Wiecek, Emily; Pasquale, Louis R.; Fiser, Jozsef; Dakin, Steven; Bex, Peter J.
2012-01-01
Natural vision involves sequential eye movements that bring the fovea to locations selected by peripheral vision. How peripheral visual field loss (PVFL) affects this process is not well understood. We examine how the location and extent of PVFL affect eye movement behavior in a naturalistic visual search task. Ten patients with PVFL and 13 normally sighted subjects with full visual fields (FVF) completed 30 visual searches monocularly. Subjects located a 4° × 4° target, pseudo-randomly selected within a 26° × 11° natural image. Eye positions were recorded at 50 Hz. Search duration, fixation duration, saccade size, and number of saccades per trial were not significantly different between the PVFL and FVF groups (p > 0.1). A χ2 test showed that the distributions of saccade directions for PVFL and FVF subjects were significantly different in 8 out of 10 cases (p < 0.01). Humphrey Visual Field pattern deviations for each subject were compared with the spatial distribution of eye movement directions. There were no significant correlations between saccade directional bias and visual field sensitivity across the 10 patients. Visual search performance was not significantly affected by PVFL. An analysis of eye movement directions revealed that patients with PVFL show a biased directional distribution that was not directly related to the locus of vision loss, challenging feed-forward models of eye movement control. Consequently, many patients do not optimally compensate for visual field loss during visual search. PMID:23162511
Modulation of human extrastriate visual processing by selective attention to colours and words.
Nobre, A C; Allison, T; McCarthy, G
1998-07-01
The present study investigated the effect of visual selective attention upon neural processing within functionally specialized regions of the human extrastriate visual cortex. Field potentials were recorded directly from the inferior surface of the temporal lobes in subjects with epilepsy. The experimental task required subjects to focus attention on words from one of two competing texts. Words were presented individually and foveally. Texts were interleaved randomly and were distinguishable on the basis of word colour. Focal field potentials were evoked by words in the posterior part of the fusiform gyrus. Selective attention strongly modulated long-latency potentials evoked by words. The attention effect co-localized with word-related potentials in the posterior fusiform gyrus, and was independent of stimulus colour. The results demonstrated that stimuli receive differential processing within specialized regions of the extrastriate cortex as a function of attention. The late onset of the attention effect and its co-localization with letter string-related potentials but not with colour-related potentials recorded from nearby regions of the fusiform gyrus suggest that the attention effect is due to top-down influences from downstream regions involved in word processing.
Connectivity ranking of heterogeneous random conductivity models
NASA Astrophysics Data System (ADS)
Rizzo, C. B.; de Barros, F.
2017-12-01
To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state of the art provides several methods to generate 2D or 3D random K-fields, such as classic multi-Gaussian fields, non-Gaussian fields, training-image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with early time arrival of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields, making it possible to rank the fields according to their minimum hydraulic resistance.
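The graph-based idea can be sketched as follows: treat each cell's resistance as 1/K, run Dijkstra across the pixel graph to get the least-resistance path, and wrap the whole thing in a Monte Carlo loop over conductivity realizations. The lognormal K model, the 1/K resistance, and the grid size are illustrative assumptions, not the paper's exact setup:

```python
import heapq
import numpy as np

rng = np.random.default_rng(3)

def min_hydraulic_resistance(K):
    """Least-resistance path from left to right column of a 2D
    conductivity field, via Dijkstra on the pixel graph
    (cell resistance taken as 1/K, a common simplification)."""
    ny, nx = K.shape
    R = 1.0 / K
    dist = np.full((ny, nx), np.inf)
    pq = [(R[i, 0], i, 0) for i in range(ny)]   # all left-face cells as sources
    for r, i, _ in pq:
        dist[i, 0] = r
    heapq.heapify(pq)
    while pq:
        d, i, j = heapq.heappop(pq)
        if d > dist[i, j]:
            continue                            # stale queue entry
        if j == nx - 1:
            return d                            # first arrival at right face is minimal
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < ny and 0 <= jj < nx and d + R[ii, jj] < dist[ii, jj]:
                dist[ii, jj] = d + R[ii, jj]
                heapq.heappush(pq, (dist[ii, jj], ii, jj))
    return np.inf

# Monte Carlo over lognormal K realizations, mimicking a stochastic treatment
vals = [min_hydraulic_resistance(np.exp(rng.normal(0, 1, (20, 20))))
        for _ in range(50)]
print(f"mean min resistance: {np.mean(vals):.2f} +/- {np.std(vals):.2f}")
```

Because Dijkstra runs in near-linear time on the grid graph, repeating it over many realizations is cheap, which is what makes the uncertainty quantification in the abstract tractable.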
Vacuum selection on axionic landscapes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Gaoyuan; Battefeld, Thorsten, E-mail: gaoyuan.wang@stud.uni-goettingen.de, E-mail: tbattefe@astro.physik.uni-goettingen.de
2016-04-01
We compute the distribution of minima that are reached dynamically on multi-field axionic landscapes, both numerically and analytically. Such landscapes are well suited for inflationary model building due to the presence of shift symmetries and possible alignment effects (the KNP mechanism). The resulting distribution of dynamically reached minima differs considerably from the naive expectation based on counting all vacua. These differences are more pronounced in the presence of many fields due to dynamical selection effects: while low-lying minima are preferred as fields roll down the potential, trajectories are also more likely to get trapped by one of the many nearby minima. We show that common analytic arguments based on random matrix theory in the large-D limit to estimate the distribution of minima are insufficient for quantitative arguments pertaining to the dynamically reached ones. This discrepancy is not restricted to axionic potentials. We provide an empirical expression for the expectation value of the height of such dynamically reached minima and argue that the cosmological constant problem is not alleviated in the absence of anthropic arguments. We further comment on the likelihood of inflation on axionic landscapes in the large-D limit.
Asiimwe, Stephen; Oloya, James; Song, Xiao; Whalen, Christopher C
2014-12-01
Unsupervised HIV self-testing (HST) has the potential to increase knowledge of HIV status; however, its accuracy is unknown. To estimate the accuracy of unsupervised HST in field settings in Uganda, we performed a non-blinded, randomized controlled, non-inferiority trial of unsupervised compared with supervised HST among selected high HIV-risk fisherfolk (22.1% HIV prevalence) in three fishing villages in Uganda between July and September 2013. The study enrolled 246 participants and randomized them in a 1:1 ratio to unsupervised HST or provider-supervised HST. In an intent-to-treat analysis, the HST sensitivity was 90% in the unsupervised arm and 100% in the provider-supervised arm, yielding a difference of -10% (90% CI -21%, 1%); non-inferiority was not shown. In a per-protocol analysis, the difference in sensitivity was -5.6% (90% CI -14.4%, 3.3%) and did show non-inferiority. We conclude that unsupervised HST is feasible in rural Africa and may be non-inferior to provider-supervised HST.
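For orientation, a difference in sensitivities with a Wald-style 90% CI can be computed as below. The counts are invented to roughly reproduce the reported -10% intent-to-treat difference, and the trial's actual interval method is not stated in the abstract, so this is only a sketch:

```python
from math import sqrt

def sens_diff_ci(tp1, n1, tp2, n2, z=1.645):
    """Wald 90% CI for a difference in sensitivities (a sketch;
    degenerate when one arm's sensitivity is exactly 100%)."""
    p1, p2 = tp1 / n1, tp2 / n2
    d = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# illustrative counts only: ~22% prevalence among ~123 per arm -> ~27 positives
d, (lo, hi) = sens_diff_ci(tp1=24, n1=27, tp2=27, n2=27)
print(f"diff {d:.1%}, 90% CI ({lo:.1%}, {hi:.1%})")
```

Non-inferiority is then judged by whether the lower CI bound stays above the prespecified margin; a lower bound of about -21%, as reported, fails a -10% margin.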
47 CFR 1.1604 - Post-selection hearings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...
47 CFR 1.1604 - Post-selection hearings.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...
NASA Astrophysics Data System (ADS)
Xu, Chong; Dai, Fuchu; Xu, Xiwei; Lee, Yuan Hsi
2012-04-01
Support vector machine (SVM) modeling is based on statistical learning theory. It involves a training phase with associated input and target output values. In recent years, the method has become increasingly popular. The main purpose of this study is to evaluate the mapping power of SVM modeling in earthquake-triggered landslide-susceptibility mapping for a section of the Jianjiang River watershed using Geographic Information System (GIS) software. The river was affected by the Wenchuan earthquake of May 12, 2008. Visual interpretation of colored aerial photographs of 1-m resolution and extensive field surveys provided a detailed landslide inventory map containing 3147 landslides related to the 2008 Wenchuan earthquake. Elevation, slope angle, slope aspect, distance from seismogenic faults, distance from drainages, and lithology were used as the controlling parameters. For modeling, three groups of positive and negative training samples were used in concert with four different kernel functions. Positive training samples included the centroids of 500 large landslides, those of all 3147 landslides, and 5000 randomly selected points in landslide polygons. Negative training samples included 500, 3147, and 5000 randomly selected points on slopes that remained stable during the Wenchuan earthquake. The four kernel functions were linear, polynomial, radial basis, and sigmoid. In total, 12 cases of landslide susceptibility were mapped. Comparative analyses of landslide-susceptibility probability and area relation curves show that both the polynomial and radial basis functions suitably classified the input data as either landslide-positive or landslide-negative, though the radial basis function was more successful. The 12 generated landslide-susceptibility maps were compared with known landslide centroid locations and landslide polygons to verify the success rate and predictive accuracy of each model. The 12 results were further validated using area-under-curve analysis.
Group 3, with 5000 randomly selected points in landslide polygons and 5000 randomly selected points along stable slopes, gave the best results, with a success rate of 79.20% and predictive accuracy of 79.13% under the radial basis function. Of all the results, the sigmoid kernel function was the least skillful when used in concert with the centroid data of all 3147 landslides as positive training samples and 3147 randomly selected points in regions of stable slope as negative training samples (success rate = 54.95%; predictive accuracy = 61.85%). This paper also provides suggestions and reference data for selecting appropriate training samples and kernel function types for earthquake-triggered landslide-susceptibility mapping using SVM modeling. Predictive landslide-susceptibility maps could be useful in hazard mitigation by helping planners understand the probability of landslides in different regions.
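The four kernel functions compared in the study are available directly in scikit-learn's SVC. A quick sketch on a synthetic nonlinear dataset (standing in for the landslide-positive and stable-slope samples, which are not reproduced here) shows how kernel choice alone changes accuracy:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# synthetic nonlinearly separable data as a stand-in for the field samples
X, y = make_moons(n_samples=600, noise=0.3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

accs = {}
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    accs[kernel] = SVC(kernel=kernel).fit(Xtr, ytr).score(Xte, yte)
    print(f"{kernel:8s} accuracy: {accs[kernel]:.3f}")
```

On curved class boundaries the radial basis (rbf) kernel typically dominates the linear and sigmoid kernels, consistent with the ranking the authors report for their landslide data.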
Statistical analysis of loopy belief propagation in random fields
NASA Astrophysics Data System (ADS)
Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki
2015-10-01
Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
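For readers unfamiliar with LBP itself, a minimal sum-product implementation on a small pairwise binary MRF (a 3-cycle with invented field and coupling values; this is plain LBP, not the paper's replica-averaged analysis) looks like:

```python
import numpy as np

# pairwise binary MRF on a 3-cycle: psi_ij(x_i, x_j) = exp(J * x_i * x_j),
# unary factors phi_i(x_i) = exp(h_i * x_i), states in {-1, +1}
J, h = 0.5, np.array([0.2, -0.1, 0.3])
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (2, 0)]
states = np.array([-1.0, 1.0])

unary = {i: np.exp(h[i] * states) for i in nodes}
pair = np.exp(J * np.outer(states, states))   # shared coupling matrix

# directed messages m[(i, j)][x_j], initialised uniform
msgs = {(i, j): np.ones(2) for (a, b) in edges for (i, j) in ((a, b), (b, a))}

for _ in range(100):                 # iterate to (hoped-for) convergence
    new = {}
    for (i, j) in msgs:
        # product of unary factor and all incoming messages except from j
        incoming = unary[i].copy()
        for (k, l) in msgs:
            if l == i and k != j:
                incoming *= msgs[(k, i)]
        m = pair.T @ incoming        # marginalise over x_i
        new[(i, j)] = m / m.sum()    # normalise for numerical stability
    msgs = new

for i in nodes:
    b = unary[i].copy()
    for (k, l) in msgs:
        if l == i:
            b *= msgs[(k, i)]
    b /= b.sum()
    print(f"node {i}: P(x=+1) ~ {b[1]:.3f}")
```

On a graph with cycles these beliefs are the Bethe approximation to the true marginals, which is the object whose quenched average over random fields the paper evaluates analytically.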
Inflation with a graceful exit in a random landscape
NASA Astrophysics Data System (ADS)
Pedro, F. G.; Westphal, A.
2017-03-01
We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.
NASA Astrophysics Data System (ADS)
Graham, Wendy D.; Tankersley, Claude D.
1994-05-01
Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.
The random field Blume-Capel model revisited
NASA Astrophysics Data System (ADS)
Santos, P. V.; da Costa, F. A.; de Araújo, J. M.
2018-04-01
We have revisited the mean-field treatment of the Blume-Capel model in the presence of a discrete random magnetic field as introduced by Kaufman and Kanner (1990). The magnetic field (H) versus temperature (T) phase diagrams for given values of the crystal field D were recovered in accordance with Kaufman and Kanner's original work. However, our main goal in the present work was to investigate the distinct structures of the crystal field versus temperature phase diagrams as the random magnetic field is varied, because similar models have presented reentrant phenomena due to randomness. Following previous works we have classified the distinct phase diagrams according to five different topologies. The topological structure of the phase diagrams is maintained for both the H-T and D-T cases. Although the phase diagrams exhibit a richness of multicritical phenomena, we did not find any reentrant effect such as has been seen in similar models.
Dai, Qiong; Cheng, Jun-Hu; Sun, Da-Wen; Zeng, Xin-An
2015-01-01
There is an increased interest in the applications of hyperspectral imaging (HSI) for assessing food quality, safety, and authenticity. HSI provides an abundance of spatial and spectral information from foods by combining both spectroscopy and imaging, resulting in hundreds of contiguous wavebands for each spatial position of food samples, a situation also known as the curse of dimensionality. It is desirable to employ feature-selection algorithms to decrease the computational burden and increase predictive accuracy, which is especially relevant in the development of online applications. Recently, a variety of feature-selection algorithms have been proposed that can be categorized into three groups based on the searching strategy, namely complete search, heuristic search and random search. This review introduces the fundamentals of each algorithm, illustrates its applications in hyperspectral data analysis in the food field, and discusses the advantages and disadvantages of these algorithms. It is hoped that this review will provide a guideline for feature selection and data processing in the future development of hyperspectral imaging techniques in foods.
A New Algorithm with Plane Waves and Wavelets for Random Velocity Fields with Many Spatial Scales
NASA Astrophysics Data System (ADS)
Elliott, Frank W.; Majda, Andrew J.
1995-03-01
A new Monte Carlo algorithm for constructing and sampling stationary isotropic Gaussian random fields with power-law energy spectrum, infrared divergence, and fractal self-similar scaling is developed here. The theoretical basis for this algorithm involves the fact that such a random field is well approximated by a superposition of random one-dimensional plane waves involving a fixed finite number of directions. In general each one-dimensional plane wave is the sum of a random shear layer and a random acoustical wave. These one-dimensional random plane waves are then simulated by a wavelet Monte Carlo method for a single space variable developed recently by the authors. The computational results reported in this paper demonstrate remarkably low variance and economical representation of such Gaussian random fields through this new algorithm. In particular, the velocity structure function for an incompressible isotropic Gaussian random field in two space dimensions with the Kolmogoroff spectrum can be simulated accurately over 12 decades with only 100 realizations of the algorithm, with the scaling exponent accurate to 1.1% and the constant prefactor accurate to 6%; in fact, the exponent of the velocity structure function can be computed over 12 decades within 3.3% with only 10 realizations. Furthermore, only 46,592 active computational elements are utilized in each realization to achieve these results for 12 decades of scaling behavior.
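For comparison, the standard FFT-based spectral synthesis of such a field, the conventional baseline rather than the authors' plane-wave/wavelet method, can be sketched as follows; the exponent is chosen to mimic a Kolmogorov-like power law and is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(7)

def gaussian_field_2d(n=256, beta=8/3):
    """FFT synthesis of an isotropic Gaussian random field with
    power-law spectrum P(k) ~ k^-beta on an n x n periodic grid."""
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.hypot(kx, ky)
    k[0, 0] = np.inf                 # suppress the (divergent) zero mode
    amp = k ** (-beta / 2)           # amplitude ~ sqrt of the power spectrum
    phases = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    field = np.fft.ifft2(amp * phases).real
    return field / field.std()       # normalise to unit variance

f = gaussian_field_2d()
print(f.shape, round(float(f.std()), 2))
```

This direct method needs an n × n grid per realization and is limited in scaling range by the grid, which is precisely the limitation the paper's superposition of 1D plane waves is designed to overcome (12 decades of self-similar scaling with few realizations).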
Coevolutionary dynamics in large, but finite populations
NASA Astrophysics Data System (ADS)
Traulsen, Arne; Claussen, Jens Christian; Hauert, Christoph
2006-07-01
Coevolving and competing species or game-theoretic strategies exhibit rich and complex dynamics for which a general theoretical framework based on finite populations is still lacking. Recently, an explicit mean-field description in the form of a Fokker-Planck equation was derived for frequency-dependent selection with two strategies in finite populations based on microscopic processes [A. Traulsen, J. C. Claussen, and C. Hauert, Phys. Rev. Lett. 95, 238701 (2005)]. Here we generalize this approach in a twofold way: First, we extend the framework to an arbitrary number of strategies and second, we allow for mutations in the evolutionary process. The deterministic limit of infinite population size of the frequency-dependent Moran process yields the adjusted replicator-mutator equation, which describes the combined effect of selection and mutation. For finite populations, we provide an extension taking random drift into account. In the limit of neutral selection, i.e., whenever the process is determined by random drift and mutations, the stationary strategy distribution is derived. This distribution forms the background for the coevolutionary process. In particular, a critical mutation rate uc is obtained separating two scenarios: above uc the population predominantly consists of a mixture of strategies whereas below uc the population tends to be in homogeneous states. For one of the fundamental problems in evolutionary biology, the evolution of cooperation under Darwinian selection, we demonstrate that the analytical framework provides excellent approximations to individual based simulations even for rather small population sizes. This approach complements simulation results and provides a deeper, systematic understanding of coevolutionary dynamics.
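A bare-bones individual-based simulation of the frequency-dependent Moran process with mutation, of the kind the analytical framework above is compared against, can be sketched as follows; the payoff matrix, selection strength, and mutation rate are illustrative values, not the paper's:

```python
import random
random.seed(2)

# payoff matrix of a 2-strategy game (here an assumed coordination game)
a, b, c, d = 3, 0, 0, 1        # payoffs: A vs A, A vs B, B vs A, B vs B
N, u, w = 100, 0.01, 0.5       # population size, mutation rate, selection strength

def step(i):
    """One frequency-dependent Moran step; i = current number of A-players."""
    # average payoffs exclude self-interaction; fitness = 1 - w + w * payoff
    fA = 1 - w + w * (a * (i - 1) + b * (N - i)) / (N - 1)
    fB = 1 - w + w * (c * i + d * (N - i - 1)) / (N - 1)
    # birth chosen proportional to fitness; offspring mutates with prob u
    pA = i * fA / (i * fA + (N - i) * fB)
    born_A = (random.random() < pA) ^ (random.random() < u)
    dies_A = random.random() < i / N          # death chosen uniformly
    return i + born_A - dies_A

i = N // 2
counts = []
for _ in range(100000):
    i = step(i)
    counts.append(i)
print(f"mean fraction of A over the run: {sum(counts) / len(counts) / N:.2f}")
```

With mutation the chain never fixes permanently, so long-run statistics such as the stationary strategy distribution are well defined, which is the quantity the paper derives in the neutral limit.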
SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere
NASA Astrophysics Data System (ADS)
Creasey, Peter; Lang, Annika
2018-04-01
SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and fast Fourier transforms in 1D that generates samples on an n × n grid in O(n² log n) and efficiently derives the necessary conditional covariance matrices.
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often inappropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
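The "generation of the randomization list" step mentioned above can be sketched with a permuted-block scheme, one common form of restricted randomization. Function and parameter names are illustrative assumptions, not code from the article.

```python
import random

def block_randomization(n_patients, arms=("A", "B"), block_size=4, seed=42):
    """Permuted-block randomization: each block contains every arm equally
    often, so group sizes stay balanced throughout enrollment."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the arm count"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_patients:
        block = list(arms) * (block_size // len(arms))  # one balanced block
        rng.shuffle(block)                              # random order within block
        allocation.extend(block)
    return allocation[:n_patients]
```

In practice the resulting list must still be concealed from recruiting clinicians (allocation concealment), since predictable block ends can otherwise reintroduce selection bias.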
Dharmarajan, Kavita V; Friedman, Debra L; Schwartz, Cindy L; Chen, Lu; FitzGerald, T J; McCarten, Kathleen M; Kessel, Sandy K; Iandoli, Matt; Constine, Louis S; Wolden, Suzanne L
2015-05-01
The study was designed to determine whether response-based therapy improves outcomes in intermediate-risk Hodgkin lymphoma. We examined patterns of first relapse in the study. From September 2002 to July 2010, 1712 patients <22 years old with stage I-IIA with bulk, I-IIAE, I-IIB, or IIIA-IVA disease were enrolled. Patients were categorized as rapid (RER) or slow early responders (SER) after 2 cycles of doxorubicin, bleomycin, vincristine, etoposide, prednisone, and cyclophosphamide (ABVE-PC). The SER patients were randomized to 2 additional ABVE-PC cycles or augmented chemotherapy, with 21 Gy involved field radiation therapy (IFRT). RER patients were stipulated to undergo 2 additional ABVE-PC cycles and were then randomized to 21 Gy IFRT or no further treatment if complete response (CR) was achieved. RER patients without CR were nonrandomly assigned to 21 Gy IFRT. Relapses were characterized with respect to site (initial, new, or both; and initial bulk or initial nonbulk) and IFRT field (in-field, out-of-field, or both). Patients were grouped by treatment assignment (SER; RER/no CR; RER/CR/IFRT; and RER/CR/no IFRT). Summary statistics were reported. At 4-year median follow-up, 244 patients had experienced relapse, 198 of whom were fully evaluable for review. Those who progressed during treatment (n=30) or lacked relapse imaging (n=16) were excluded. The median time to relapse was 12.8 months. Of the 198 evaluable patients, 30% were RER/no CR, 26% were SER, 26% were RER/CR/no IFRT, 16% were RER/CR/IFRT, and 2% remained uncategorized. Relapses involved initially bulky and nonbulky sites in 74% and 75% of cases, respectively. First relapses rarely occurred at exclusively new or out-of-field sites. By contrast, relapses usually occurred at nodal sites of initial bulky and nonbulky disease.
Although response-based therapy has helped define treatment for selected RER patients, it has not improved outcome for SER patients or facilitated refinement of IFRT volumes or doses. Copyright © 2015 Elsevier Inc. All rights reserved.
2008 Niday Perinatal Database quality audit: report of a quality assurance project.
Dunn, S; Bottomley, J; Ali, A; Walker, M
2011-12-01
This quality assurance project was designed to determine the reliability, completeness, and comprehensiveness of the data entered into the Niday Perinatal Database. Data quality was measured by comparing data re-abstracted from the patient record with the original data entered into the Niday Perinatal Database. A representative sample of hospitals in Ontario was selected, and a random sample of 100 linked mother and newborn charts was audited for each site. A subset of 33 variables (representing 96 data fields) from the Niday dataset was chosen for re-abstraction. Of the data fields for which Cohen's kappa statistic or the intraclass correlation coefficient (ICC) was calculated, 44% showed substantial or almost perfect agreement (beyond chance). However, about 17% showed less than 95% agreement and a kappa or ICC value of less than 0.60, indicating only slight, fair, or moderate agreement (beyond chance). Recommendations to improve the quality of these data fields are presented.
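Cohen's kappa, the agreement statistic used in this audit, can be computed from two raters' label lists as follows. This is a generic sketch of the statistic, not the project's actual analysis code.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance, given each rater's label frequencies."""
    assert len(rater_a) == len(rater_b), "both raters must label the same items"
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # observed proportion of exact agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from the marginal label frequencies of each rater
    p_exp = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why thresholds such as 0.60 are used to separate "moderate" from "substantial" agreement.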
Regional management of farmland feeding geese using an ecological prioritization tool.
Madsen, Jesper; Bjerrum, Morten; Tombre, Ingunn M
2014-10-01
Wild geese foraging on farmland cause increasing conflicts with agricultural interests, calling for a strategic approach to mitigation. In central Norway, conflicts between farmers and spring-staging pink-footed geese feeding on pastures have escalated. To alleviate the conflict, a scheme was introduced by which farmers are subsidized to allow geese to forage undisturbed. To guide the allocation of subsidies, an ecologically based ranking of fields at a regional level was recommended and applied. Here we evaluate the scheme. On average, 40 % of subsidized fields were in the top 5 % of the ranking, and 80 % were within the top 20 %. Goose grazing pressure on subsidized pastures was 13 times higher than on a stratified random selection of non-subsidized pastures, and the scheme captured 67 % of the pasture-feeding geese even though subsidized fields comprised only 13 % of the grassland area. Close dialogue between scientists and managers is regarded as key to the success of the scheme.
Potential field cellular automata model for pedestrian flow
NASA Astrophysics Data System (ADS)
Zhang, Peng; Jian, Xiao-Xia; Wong, S. C.; Choi, Keechoo
2012-02-01
This paper proposes a cellular automata model of pedestrian flow that defines a cost potential field, accounting for the costs of travel time and discomfort, for a pedestrian moving to an empty neighboring cell. The formulation is based on a reconstruction of the density distribution and the underlying physics, including the rule for resolving conflicts, which is comparable to that in the floor field cellular automaton model. However, we assume that each pedestrian is familiar with the surroundings and thereby minimizes his or her instantaneous cost. This, in turn, reduces the randomness in selecting a target cell, improving on existing cellular automata models as well as on their computational efficiency. In the presence of two pedestrian groups, distinguished by their destinations, the cost distribution for each group is magnified due to the strong interaction between the two groups. As a typical phenomenon, the formation of lanes in the counter flow is reproduced.
Box-Cox Mixed Logit Model for Travel Behavior Analysis
NASA Astrophysics Data System (ADS)
Orro, Alfonso; Novales, Margarita; Benitez, Francisco G.
2010-09-01
To represent the behavior of travelers deciding how to get to their destination, discrete choice models based on random utility theory have become one of the most widely used tools. The field in which these models were developed lay halfway between econometrics and transport engineering, although the latter now constitutes one of their principal areas of application. In the transport field, they have mainly been applied to mode choice, but also to the selection of destination, route, and other important decisions such as vehicle ownership. In usual practice, the most frequently employed discrete choice models implement a fixed-coefficient utility function that is linear in the parameters. The principal aim of this paper is to present the viability of specifying utility functions with random coefficients that are nonlinear in the parameters, in applications of discrete choice models to transport. Nonlinear specifications in the parameters were present in discrete choice theory at its outset, although they were seldom used in practice until recently. The specification of random coefficients, however, began with the probit and hedonic models in the 1970s and, after a period of apparently little practical interest, has burgeoned into a field of intense activity in recent years with the new generation of mixed logit models. In this communication, we present a Box-Cox mixed logit model, originally developed by the authors. It includes the estimation of the Box-Cox exponents in addition to the parameters of the random coefficient distribution. The probability of choosing an alternative is an integral that is calculated by simulation. The model is estimated by maximizing the simulated log-likelihood of a sample of observed individual choices between alternatives. The differences between the predictions yielded by models that are inconsistent with real behavior have been studied with simulation experiments.
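The core computation, simulating the choice-probability integral by averaging logit probabilities over random coefficient draws applied to a Box-Cox-transformed attribute, can be sketched as below. The one-attribute utility, the normal coefficient distribution, and all names are illustrative simplifications, not the authors' full specification.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform of a positive attribute (lam = 1 is linear; lam -> 0 gives log)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def simulated_choice_prob(attr, beta_mean, beta_sd, lam, n_draws=2000, seed=0):
    """Simulated mixed logit choice probabilities: average the logit formula
    over random draws of a normally distributed taste coefficient applied
    to a Box-Cox-transformed attribute of each alternative."""
    rng = np.random.default_rng(seed)
    betas = rng.normal(beta_mean, beta_sd, size=n_draws)    # coefficient draws
    util = betas[:, None] * boxcox(attr, lam)               # utility per draw, per alternative
    expu = np.exp(util - util.max(axis=1, keepdims=True))   # numerically stable softmax
    probs = expu / expu.sum(axis=1, keepdims=True)          # logit probability per draw
    return probs.mean(axis=0)                               # Monte Carlo estimate of the integral
```

Estimation would then maximize the simulated log-likelihood built from these probabilities over the observed choices, searching over the coefficient-distribution parameters and the Box-Cox exponent jointly.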
Williamson, Scott; Fledel-Alon, Adi; Bustamante, Carlos D
2004-09-01
We develop a Poisson random-field model of polymorphism and divergence that allows arbitrary dominance relations in a diploid context. This model provides a maximum-likelihood framework for estimating both selection and dominance parameters of new mutations using information on the frequency spectrum of sequence polymorphisms. This is the first DNA sequence-based estimator of the dominance parameter. Our model also leads to a likelihood-ratio test for distinguishing nongenic from genic selection; simulations indicate that this test is quite powerful when a large number of segregating sites are available. We also use simulations to explore the bias in selection parameter estimates caused by unacknowledged dominance relations. When inference is based on the frequency spectrum of polymorphisms, genic selection estimates of the selection parameter can be very strongly biased even for minor deviations from the genic selection model. Surprisingly, however, when inference is based on polymorphism and divergence (McDonald-Kreitman) data, genic selection estimates of the selection parameter are nearly unbiased, even for completely dominant or recessive mutations. Further, we find that weak overdominant selection can increase, rather than decrease, the substitution rate relative to levels of polymorphism. This nonintuitive result has major implications for the interpretation of several popular tests of neutrality.
The emergence of collective phenomena in systems with random interactions
NASA Astrophysics Data System (ADS)
Abramkina, Volha
Emergent phenomena are one of the most profound topics in modern science, addressing the ways that collectivities and complex patterns appear due to multiplicity of components and simple interactions. Ensembles of random Hamiltonians allow one to explore emergent phenomena in a statistical way. In this work we adopt a shell model approach with a two-body interaction Hamiltonian. The sets of the two-body interaction strengths are selected at random, resulting in the two-body random ensemble (TBRE). Symmetries such as angular momentum, isospin, and parity entangled with complex many-body dynamics result in surprising order discovered in the spectrum of low-lying excitations. The statistical patterns exhibited in the TBRE are remarkably similar to those observed in real nuclei. Signs of almost every collective feature seen in nuclei, namely, pairing superconductivity, deformation, and vibration, have been observed in random ensembles [3, 4, 5, 6]. In what follows a systematic investigation of nuclear shape collectivities in random ensembles is conducted. The development of the mean field, its geometry, multipole collectivities and their dependence on the underlying two-body interaction are explored. Apart from the role of static symmetries such as SU(2) angular momentum and isospin groups, the emergence of dynamical symmetries including the seniority SU(2), rotational symmetry, as well as the Elliot SU(3) is shown to be an important precursor for the existence of geometric collectivities.
Restoration of dimensional reduction in the random-field Ising model at five dimensions
NASA Astrophysics Data System (ADS)
Fytas, Nikolaos G.; Martín-Mayor, Víctor; Picco, Marco; Sourlas, Nicolas
2017-04-01
The random-field Ising model is one of the few disordered systems where the perturbative renormalization group can be carried out to all orders of perturbation theory. This analysis predicts dimensional reduction, i.e., that the critical properties of the random-field Ising model in D dimensions are identical to those of the pure Ising ferromagnet in D -2 dimensions. It is well known that dimensional reduction is not true in three dimensions, thus invalidating the perturbative renormalization group prediction. Here, we report high-precision numerical simulations of the 5D random-field Ising model at zero temperature. We illustrate universality by comparing different probability distributions for the random fields. We compute all the relevant critical exponents (including the critical slowing down exponent for the ground-state finding algorithm), as well as several other renormalization-group invariants. The estimated values of the critical exponents of the 5D random-field Ising model are statistically compatible to those of the pure 3D Ising ferromagnet. These results support the restoration of dimensional reduction at D =5 . We thus conclude that the failure of the perturbative renormalization group is a low-dimensional phenomenon. We close our contribution by comparing universal quantities for the random-field problem at dimensions 3 ≤D <6 to their values in the pure Ising model at D -2 dimensions, and we provide a clear verification of the Rushbrooke equality at all studied dimensions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kan, Jimmy J.; Gottwald, Matthias; Fullerton, Eric E.
We describe low-temperature characterization of magnetic tunnel junctions (MTJs) patterned by reactive ion etching for spin-transfer-torque magnetic random access memory. Magnetotransport measurements of typical MTJs show increasing tunneling magnetoresistance (TMR) and larger coercive fields as temperature is decreased down to 10 K. However, MTJs selected from the high-resistance population of an MTJ array exhibit stable intermediate magnetic states when measured at low temperature and show TMR roll-off below 100 K. These non-ideal low-temperature behaviors arise from edge damage during the etch process and can have negative impacts on thermal stability of the MTJs.
2017-08-30
stained cells in five randomly selected fields for each slide. ELISA: conditioned media from cell lines or mice sera diluted in carbonate coating buffer... ELISA. Our results showed that the Pter/SAHA combination treatment was more effective in suppressing VEGF-c and IL-1β circulating levels compared to... (3) SAHA (50 mg/kg; n = 5), and (4) Pter + SAHA (10 mg/kg and 50 mg/kg; n = 7) by i.p. injections for 10 weeks were quantitatively analyzed by ELISA
Mitigating Upsets in SRAM-Based FPGAs from the Xilinx Virtex 2 Family
NASA Technical Reports Server (NTRS)
Swift, G. M.; Yui, C. C.; Carmichael, C.; Koga, R.; George, J. S.
2003-01-01
Static random access memory (SRAM) upset rates in field programmable gate arrays (FPGAs) from the Xilinx Virtex 2 family have been tested for radiation effects on configuration memory, block RAM and the power-on-reset (POR) and SelectMAP single event functional interrupts (SEFIs). Dynamic testing has shown the effectiveness and value of Triple Module Redundancy (TMR) and partial reconfiguration when used in conjunction. Continuing dynamic testing for more complex designs and other Virtex 2 capabilities (i.e., I/O standards, digital clock managers (DCM), etc.) is scheduled.
NASA Astrophysics Data System (ADS)
Fierro, Annalisa; Cocozza, Sergio; Monticelli, Antonella; Scala, Giovanni; Miele, Gennaro
2017-06-01
The presence of phenomena analogous to phase transitions in statistical mechanics has been suggested in the evolution of a polygenic trait under stabilizing selection, mutation, and genetic drift. Using numerical simulations of a model system, we analyze the evolution of a population of N diploid hermaphrodites in a random mating regime. The population evolves under the effect of drift, selective pressure in the form of viability on an additive polygenic trait, and mutation. The analysis allows one to determine a phase diagram in the plane of mutation rate and strength of selection. The resulting pattern of phase transitions is characterized by a line of critical points for weak selective pressure (smaller than a threshold), whereas discontinuous phase transitions, characterized by metastable hysteresis, are observed for strong selective pressure. A finite-size scaling analysis suggests an analogy between our system and the mean-field Ising model for selective pressure approaching the threshold from weaker values. In this framework, the mutation rate, which allows the system to explore the accessible microscopic states, is the parameter controlling the transition from large heterozygosity (disordered phase) to small heterozygosity (ordered phase).
Motion of polymer cholesteric liquid crystal flakes in an electric field
NASA Astrophysics Data System (ADS)
Kosc, Tanya Zoriana
Polymer cholesteric liquid crystal (PCLC) flakes suspended in a host fluid can be manipulated with an electric field. Controlling a flake's orientation provides the opportunity to change and control the amount of selective reflection from the flake surface. Flake motion results from charge accumulation and an induced dipole moment established due to Maxwell-Wagner polarization. The type of flake behavior, whether random motion or uniform reorientation, depends upon the dielectric properties of the host fluid, which in turn dictate whether a DC or an AC electric field must be applied. PCLC flakes suspended in highly dielectric silicone oil host fluids tend to move randomly in the presence of a DC electric field, and no motion is seen in AC fields. Flakes suspended in a moderately conductive host fluid reorient 90° in the presence of an AC field within a specific frequency range. The flake shape and size are also important parameters that need to be controlled in order to produce uniform motion. Several methods for patterning flakes were investigated and identical square flakes were produced. Square PCLC flakes (80 μm sides) suspended in propylene carbonate reorient in 400 ms when a 40 mVrms/μm field at 70 Hz is applied to the test device. Theoretical modeling supported experimental observations well, particularly in identifying the inverse quadratic dependence on the applied electric field and the electric field frequency dependence that is governed by the host fluid conductivity. Future goals and suggested experiments are provided, as well as an explanation and comparison of possible commercial applications for PCLC flakes. This research has resulted in one patent application and a series of invention disclosures that could place this research group and any industrial collaborators in a strong position to pursue commercial applications, particularly in the area of displays, and more specifically, electronic paper.
Mathematical models of cell factories: moving towards the core of industrial biotechnology.
Cvijovic, Marija; Bordel, Sergio; Nielsen, Jens
2011-09-01
Industrial biotechnology involves the utilization of cell factories for the production of fuels and chemicals. Traditionally, the development of highly productive microbial strains has relied on random mutagenesis and screening. The development of predictive mathematical models provides a new paradigm for the rational design of cell factories. Instead of selecting among a set of strains resulting from random mutagenesis, mathematical models allow the researchers to predict in silico the outcomes of different genetic manipulations and engineer new strains by performing gene deletions or additions leading to a higher productivity of the desired chemicals. In this review we aim to summarize the main modelling approaches of biological processes and illustrate the particular applications that they have found in the field of industrial microbiology. © 2010 The Authors. Journal compilation © 2010 Society for Applied Microbiology and Blackwell Publishing Ltd.
Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.
Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M
2005-11-01
We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
Application of a multipurpose unequal probability stream survey in the Mid-Atlantic Coastal Plain
Ator, S.W.; Olsen, A.R.; Pitchford, A.M.; Denver, J.M.
2003-01-01
A stratified, spatially balanced sample with unequal probability selection was used to design a multipurpose survey of headwater streams in the Mid-Atlantic Coastal Plain. Objectives for the survey include unbiased estimates of regional stream conditions, and adequate coverage of unusual but significant environmental settings to support empirical modeling of the factors affecting those conditions. The design and field application of the survey are discussed in light of these multiple objectives. A probability (random) sample of 175 first-order nontidal streams was selected for synoptic sampling of water chemistry and benthic and riparian ecology during late winter and spring 2000. Twenty-five streams were selected within each of seven hydrogeologic subregions (strata) that were delineated on the basis of physiography and surficial geology. In each subregion, unequal inclusion probabilities were used to provide an approximately even distribution of streams along a gradient of forested to developed (agricultural or urban) land in the contributing watershed. Alternate streams were also selected. Alternates were included in groups of five in each subregion when field reconnaissance demonstrated that primary streams were inaccessible or otherwise unusable. Despite the rejection and replacement of a considerable number of primary streams during reconnaissance (up to 40 percent in one subregion), the desired land use distribution was maintained within each hydrogeologic subregion without sacrificing the probabilistic design.
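Unequal-probability selection of the kind described above can be sketched as a sequential weighted draw without replacement, with draw weights chosen to favor under-represented land-use settings. This is an illustration only; the survey's actual design also enforces stratification and spatial balance, which this sketch omits, and all names are assumptions.

```python
import random

def unequal_prob_sample(units, n, weight, seed=1):
    """Draw n units without replacement, with selection probability at each
    step proportional to weight(unit) (simple sequential weighted sampling)."""
    rng = random.Random(seed)
    pool = list(units)
    chosen = []
    for _ in range(n):
        weights = [weight(u) for u in pool]          # recompute over remaining pool
        idx = rng.choices(range(len(pool)), weights=weights)[0]
        chosen.append(pool.pop(idx))                 # remove so units are not re-drawn
    return chosen

# Hypothetical frame: streams tagged with a developed-land score 0-4;
# weighting up higher scores evens out the forested-to-developed gradient.
streams = [{"id": i, "developed": i % 5} for i in range(50)]
sample = unequal_prob_sample(streams, 10, weight=lambda u: 1 + u["developed"])
```

Replacing rejected primary units with pre-selected alternates, as the survey did, preserves the intended inclusion probabilities far better than ad hoc substitution in the field.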
Azzam, O; Yambao, M L; Muhsin, M; McNally, K L; Umadhay, K M
2000-01-01
The two adjacent genes of coat protein 1 and 2 of rice tungro spherical virus (RTSV) were amplified from total RNA extracts of serologically indistinguishable field isolates from the Philippines and Indonesia, using reverse transcriptase polymerase chain reaction (RT-PCR). Digestion with HindIII and BstYI restriction endonucleases differentiated the amplified DNA products into eight distinct coat protein genotypes. These genotypes were then used as indicators of virus diversity in the field. Inter- and intra-site diversities were determined over three cropping seasons. At each of the sites surveyed, one or two main genotypes prevailed together with other related minor or mixed genotypes that did not replace the main genotype over the sampling time. The cluster of genotypes found at the Philippines sites was significantly different from the one at the Indonesia sites, suggesting geographic isolation for virus populations. Phylogenetic studies based on the nucleotide sequences of 38 selected isolates confirm the spatial distribution of RTSV virus populations but show that gene flow may occur between populations. Under the present conditions, rice varieties do not seem to exert selective pressure on the virus populations. Based on the selective constraints in the coat protein amino acid sequences and the virus genetic composition per site, a negative selection model followed by random-sampling events due to vector transmissions is proposed to explain the inter-site diversity observed.
Random walk study of electron motion in helium in crossed electromagnetic fields
NASA Technical Reports Server (NTRS)
Englert, G. W.
1972-01-01
Random walk theory, previously adapted to electron motion in the presence of an electric field, is extended to include a transverse magnetic field. In principle, the random walk approach avoids mathematical complexity and concomitant simplifying assumptions and permits determination of energy distributions and transport coefficients within the accuracy of available collisional cross section data. Application is made to a weakly ionized helium gas. Time of relaxation of electron energy distribution, determined by the random walk, is described by simple expressions based on energy exchange between the electron and an effective electric field. The restrictive effect of the magnetic field on electron motion, which increases the required number of collisions per walk to reach a terminal steady state condition, as well as the effect of the magnetic field on electron transport coefficients and mean energy can be quite adequately described by expressions involving only the Hall parameter.
Abuasbi, Falastine; Lahham, Adnan; Abdel-Raziq, Issam Rashid
2018-05-01
In this study, levels of extremely low-frequency electric and magnetic fields originating from overhead power lines were investigated in the outdoor environment in Ramallah city, Palestine. Spot measurements were applied to record field intensities over a 6-min period. The spectrum analyzer NF-5035 was used to perform measurements at 1 m above ground level and directly underneath 40 randomly selected power lines distributed fairly within the city. Levels of electric fields varied depending on the line's category (power line, transformer, or distributor); a minimum mean electric field of 3.9 V/m was found under a distributor line, and a maximum of 769.4 V/m under a high-voltage power line (66 kV). The measured electric fields followed a log-normal distribution with a geometric mean of 35.9 V/m and a geometric standard deviation of 2.8. Magnetic fields measured at power lines, in contrast, were not log-normally distributed; the minimum and maximum mean magnetic fields under power lines were 0.89 and 3.5 μT, respectively. None of the measured fields exceeded the ICNIRP guidelines recommended for general public exposure to extremely low-frequency fields.
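The geometric mean and geometric standard deviation used to summarize log-normally distributed field levels are computed from the logarithms of the measurements, e.g. (a generic sketch, not the study's analysis code):

```python
import math

def geometric_stats(values):
    """Geometric mean and geometric standard deviation of positive data:
    arithmetic mean and standard deviation on the log scale, exponentiated."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((x - mean) ** 2 for x in logs) / (n - 1)   # sample variance of the logs
    return math.exp(mean), math.exp(math.sqrt(var))
```

Note that the geometric standard deviation is a dimensionless multiplicative factor: about two-thirds of a log-normal sample lies between GM/GSD and GM×GSD.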
Persistence and Lifelong Fidelity of Phase Singularities in Optical Random Waves.
De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L
2017-11-17
Phase singularities are locations where light is twisted like a corkscrew, with positive or negative topological charge depending on the twisting direction. Among the multitude of singularities arising in random wave fields, some can be found at the same location, but only when they exhibit opposite topological charge, which results in their mutual annihilation. New pairs can be created as well. With near-field experiments supported by theory and numerical simulations, we study the persistence and pairing statistics of phase singularities in random optical fields as a function of the excitation wavelength. We demonstrate how such entities can encrypt fundamental properties of the random fields in which they arise.
Random scalar fields and hyperuniformity
NASA Astrophysics Data System (ADS)
Ma, Zheng; Torquato, Salvatore
2017-06-01
Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.
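Constructing a Gaussian random field by spectral filtering of white noise and then level-cutting it into a two-phase medium, as discussed above, can be sketched as follows. The grid size and power-law exponent are illustrative assumptions, and this is not the paper's code; the abstract notes that such thresholding generally destroys the hyperuniformity of the progenitor field.

```python
import numpy as np

def gaussian_field(n=128, beta=2.0, seed=0):
    """2D Gaussian random field: filter white noise in Fourier space with an
    amplitude filter k**(-beta/2), i.e. a power-law spectrum S(k) ~ k**(-beta)."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(size=(n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                       # placeholder to avoid division by zero at k = 0
    filt = k ** (-beta / 2)             # amplitude filter = sqrt of the spectrum
    filt[0, 0] = 0.0                    # zero out the mean (k = 0) mode
    return np.fft.ifft2(np.fft.fft2(noise) * filt).real

def threshold(field, level=0.0):
    """Level-cut the scalar field into a two-phase (0/1) medium."""
    return (field > level).astype(int)
```

Comparing the small-wavenumber behavior of the spectral density before and after thresholding is one direct way to see the loss of hyperuniformity the authors describe.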
Feldon, Steven E
2004-01-01
Purpose: To validate a computerized expert system evaluating visual fields in a prospective clinical trial, the Ischemic Optic Neuropathy Decompression Trial (IONDT), and to identify the pattern and within-pattern severity of field defects for study eyes at baseline and 6-month follow-up. Design: Humphrey visual field (HVF) change was used as the outcome measure for a prospective, randomized, multi-center trial to test the null hypothesis that optic nerve sheath decompression was ineffective in treating nonarteritic anterior ischemic optic neuropathy and to ascertain the natural history of the disease. Methods: An expert panel established criteria for the type and severity of visual field defects. Using these criteria, a rule-based computerized expert system interpreted HVFs from baseline and 6-month visits for patients randomized to surgery or careful follow-up and for patients who were not randomized. Results: A computerized expert system was devised and validated. The system was then used to analyze HVFs. The pattern of defects found at baseline for patients randomized to surgery did not differ from that of patients randomized to careful follow-up. The most common pattern of defect was a superior and inferior arcuate with central scotoma for randomized eyes (19.2%) and a superior and inferior arcuate for nonrandomized eyes (30.6%). Field patterns at 6 months and baseline were not different. For randomized study eyes, the superior altitudinal defects improved (P = .03), as did the inferior altitudinal defects (P = .01). For nonrandomized study eyes, only the inferior altitudinal defects improved (P = .02). No treatment effect was noted. Conclusions: A novel rule-based expert system successfully interpreted visual field defects at baseline of eyes enrolled in the IONDT. PMID:15747764
Plazier, Mark; Ost, Jan; Stassijns, Gaëtane; De Ridder, Dirk; Vanneste, Sven
2015-01-01
Fibromyalgia is a condition characterized by widespread chronic pain. Due to its high prevalence and high costs, it places a substantial burden on society. Treatment results are diverse and help only a small subset of patients. C2 nerve field stimulation, also known as occipital nerve stimulation, is a helpful, minimally invasive treatment for primary headache syndromes, and small pilot studies suggest it may also benefit fibromyalgia. Forty patients were implanted with a subcutaneous electrode in the C2 dermatome as part of a prospective, double-blind, randomized, controlled cross-over study followed by an open-label follow-up period of 6 months. The patients underwent 2-week periods of different doses of stimulation consisting of minimal (.1 mA), subthreshold, and suprathreshold (for paresthesias) stimulation in randomized order. Twenty-seven patients received a permanent implant and 25 completed the 6-month open-label follow-up period. During the 6-week trial phase of the study, patients had an overall decrease of 36% on the fibromyalgia impact questionnaire (FIQ), a 33% decrease in fibromyalgia pain, and a 42% improvement in the impact on daily life activities and quality. These results imply an overall improvement in disease burden, maintained at 6-month follow-up, as well as a 50% improvement in quality of life. Seventy-six percent of patients were satisfied or very satisfied with their treatment. There seems to be a dose-response curve, with increasing amplitudes leading to better clinical outcomes. Subcutaneous C2 nerve field stimulation appears to offer a safe and effective treatment option for selected medically intractable patients with fibromyalgia. Copyright © 2015 Elsevier Inc. All rights reserved.
An Overview of Randomization and Minimization Programs for Randomized Clinical Trials
Saghaei, Mahmoud
2011-01-01
Randomization is an essential component of sound clinical trials; it prevents selection biases and helps in blinding the allocations. Randomization is a process by which subsequent subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A potentially serious consequence of randomization is severe imbalance among the treatment groups with respect to some prognostic factors, which can invalidate the trial results or necessitate complex and usually unreliable secondary analyses to eradicate the source of the imbalances. Minimization, on the other hand, tends to allocate in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic; that is, one can predict the allocation of the next subject by knowing the factor levels of previously enrolled subjects and the characteristics of the next subject. To reduce this predictability, it is necessary to include some element of randomness in the minimization algorithm. In this article brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs. PMID:22606659
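The Pocock-Simon-style minimization described above, with a biased-coin element restoring some unpredictability, can be sketched as follows; the function name, data layout, and the probability p are illustrative assumptions, not taken from any specific program reviewed in the article.

```python
import random

def minimized_allocation(new_subject, enrolled, groups=("A", "B"), p=0.8, rng=random):
    """Minimization sketch: allocate the new subject to the group that
    minimizes total imbalance across prognostic factors; with probability
    1 - p, allocate at random instead to reduce predictability.
    `enrolled` is a list of (factor_dict, group) pairs."""
    def imbalance_if(group):
        total = 0
        for factor, level in new_subject.items():
            # Count subjects sharing this factor level in each group
            counts = {g: sum(1 for s, sg in enrolled
                             if s.get(factor) == level and sg == g)
                      for g in groups}
            counts[group] += 1  # hypothetically add the new subject
            total += max(counts.values()) - min(counts.values())
        return total
    best = min(groups, key=imbalance_if)
    if rng.random() < p:
        return best
    return rng.choice([g for g in groups if g != best])
```

With p = 1 the scheme is pure (deterministic) minimization; lowering p trades balance for unpredictability.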
Video game training to improve selective visual attention in older adults
Belchior, Patrícia; Marsiske, Michael; Sisco, Shannon M.; Yam, Anna; Bavelier, Daphne; Ball, Karlene; Mann, William C.
2013-01-01
The current study investigated the effect of video game training on older adults' useful field of view performance (the UFOV® test). Fifty-eight older adult participants were randomized to receive practice with the target action game (Medal of Honor), a placebo control arcade game (Tetris), a clinically validated UFOV training program, or to a no-contact control group. Examining pretest–posttest change in selective visual attention, the UFOV training group improved significantly more than the game groups; all three intervention groups improved significantly more than the no-contact controls. There was no difference between the two game conditions, a departure from findings with younger adults. The discussion considers whether games posing less challenge might still be effective interventions for elders, and whether optimal training dosages should be higher. PMID:24003265
[Comparison on agronomy and quality characters of selective strain of Schizonepeta tenuifolia].
Cao, Liang; Jin, Yue; Wei, Jianhe; Chu, Qinglong; Zhao, Runhuai; Wang, Weiquan
2009-05-01
With the purpose of selecting strains of Schizonepeta tenuifolia with adequate quality and high production, comparative experiments were carried out on different strains of S. tenuifolia in 2007. The test fields were divided into blocks randomly, and the agronomic characters were investigated at harvest time; the content of volatile oil was measured by steam distillation and pulegone was determined by HPLC. The yield of S4 was 18.63% and 29.99% higher than that of CK1 and CK2, respectively. The contents of volatile oil and pulegone were also higher than those of the CKs and the other strains in this test. S4 shows the advantages of high production, strong disease resistance, and high levels of active components, and could be extended as a good breed in production.
Random Assignment: Practical Considerations from Field Experiments.
ERIC Educational Resources Information Center
Dunford, Franklyn W.
1990-01-01
Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…
New constraints on modelling the random magnetic field of the MW
NASA Astrophysics Data System (ADS)
Beck, Marcus C.; Beck, Alexander M.; Beck, Rainer; Dolag, Klaus; Strong, Andrew W.; Nielaba, Peter
2016-05-01
We extend the description of the isotropic and anisotropic random components of the small-scale magnetic field within the existing magnetic field model of the Milky Way from Jansson & Farrar, by including random realizations of the small-scale component. Using a magnetic-field power spectrum with Gaussian random fields, the NE2001 model for the thermal electrons and the Galactic cosmic-ray electron distribution from the current GALPROP model, we derive full-sky maps for the total and polarized synchrotron intensity as well as the Faraday rotation-measure distribution. While previous work assumed that small-scale fluctuations average out along the line of sight or only computed ensemble averages of random fields, we show that these fluctuations need to be carefully taken into account. Comparing with observational data we obtain not only good agreement with the 408 MHz total and WMAP7 22 GHz polarized intensity emission maps, but also an improved agreement with Galactic foreground rotation-measure maps and power spectra, whose amplitude and shape strongly depend on the parameters of the random field. We demonstrate that a correlation length of ≈22 pc (5 pc being a 5σ lower limit) is needed to match the slope of the observed power spectrum of Galactic foreground rotation-measure maps. Using multiple realizations also allows us to infer errors on individual observables. We find that previously used amplitudes for the random and anisotropic random magnetic field components need to be rescaled by factors of ≈0.3 and 0.6 to account for the new small-scale contributions. Our model predicts a rotation measure of -2.8±7.1 rad/m² and +4.4±11.0 rad/m² for the north and south Galactic poles, respectively, in good agreement with observations. Applying our model to deflections of ultra-high-energy cosmic rays we infer a mean deflection of ≈3.5±1.1 degrees for 60 EeV protons arriving from Cen A.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
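A linear congruential generator of the general form used by the RANDOM program can be sketched in a few lines. This is a generic illustration using the well-known Numerical Recipes constants, not the parameters selected in the report itself:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    The constants a, c, m here are the Numerical Recipes parameters,
    chosen only for illustration."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale to uniform floats in [0, 1)
```

Given the same seed, the sequence is fully reproducible, which is exactly why careful parameter selection (period length, lattice structure) matters for an LCG.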
Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch
2017-06-06
An important aspect of chemoinformatics and materials informatics is the use of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development, and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate of the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in materials informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
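The core RANSAC loop, fitting a model to a random minimal sample and keeping the model with the most inliers, can be illustrated with a simple line-fitting sketch. Names and the tolerance are assumptions for the example; the paper applies the idea to QSAR models, not to lines:

```python
import random

def ransac_line(points, n_iter=200, tol=0.5, seed=1):
    """Fit y = a*x + b by RANSAC: repeatedly fit a line to a random
    minimal sample (two points) and keep the model with the most inliers,
    so outliers are excluded from the consensus set."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # degenerate sample, cannot define a slope
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers
```

The returned inlier set plays the role of the cleaned dataset; points outside the tolerance band are treated as outliers and removed.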
Hansen, Adam G.; Beauchamp, David A.
2014-01-01
Most predators eat only a subset of possible prey. However, studies evaluating diet selection rarely measure prey availability in a manner that accounts for temporal–spatial overlap with predators, the sensory mechanisms employed to detect prey, and constraints on prey capture. We evaluated the diet selection of cutthroat trout (Oncorhynchus clarkii) feeding on a diverse planktivore assemblage in Lake Washington to test the hypothesis that the diet selection of piscivores would reflect random (opportunistic) as opposed to non-random (targeted) feeding, after accounting for predator–prey overlap, visual detection and capture constraints. Diets of cutthroat trout were sampled in autumn 2005, when the abundance of transparent, age-0 longfin smelt (Spirinchus thaleichthys) was low, and 2006, when the abundance of smelt was nearly seven times higher. Diet selection was evaluated separately using depth-integrated and depth-specific (accounting for predator–prey overlap) prey abundance. The abundance of different prey was then adjusted for differences in detectability and vulnerability to predation to see whether these factors could explain diet selection. In 2005, cutthroat trout fed non-randomly by selecting against the smaller, transparent age-0 longfin smelt, but for the larger age-1 longfin smelt. After adjusting prey abundance for visual detection and capture, cutthroat trout fed randomly. In 2006, depth-integrated and depth-specific abundance explained the diets of cutthroat trout well, indicating random feeding. Feeding became non-random after adjusting for visual detection and capture. Cutthroat trout selected strongly for age-0 longfin smelt, but against similar-sized threespine stickleback (Gasterosteus aculeatus) and larger age-1 longfin smelt in 2006. Overlap with juvenile sockeye salmon (O. nerka) was minimal in both years, and sockeye salmon were rare in the diets of cutthroat trout. The direction of the shift between random and non-random selection depended on the presence of a weak versus a strong year class of age-0 longfin smelt. These fish were easy to catch, but hard to see. When their density was low, poor detection could explain their rarity in the diet. When their density was high, poor detection was compensated by higher encounter rates with cutthroat trout, sufficient to elicit a targeted feeding response. The nature of the feeding selectivity of a predator can be highly dependent on fluctuations in the abundance and suitability of key prey.
Group Counseling With Emotionally Disturbed School Children in Taiwan.
ERIC Educational Resources Information Center
Chiu, Peter
The application of group counseling to emotionally disturbed school children in Chinese culture was examined. Two junior high schools located in Tao-Yuan Province were randomly selected with two eighth-grade classes randomly selected from each school. Ten emotionally disturbed students were chosen from each class and randomly assigned to two…
Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling
ERIC Educational Resources Information Center
Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah
2014-01-01
Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
On Measuring and Reducing Selection Bias with a Quasi-Doubly Randomized Preference Trial
ERIC Educational Resources Information Center
Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean
2017-01-01
Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…
Baldissera, Sandro; Ferrante, Gianluigi; Quarchioni, Elisa; Minardi, Valentina; Possenti, Valentina; Carrozzi, Giuliano; Masocco, Maria; Salmaso, Stefania
2014-04-01
Field substitution of nonrespondents can be used to maintain the planned sample size and structure in surveys but may introduce additional bias. Sample weighting is suggested as the preferable alternative; however, limited empirical evidence exists comparing the two methods. We wanted to assess the impact of substitution on surveillance results using data from Progressi delle Aziende Sanitarie per la Salute in Italia-Progress by Local Health Units towards a Healthier Italy (PASSI). PASSI is conducted by Local Health Units (LHUs) through telephone interviews of stratified random samples of residents. Nonrespondents are replaced with substitutes randomly preselected in the same LHU stratum. We compared the weighted estimates obtained in the original PASSI sample (used as a reference) and in the substitutes' sample. The differences were evaluated using a Wald test. In 2011, 50,697 units were selected: 37,252 were from the original sample and 13,445 were substitutes; 37,162 persons were interviewed. The initially planned size and demographic composition were restored. No significant differences in the estimates between the original and the substitutes' sample were found. In our experience, field substitution is an acceptable method for dealing with nonresponse, maintaining the characteristics of the original sample without affecting the results. This evidence can support appropriate decisions about planning and implementing a surveillance system. Copyright © 2014 Elsevier Inc. All rights reserved.
Selection bias in studies of human reproduction-longevity trade-offs.
Helle, Samuli
2017-12-13
A shorter lifespan as a potential cost of high reproductive effort in humans has intrigued researchers for more than a century. However, the results have been inconclusive so far, and despite strong theoretical expectations we do not currently have compelling evidence for the longevity costs of reproduction. Using Monte Carlo simulation, it is shown here that a common practice in human reproduction-longevity studies using historical data (the most relevant data sources for this question), the omission of women who died prior to menopausal age from the analysis, results in severe underestimation of the potential underlying trade-off between reproduction and lifespan. In other words, assuming that such a trade-off is expressed also during reproductive years, the estimated strength of the trade-off between reproduction and lifespan is progressively weakened when women dying during reproductive ages are sequentially and non-randomly excluded from the analysis. With small sample sizes (e.g., a few hundred observations), this selection bias may, by reducing statistical power, even partly explain the null results commonly found in this field. Future studies in this field should thus apply statistical approaches that account for or avoid selection bias in order to recover reliable effect size estimates between reproduction and longevity. © 2017 The Author(s).
Effects of feature-selective and spatial attention at different stages of visual processing.
Andersen, Søren K; Fuchs, Sandra; Müller, Matthias M
2011-01-01
We investigated mechanisms of concurrent attentional selection of location and color using electrophysiological measures in human subjects. Two completely overlapping random dot kinematograms (RDKs) of two different colors were presented on either side of a central fixation cross. On each trial, participants attended one of these four RDKs, defined by its specific combination of color and location, in order to detect coherent motion targets. Sustained attentional selection while monitoring for targets was measured by means of steady-state visual evoked potentials (SSVEPs) elicited by the frequency-tagged RDKs. Attentional selection of transient targets and distractors was assessed by behavioral responses and by recording event-related potentials to these stimuli. Spatial attention and attention to color had independent and largely additive effects on the amplitudes of SSVEPs elicited in early visual areas. In contrast, behavioral false alarms and feature-selective modulation of P3 amplitudes to targets and distractors were limited to the attended location. These results suggest that feature-selective attention produces an early, global facilitation of stimuli having the attended feature throughout the visual field, whereas the discrimination of target events takes place at a later stage of processing that is only applied to stimuli at the attended position.
Robotic Surgery in Gynecology: An Updated Systematic Review
Weinberg, Lori; Rao, Sanjay; Escobar, Pedro F.
2011-01-01
The introduction of da Vinci robotic surgery to the field of gynecology has resulted in large changes in surgical management. The robotic platform allows less experienced laparoscopic surgeons to perform more complex procedures. In general gynecology and reproductive gynecology, the robot is being increasingly used for procedures such as hysterectomies, myomectomies, adnexal surgery, and tubal anastomosis. In urogynecology, the robot is being utilized for sacrocolpopexies. In the field of gynecologic oncology, the robot is being increasingly used for hysterectomies and lymphadenectomies in oncologic diseases. Despite the rapid and widespread adoption of robotic surgery in gynecology, there are no randomized trials comparing its efficacy and safety to other traditional surgical approaches. Our aim is to update previously published reviews with a focus on comparative observational studies only. We determined that, with the right amount of training and skill, along with appropriate patient selection, robotic surgery can be highly advantageous. Patients will likely have less blood loss, less post-operative pain, faster recoveries, and fewer complications compared to open surgery and potentially even laparoscopy. However, until larger, well-designed observational studies or randomized controlled trials that report long-term outcomes are completed, we cannot definitively state the superiority of robotic surgery over other surgical methods. PMID:22190948
Enhancing gene regulatory network inference through data integration with markov random fields
Banf, Michael; Rhee, Seung Y.
2017-02-01
A gene regulatory network links transcription factors to their target genes and represents a map of transcriptional regulation. Much progress has been made in deciphering gene regulatory networks computationally. However, gene regulatory network inference for most eukaryotic organisms remains challenging. To improve the accuracy of gene regulatory network inference and facilitate candidate selection for experimentation, we developed an algorithm called GRACE (Gene Regulatory network inference ACcuracy Enhancement). GRACE exploits biological a priori knowledge and heterogeneous data integration to generate high-confidence network predictions for eukaryotic organisms using Markov Random Fields in a semi-supervised fashion. GRACE uses a novel optimization scheme to integrate regulatory evidence and biological relevance. It is particularly suited for model learning with sparse regulatory gold standard data. We show GRACE's potential to produce high-confidence regulatory networks compared to state-of-the-art approaches using Drosophila melanogaster and Arabidopsis thaliana data. In an A. thaliana developmental gene regulatory network, GRACE recovers cell cycle related regulatory mechanisms and further hypothesizes several novel regulatory links, including a putative control mechanism of vascular structure formation due to modifications in cell proliferation.
Spatial Distribution of Phase Singularities in Optical Random Vector Waves.
De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L
2016-08-26
Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.
NASA Astrophysics Data System (ADS)
Yüksel, Yusuf
2018-05-01
We propose an atomistic model and present Monte Carlo simulation results regarding the influence of FM/AF interface structure on the hysteresis mechanism and exchange bias behavior of a spin-valve-type FM/FM/AF magnetic junction. We simulate perfectly flat and roughened interface structures, both with uncompensated interfacial AF moments. In order to simulate the rough-interface effect, we introduce the concept of a random exchange anisotropy field induced at the interface and acting on the interfacial AF spins. Our results show that different types of random-field distribution for the anisotropy field may lead to different exchange bias behavior.
Testing for a genetic response to sexual selection in a wild Drosophila population.
Gosden, T P; Thomson, J R; Blows, M W; Schaul, A; Chenoweth, S F
2016-06-01
In accordance with the consensus that sexual selection is responsible for the rapid evolution of display traits on macroevolutionary scales, microevolutionary studies suggest sexual selection is a widespread and often strong form of directional selection in nature. However, empirical evidence for the contemporary evolution of sexually selected traits via sexual rather than natural selection remains weak. In this study, we used a novel application of quantitative genetic breeding designs to test for a genetic response to sexual selection on eight chemical display traits from a field population of the fly, Drosophila serrata. Using our quantitative genetic approach, we were able to detect a genetically based difference in means between groups of males descended from fathers who had either successfully sired offspring or were randomly collected from the same wild population for one of these display traits, the diene (Z,Z)-5,9-C27:2. Our experimental results, in combination with previous laboratory studies on this system, suggest that both natural and sexual selection may be influencing the evolutionary trajectories of these traits in nature, limiting the capacity for a contemporary evolutionary response. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.
A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin
NASA Astrophysics Data System (ADS)
Blaschek, Michael; Duttmann, Rainer
2015-04-01
The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design was applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.
The selection of sample point locations was done using ESRI software (ArcGIS) extended by Hawth's Tools and, later, its replacement, the Geospatial Modelling Environment (GME). 88% of all desired points could actually be reached in the field and were successfully sampled. Our results indicate that the sampled calibration and validation sets are representative of each other and could be successfully used as interpolation data for spatial prediction purposes. With respect to soil textural fractions, for instance, equal multivariate means and variance homogeneity were found for the two datasets, as evidenced by non-significant (P > 0.05) Hotelling T²-test (2.3 with df1 = 3, df2 = 193) and Bartlett's test statistics (6.4 with df = 6). The multivariate prediction of clay, silt and sand content using a neural network residual cokriging approach reached explained variance levels of 56%, 47% and 63%, respectively. Thus, the presented case study is a successful example of considering readily available continuous information on soil forming factors, such as geology and relief, as stratifying variables for designing sampling schemes in digital soil mapping projects.
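The two-stage scheme described above (randomly pick polygons within each stratum, then random points inside them) can be sketched roughly as follows. For simplicity this illustration draws points from polygon bounding boxes rather than true polygon geometry, and all names and the data layout are hypothetical:

```python
import random

def two_stage_sample(polygons, n_polys=6, pts_per_poly=1, min_area=1.0, seed=0):
    """Stage 1: randomly pick up to n_polys polygons per stratum, skipping
    those below min_area (in ha). Stage 2: pick random point(s) in each.
    `polygons` maps stratum -> list of (area_ha, (xmin, ymin, xmax, ymax))."""
    rng = random.Random(seed)
    sample = []
    for stratum, polys in polygons.items():
        eligible = [p for p in polys if p[0] >= min_area]  # area exclusion rule
        for area, (x0, y0, x1, y1) in rng.sample(eligible, min(n_polys, len(eligible))):
            for _ in range(pts_per_poly):
                pt = (rng.uniform(x0, x1), rng.uniform(y0, y1))
                sample.append((stratum, pt))
    return sample
```

A real implementation would additionally apply the buffer-based exclusion of roads and buildings, and test points against the actual polygon boundary rather than its bounding box.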
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high-dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large proportion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and exclude the vast number of non-informative SNPs. However, this is too time-consuming and impractical for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and warrant further biological investigation.
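The subspace step the abstract describes (equal-width binning of an informativeness score, then an equal random draw from every bin) can be sketched as follows. The group count, scores and subspace size are illustrative, not the paper's settings:

```python
import random

def stratified_subspace(snp_scores, n_groups=3, per_group=2, seed=0):
    """Equal-width binning of SNP informativeness scores, then an
    equal random draw from every bin to form one tree's feature subspace."""
    rng = random.Random(seed)
    lo, hi = min(snp_scores.values()), max(snp_scores.values())
    width = (hi - lo) / n_groups or 1.0
    groups = [[] for _ in range(n_groups)]
    for snp, score in snp_scores.items():
        g = min(int((score - lo) / width), n_groups - 1)  # clamp the top edge
        groups[g].append(snp)
    subspace = []
    for g in groups:
        if g:
            subspace += rng.sample(g, min(per_group, len(g)))
    return subspace

scores = {f"rs{i}": i / 10.0 for i in range(30)}   # toy informativeness scores
print(stratified_subspace(scores))
```

Unlike plain random subspace selection, every tree is guaranteed a quota of SNPs from the most informative bin.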
Pure-phase selective excitation in fast-relaxing systems.
Zangger, K; Oberer, M; Sterk, H
2001-09-01
Selective pulses have been used frequently for small molecules. However, their application to proteins and other macromolecules has been limited. The long duration of shaped selective pulses and the short T(2) relaxation times in proteins have often prohibited the use of highly selective pulses, especially on larger biomolecules. A very selective excitation can be obtained within a short time by using the selective excitation sequence presented in this paper. Instead of a shaped low-intensity radiofrequency pulse, a cluster of hard 90° pulses, delays of free precession, and pulsed field gradients is used to selectively excite a narrow chemical shift range within a relatively short time. Off-resonance magnetization, which is allowed to evolve freely during the free precession intervals, is destroyed by the gradient pulses. Off-resonance excitation artifacts can be removed by random variation of the interpulse delays. This leads to an excitation profile with selectivity as well as phase and relaxation behavior superior to those of commonly used shaped selective pulses. Since the evolution of scalar coupling is inherently suppressed during the double-selective excitation of two different scalar-coupled nuclei, the presented pulse cluster is especially suited for simultaneous highly selective excitation of N-H and C-H fragments. Experimental examples are demonstrated on hen egg white lysozyme (14 kDa) and the bacterial antidote ParD (19 kDa). Copyright 2001 Academic Press.
Summer School Effects in a Randomized Field Trial
ERIC Educational Resources Information Center
Zvoch, Keith; Stevens, Joseph J.
2013-01-01
This field-based randomized trial examined the effect of assignment to and participation in summer school for two moderately at-risk samples of struggling readers. Application of multiple regression models to difference scores capturing the change in summer reading fluency revealed that kindergarten students randomly assigned to summer school…
Engen, Steinar; Saether, Bernt-Erik
2014-03-01
We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters, which enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components: the deterministic mean value, and stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness depends on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to separate the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
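As background for the decomposition described above, the Robertson-Price identity expresses the selection differential of a character $z$ as a fitness-weighted covariance (this is the standard deterministic result, not the authors' stochastic extension):

```latex
S \;=\; \bar{z}^{*} - \bar{z} \;=\; \frac{\operatorname{Cov}(w, z)}{\bar{w}},
```

where $w$ is individual fitness, $\bar{w}$ its mean, and $\bar{z}^{*}$ the post-selection mean of the character. The paper's contribution is to split the realized $\operatorname{Cov}(w, z)$ into a deterministic mean part plus demographic and environmental stochastic parts.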
Angunawela, I I; Diwan, V K; Tomson, G
1991-06-01
The intervention level of epidemiology is useful for studying effects in health systems research. For practical and ethical reasons, it is often difficult to apply experimental methods such as classical randomized clinical trials in the field. However, with alternative approaches such as 'randomization by group', some of these problems can be overcome. Drug information has long been considered an instrument to influence physicians, but evaluation of its effects is a new field of research. In the present study the impact of drug information on prescribing behaviour was evaluated in an outpatient setting in Sri Lanka. The study included 15 state health institutions (45 prescribers) with a common drug formulary. Groups of prescribers were randomized into two interventions (newsletters, and newsletters reinforced by a group seminar) and one control group. The target topic was 'rational prescribing of antibiotics'. Some 18,766 randomly selected outpatient drug prescriptions were studied. Antibiotics (and sulphonamides) were prescribed to 33.2% of the patients. An overall trend towards a decrease in the proportion of patients prescribed antibiotics was seen in the two intervention groups, although the difference was not significant (p > 0.05) compared to the control group. This is similar to the effect of written information on prescribing in other studies. A mean difference of -7.4% in the written, -7.3% in the written + seminar and -0.4% in the control group was shown. The general antibiotic prescribing pattern did not change in any of the three groups. Penicillin was the most commonly prescribed antibiotic, and tetracycline was only rarely prescribed to children. This experiment indicates the feasibility of drug information intervention studies in developing countries. (ABSTRACT TRUNCATED AT 250 WORDS)
Spin dynamics of random Ising chain in coexisting transverse and longitudinal magnetic fields
NASA Astrophysics Data System (ADS)
Liu, Zhong-Qiang; Jiang, Su-Rong; Kong, Xiang-Mu; Xu, Yu-Liang
2017-05-01
The dynamics of the random Ising spin chain in coexisting transverse and longitudinal magnetic fields is studied by the recursion method. Both the spin autocorrelation function and its spectral density are investigated by numerical calculations. It is found that the system's dynamical behavior depends on the deviation σJ of the random exchange coupling between nearest-neighbor spins and on the ratio rlt of the longitudinal to the transverse field: (i) For rlt = 0, the system undergoes two crossovers as σJ increases, from N independent spins precessing about the transverse magnetic field to a collective-mode behavior, and then to a central-peak behavior. (ii) For rlt ≠ 0, the system may exhibit a coexistence of collective-mode and central-peak behavior. When σJ is small (or large enough), the system undergoes a crossover from a coexistence behavior (or a disordered behavior) to a central-peak behavior as rlt increases. (iii) Increasing σJ depresses the effects of both the transverse and the longitudinal magnetic fields. (iv) The quantum random Ising chain in coexisting magnetic fields may exhibit under-damping and critical-damping characteristics simultaneously. These results indicate that changing the external magnetic fields may control and manipulate the dynamics of the random Ising chain.
Childhood leukemia and magnetic fields in infant incubators.
Söderberg, Karin C; Naumburg, Estelle; Anger, Gert; Cnattingius, Sven; Ekbom, Anders; Feychting, Maria
2002-01-01
In studies of magnetic field exposure and childhood leukemia, power lines and other electrical installations close to the children's homes constitute the most extensively studied source of exposure. We conducted a study to assess whether exposure to magnetic fields in infant incubators is associated with an increased leukemia risk. We identified all children with leukemia born in Sweden between 1973 and 1989 from the national Cancer Registry and selected at random one control per case, individually matched by sex and time of birth, from the study base. We retrieved information about treatment in infant incubators from medical records. We made measurements of the magnetic fields inside the incubators for each incubator model kept by the hospitals. Exposure assessment was based on measurements of the magnetic field level inside the incubator, as well as on the length of treatment. For acute lymphoblastic leukemia, the risk estimates were close to unity for all exposure definitions. For acute myeloid leukemia, we found a slightly elevated risk, but with wide confidence intervals and with no indication of dose response. Overall, our results give little evidence that exposure to magnetic fields inside infant incubators is associated with an increased risk of childhood leukemia.
Cosmic Rays in Intermittent Magnetic Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shukurov, Anvar; Seta, Amit; Bushby, Paul J.
The propagation of cosmic rays in turbulent magnetic fields is a diffusive process driven by the scattering of the charged particles by random magnetic fluctuations. Such fields are usually highly intermittent, consisting of intense magnetic filaments and ribbons surrounded by weaker, unstructured fluctuations. Studies of cosmic-ray propagation have largely overlooked intermittency, instead adopting Gaussian random magnetic fields. Using test particle simulations, we calculate cosmic-ray diffusivity in intermittent, dynamo-generated magnetic fields. The results are compared with those obtained from non-intermittent magnetic fields having identical power spectra. The presence of magnetic intermittency significantly enhances cosmic-ray diffusion over a wide range of particle energies. We demonstrate that the results can be interpreted in terms of a correlated random walk.
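The diffusive picture invoked above can be illustrated with an unmagnetized toy model: for an ensemble of unbiased random walkers, the diffusivity estimated as ⟨x²⟩/2t converges to a constant (1/2 in unit-step units). This is a generic sketch of diffusive transport, not the paper's test-particle code:

```python
import numpy as np

rng = np.random.default_rng(1)
steps = rng.choice([-1.0, 1.0], size=(5000, 1000))  # 5000 walkers, 1000 unit steps
x = np.cumsum(steps, axis=1)                        # walker trajectories
t = np.arange(1, 1001)
D = (x**2).mean(axis=0) / (2 * t)                   # running diffusivity estimate
print(D[-1])   # approaches 0.5 for unit steps per unit time
```

A correlated random walk, as in the paper, would replace the independent steps with steps whose directions persist over a correlation length, raising the diffusivity.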
BIGEL analysis of gene expression in HL60 cells exposed to X rays or 60 Hz magnetic fields
NASA Technical Reports Server (NTRS)
Balcer-Kubiczek, E. K.; Zhang, X. F.; Han, L. H.; Harrison, G. H.; Davis, C. C.; Zhou, X. J.; Ioffe, V.; McCready, W. A.; Abraham, J. M.; Meltzer, S. J.
1998-01-01
We screened a panel of 1,920 randomly selected cDNAs to discover genes that are differentially expressed in HL60 cells exposed to 60 Hz magnetic fields (2 mT) or X rays (5 Gy) compared to unexposed cells. Identification of these clones was accomplished using our two-gel cDNA library screening method (BIGEL). Eighteen cDNAs differentially expressed in X-irradiated compared to control HL60 cells were recovered from a panel of 1,920 clones. Differential expression in experimental compared to control cells was confirmed independently by Northern blotting of paired total RNA samples hybridized to each of the 18 clone-specific cDNA probes. DNA sequencing revealed that 15 of the 18 cDNA clones produced matches with the database for genes related to cell growth, protein synthesis, energy metabolism, oxidative stress or apoptosis (including MYC, neuroleukin, copper zinc-dependent superoxide dismutase, TC4 RAS-like protein, peptide elongation factor 1alpha, BNIP3, GATA3, NF45, cytochrome c oxidase II and triosephosphate isomerase mRNAs). In contrast, BIGEL analysis of the same 1,920 cDNAs revealed no differences greater than 1.5-fold in expression levels in magnetic-field compared to sham-exposed cells. Magnetic-field-exposed and control samples were analyzed further for the presence of mRNA encoding X-ray-responsive genes by hybridization of the 18 specific cDNA probes to RNA from exposed and control HL60 cells. Our results suggest that differential gene expression is induced in approximately 1% of a random pool of cDNAs by ionizing radiation but not by 60 Hz magnetic fields under the present experimental conditions.
Weighted stacking of seismic AVO data using hybrid AB semblance and local similarity
NASA Astrophysics Data System (ADS)
Deng, Pan; Chen, Yangkang; Zhang, Yu; Zhou, Hua-Wei
2016-04-01
The common-midpoint (CMP) stacking technique plays an important role in enhancing the signal-to-noise ratio (SNR) in seismic data processing and imaging. Weighted stacking is often used to improve on conventional equal-weight stacking by further attenuating random noise and handling the amplitude variations in real seismic data. In this study, we propose a hybrid framework combining AB semblance with a local-similarity-weighted stacking scheme. The objective is to achieve an optimal stacking of CMP gathers with the class II amplitude-variation-with-offset (AVO) polarity-reversal anomaly. The selection of a high-quality near-offset reference trace is another innovation of this work, because it better preserves useful energy. Applications to synthetic and field seismic data demonstrate that our method captures the true locations of weak reflections, distinguishes thin-bed tuning artifacts, and effectively attenuates random noise.
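A reduced sketch of similarity-weighted stacking: each trace is weighted by its correlation with a reference trace before summation. The paper uses local (windowed) similarity and AB semblance; the global correlation below is a simplified stand-in, and the synthetic gather is illustrative:

```python
import numpy as np

def similarity_weighted_stack(gather, ref):
    """gather: (n_traces, n_samples) moveout-corrected CMP gather;
    ref: reference trace (e.g. a clean near-offset trace).
    Weight each trace by its (non-negative) correlation with ref."""
    w = np.array([np.corrcoef(tr, ref)[0, 1] for tr in gather])
    w = np.clip(w, 0.0, None)        # discard anti-correlated traces
    w /= w.sum()
    return w @ gather                # weighted stack

rng = np.random.default_rng(7)
signal = np.sin(np.linspace(0, 6 * np.pi, 200))
gather = signal + 0.5 * rng.standard_normal((20, 200))
stacked = similarity_weighted_stack(gather, gather[0])
```

Down-weighting dissimilar traces is what lets weighted stacking outperform the equal-weight stack when trace quality varies across offsets.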
Introduction to the Special Issue.
ERIC Educational Resources Information Center
Petrosino, Anthony
2003-01-01
Introduces the articles of this special issue focusing on randomized field trials in criminology. In spite of the overall lack of randomized field trials in criminology, some agencies and individuals are able to mount an impressive number of field trials, and these articles focus on their experiences. (SLD)
NASA Astrophysics Data System (ADS)
Jones, A. W.; Bland-Hawthorn, J.; Kaiser, N.
1994-12-01
In the first half of 1995, the Anglo-Australian Observatory is due to commission a wide-field (2.1°), 400-fiber, double spectrograph system (2dF) at the f/3.3 prime focus of the AAT 3.9 m bi-national facility. The instrument should be able to measure ~4000 galaxy redshifts (assuming a magnitude limit of b_J ~ 20) in a single dark night and is therefore ideally suited to studies of large-scale structure. We have carried out simple 3D numerical simulations to judge the relative merits of sparse surveys and contiguous surveys. We generate a survey volume and fill it randomly with particles according to a selection function which mimics a magnitude-limited survey at b_J = 19.7. Each of the particles is perturbed by a Gaussian random field according to the dimensionless power spectrum k³P(k)/2π² determined by Feldman, Kaiser & Peacock (1994) from the IRAS QDOT survey. We introduce some redshift-space distortion as described by Kaiser (1987), a 'thermal' component measured from pairwise velocities (Davis & Peebles 1983), and 'fingers of god' due to rich clusters at random density enhancements. Our particular concern is to understand how the window function W²(k) of the survey geometry compromises the accuracy of statistical measures [e.g., P(k), ξ(r), ξ(r_σ, r_π)] commonly used in the study of large-scale structure. We also examine the reliability of various tools (e.g. genus) for describing the topological structure within a contiguous region of the survey.
The Coalescent Process in Models with Selection
Kaplan, N. L.; Darden, T.; Hudson, R. R.
1988-01-01
Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685
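As a point of reference for the "number predicted under a neutral model" mentioned above, the neutral coalescent can be simulated directly; the expected number of segregating sites is Watterson's E[S] = θ Σ_{i=1}^{n-1} 1/i. This sketch is the neutral baseline only, not the selection model of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def segregating_sites(n, theta):
    """One neutral coalescent realization: exponential coalescence
    times while k lineages remain, then Poisson mutations at rate
    theta/2 on the total branch length."""
    total_branch = 0.0
    for k in range(n, 1, -1):
        t_k = rng.exponential(2.0 / (k * (k - 1)))  # time with k lineages
        total_branch += k * t_k
    return rng.poisson(theta / 2.0 * total_branch)

n, theta = 10, 5.0
mean_s = np.mean([segregating_sites(n, theta) for _ in range(20000)])
expected = theta * sum(1.0 / i for i in range(1, n))   # Watterson's formula
```

Under balancing selection with low mutation rates between the balanced alleles, the paper predicts sample values of S well above this neutral expectation.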
Random phase detection in multidimensional NMR.
Maciejewski, Mark W; Fenwick, Matthew; Schuyler, Adam D; Stern, Alan S; Gorbatyuk, Vitaliy; Hoch, Jeffrey C
2011-10-04
Despite advances in resolution accompanying the development of high-field superconducting magnets, biomolecular applications of NMR require multiple dimensions in order to resolve individual resonances, and the achievable resolution is typically limited by practical constraints on measuring time. In addition to the need for measuring long evolution times to obtain high resolution, the need to distinguish the sign of the frequency constrains the ability to shorten measuring times. Sign discrimination is typically accomplished by sampling the signal with two different receiver phases or by selecting a reference frequency outside the range of frequencies spanned by the signal and then sampling at a higher rate. In the parametrically sampled (indirect) time dimensions of multidimensional NMR experiments, either method imposes an additional factor of 2 sampling burden for each dimension. We demonstrate that by using a single detector phase at each time sample point, but randomly altering the phase for different points, the sign ambiguity that attends fixed single-phase detection is resolved. Random phase detection enables a reduction in experiment time by a factor of 2 for each indirect dimension, amounting to a factor of 8 for a four-dimensional experiment, albeit at the cost of introducing sampling artifacts. Alternatively, for fixed measuring time, random phase detection can be used to double resolution in each indirect dimension. Random phase detection is complementary to nonuniform sampling methods, and their combination offers the potential for additional benefits. In addition to applications in biomolecular NMR, random phase detection could be useful in magnetic resonance imaging and other signal processing contexts.
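The sign-ambiguity argument above can be checked numerically: with a fixed detector phase, +ω and -ω produce identical samples, while per-point random phases separate them. This toy least-squares fit illustrates only the principle, not an NMR processing pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
omega = 2 * np.pi * 7.0
t = np.arange(64) / 64.0
phi = rng.uniform(0, 2 * np.pi, t.size)   # random detector phase per point
y = np.cos(omega * t - phi)               # single-phase detection of a +omega signal

def fitted_amplitude(sign):
    """Least-squares amplitude of a trial signal at sign * omega."""
    basis = np.cos(sign * omega * t - phi)
    return (basis @ y) / (basis @ basis)

a_plus, a_minus = fitted_amplitude(+1), fitted_amplitude(-1)
# a_plus is 1 by construction; |a_minus| stays small because the random
# phases decorrelate the mirrored frequency. With phi = 0, both would be 1.
```

The residual leakage into the mirrored frequency is the sampling artifact the abstract mentions as the cost of the method.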
Effects of Selected Meditative Asanas on Kinaesthetic Perception and Speed of Movement
ERIC Educational Resources Information Center
Singh, Kanwaljeet; Bal, Baljinder S.; Deol, Nishan S.
2009-01-01
Study aim: To assess the effects of selected meditative "asanas" on kinesthetic perception and movement speed. Material and methods: Thirty randomly selected male students aged 18-24 years volunteered to participate in the study. They were randomly assigned into two groups: A (meditative) and B (control). The Nelson's movement speed and…
Model Selection with the Linear Mixed Model for Longitudinal Data
ERIC Educational Resources Information Center
Ryoo, Ji Hoon
2011-01-01
Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…
Kronberg, J.W.
1993-04-20
An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
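The selection logic of the apparatus can be sketched in software: a counter cycles rapidly through 0..N-1, and an item is "selected" when the stop moment catches the counter at zero, which happens for one item in N on average. The jitter model below is an illustrative stand-in for the analog timing variation the patent relies on:

```python
import random

def item_selected(n, rng):
    """One trial: the counter cycles mod n; the stop time spans many
    fast cycles plus unpredictable jitter, so the final count is
    effectively uniform over 0..n-1."""
    stop_ticks = rng.randrange(10_000, 1_000_000)   # noisy activation window
    return stop_ticks % n == 0                      # selected iff counter reads 0

rng = random.Random(123)
n = 8
hits = sum(item_selected(n, rng) for _ in range(40000))
print(hits / 40000)   # close to 1/8
```

The hardware gets its uniformity the same way: the count rate is much faster than the timing jitter, so the residue of the stop time modulo N is unpredictable.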
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zentner, I.; Ferré, G., E-mail: gregoire.ferre@ponts.org; Poirion, F.
2016-06-01
In this paper, a new method for the identification and simulation of non-Gaussian and non-stationary stochastic fields given a database is proposed. It is based on two successive biorthogonal decompositions aimed at representing spatio-temporal stochastic fields. The proposed double expansion makes it possible to build the model even for large-size problems by separating the time, space and random parts of the field. A Gaussian kernel estimator is used to simulate the high-dimensional set of random variables appearing in the decomposition. The capability of the method to reproduce the non-stationary and non-Gaussian features of random phenomena is illustrated by applications to earthquakes (seismic ground motion) and sea states (wave heights).
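A minimal analogue of one biorthogonal decomposition step is an SVD of the snapshot matrix, which separates time modes from space modes; the paper's double expansion and kernel-based simulation go well beyond this, so the sketch (and its toy field) is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)
n_t, n_x = 200, 50
# Toy non-stationary field: two space-time modes plus small noise
t = np.linspace(0, 1, n_t)[:, None]
x = np.linspace(0, 1, n_x)[None, :]
field = (np.sin(2 * np.pi * 5 * t) * np.cos(np.pi * x)   # oscillatory mode
         + t * x                                          # drifting mode
         + 0.01 * rng.standard_normal((n_t, n_x)))

# SVD gives biorthogonal pairs: temporal modes U, spatial modes Vt
U, s, Vt = np.linalg.svd(field, full_matrices=False)
k = 2                                   # keep two mode pairs
approx = (U[:, :k] * s[:k]) @ Vt[:k]
err = np.linalg.norm(field - approx) / np.linalg.norm(field)
```

In the paper's setting, the retained random coefficients (here, the columns of U scaled by s) are what the Gaussian kernel estimator then learns to resample.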
Population differentiation in Pacific salmon: local adaptation, genetic drift, or the environment?
Adkison, Milo D.
1995-01-01
Morphological, behavioral, and life-history differences between Pacific salmon (Oncorhynchus spp.) populations are commonly thought to reflect local adaptation, and it is likewise common to assume that salmon populations separated by small distances are locally adapted. Two alternatives to local adaptation exist: random genetic differentiation owing to genetic drift and founder events, and genetic homogeneity among populations, in which differences reflect differential trait expression in differing environments. Population genetics theory and simulations suggest that both alternatives are possible. With selectively neutral alleles, genetic drift can result in random differentiation despite many strays per generation. Even weak selection can prevent genetic drift in stable populations; however, founder effects can result in random differentiation despite selective pressures. Overlapping generations reduce the potential for random differentiation. Genetic homogeneity can occur despite differences in selective regimes when straying rates are high. In sum, localized differences in selection should not always result in local adaptation. Local adaptation is favored when population sizes are large and stable, selection is consistent over large areas, selective differentials are large, and straying rates are neither too high nor too low. Consideration of alternatives to local adaptation would improve both biological research and salmon conservation efforts.
Effect of non-ionizing electromagnetic field on the alteration of ovarian follicles in rats.
Ahmadi, Seyed Shahin; Khaki, Amir Afshin; Ainehchi, Nava; Alihemmati, Alireza; Khatooni, Azam Asghari; Khaki, Arash; Asghari, Ali
2016-03-01
In recent years, increasing attention has been paid to the safety, environmental and public-health effects of extremely low frequency electromagnetic fields (ELF-EMF) and radio frequency electromagnetic fields (RF-EMF). The aim of this research was to determine the effect of EMF on the alteration of ovarian follicles. In this experimental study at Tabriz Medical University in 2015, we performed EMF exposures and assessed the alteration of rats' ovarian follicles. Thirty 3-month-old rats were selected randomly from laboratory animals and, after their ages and weights were determined, were divided randomly into three groups. The control group consisted of 10 rats without any treatment, kept in normal conditions. The second group was exposed to a 50 Hz magnetic field for eight weeks (three weeks in utero and five weeks after birth). The third group was exposed to a 50 Hz magnetic field for 13 weeks (three weeks in utero and ten weeks after birth). Samples were fixed in 10% buffered formaldehyde, cleared with xylol and embedded in paraffin. After sectioning and staining, samples were studied by optical microscopy. Finally, SPSS version 17 was used for data analysis. EMF radiation increased the harmful effects on the formation of ovarian follicles and on oocyte implantation. Studies of the effects of electromagnetic fields on ovarian follicles have shown that the nuclei of the oocytes become smaller and change shape. There were significant, harmful changes in the groups affected by electromagnetic waves. Atresia of ovarian follicles was statistically significant in both study groups compared to the control group (p < 0.05). Exposure to electromagnetic fields during embryonic development can cause morphological changes in oocytes and affect the differentiation of oocytes and folliculogenesis, resulting in decreased ovarian reserve leading to infertility or reduced fertility.
Laurie, Cathy C.; Chasalow, Scott D.; LeDeaux, John R.; McCarroll, Robert; Bush, David; Hauge, Brian; Lai, Chaoqiang; Clark, Darryl; Rocheford, Torbert R.; Dudley, John W.
2004-01-01
In one of the longest-running experiments in biology, researchers at the University of Illinois have selected for altered composition of the maize kernel since 1896. Here we use an association study to infer the genetic basis of dramatic changes that occurred in response to selection for changes in oil concentration. The study population was produced by a cross between the high- and low-selection lines at generation 70, followed by 10 generations of random mating and the derivation of 500 lines by selfing. These lines were genotyped for 488 genetic markers and the oil concentration was evaluated in replicated field trials. Three methods of analysis were tested in simulations for ability to detect quantitative trait loci (QTL). The most effective method was model selection in multiple regression. This method detected ∼50 QTL accounting for ∼50% of the genetic variance, suggesting that >50 QTL are involved. The QTL effect estimates are small and largely additive. About 20% of the QTL have negative effects (i.e., not predicted by the parental difference), which is consistent with hitchhiking and small population size during selection. The large number of QTL detected accounts for the smooth and sustained response to selection throughout the twentieth century. PMID:15611182
Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks
Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav
2017-01-01
Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most of the current solutions are either theoretical or simulation-based where the problems are tackled using random field theory, computational geometry or computer simulations, limiting their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach where co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatiotemporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
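The idea of pruning sensors whose series are co-integrated with retained ones can be sketched with a simplified Engle-Granger-style residual check. The threshold, the crude residual test and the synthetic series below are illustrative, not the authors' statistical procedure:

```python
import numpy as np

def looks_redundant(a, b, tol=0.1):
    """Regress b on a; call b redundant if the regression residual is
    small relative to b's own variability (a crude stand-in for a
    proper co-integration test on the residual)."""
    slope, intercept = np.polyfit(a, b, 1)
    resid = b - (slope * a + intercept)
    return resid.std() < tol * b.std()

rng = np.random.default_rng(5)
trend = np.cumsum(rng.standard_normal(500))            # shared environmental signal
node_a = trend + 0.1 * rng.standard_normal(500)
node_b = 2.0 * trend + 5 + 0.1 * rng.standard_normal(500)  # co-moves with node_a
node_c = np.cumsum(rng.standard_normal(500))           # independent random walk
```

Greedily keeping only nodes that are not redundant with respect to the already-kept set yields the kind of reduced deployment the abstract reports (25% of nodes reconstructing the rest within an error bound).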
Directed evolution: tailoring biocatalysts for industrial applications.
Kumar, Ashwani; Singh, Suren
2013-12-01
Current challenges and promises of white biotechnology encourage protein engineers to use directed evolution to generate novel and useful biocatalysts for a broad range of applications. Different methods of enzyme engineering have been used in the past in attempts to produce enzymes with improved functions and properties. Recent advances in random mutagenesis, screening, selection and computational design have increased the versatility and accelerated the development of enzymes under strong selection pressure in directed evolution experiments. Directed evolution techniques improve enzyme fitness without requiring a detailed understanding of the enzyme, and clearly demonstrate their future role in adapting enzymes for use in industry. Despite significant advances to date in biocatalyst improvement, there remains a need for better mutagenesis strategies and for easy screening and selection tools that require little human intervention. This review covers the fundamentals and major developments of directed evolution techniques, and highlights advances in mutagenesis, screening and selection methods, with examples of enzymes developed using these approaches. Several commonly used methods for creating molecular diversity, with their advantages and disadvantages, including some recently used strategies, are also discussed.
Photoactivable antibody binding protein: site-selective and covalent coupling of antibody.
Jung, Yongwon; Lee, Jeong Min; Kim, Jung-won; Yoon, Jeongwon; Cho, Hyunmin; Chung, Bong Hyun
2009-02-01
Here we report new photoactivable antibody binding proteins, which site-selectively capture antibodies and form covalent conjugates with the captured antibodies upon irradiation. The proteins allow site-selective tagging and/or immobilization of antibodies in a highly preferred orientation and eliminate the need for prior antibody modification. The minimal Fc-binding domain of protein G, a widely used antibody binding protein, was genetically and chemically engineered to contain a site-specific photo-cross-linker, benzophenone. In addition, the domain was further mutated to enhance its Fc-targeting ability. This small engineered protein was successfully cross-linked only to the Fc region of the antibody, without any nonspecific reactivity. SPR analysis indicated that antibodies can be site-selectively biotinylated through the present photoactivable protein. Furthermore, the system enabled light-induced covalent immobilization of antibodies directly on various solid surfaces, such as those of glass slides, gold chips, and small particles. Antibody coupling via photoactivable antibody binding proteins overcomes several limitations of conventional approaches, such as random chemical reactions or reversible protein binding, and offers a versatile tool for the field of immunosensors.
Magnetic field line random walk in models and simulations of reduced magnetohydrodynamic turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snodin, A. P.; Ruffolo, D.; Oughton, S.
2013-12-10
The random walk of magnetic field lines is examined numerically and analytically in the context of reduced magnetohydrodynamic (RMHD) turbulence, which provides a useful description of plasmas dominated by a strong mean field, such as in the solar corona. A recently developed non-perturbative theory of magnetic field line diffusion is compared with the diffusion coefficients obtained by accurate numerical tracing of magnetic field lines for both synthetic models and direct numerical simulations of RMHD. Statistical analysis of an ensemble of trajectories confirms the applicability of the theory, which very closely matches the numerical field line diffusion coefficient as a function of distance z along the mean magnetic field for a wide range of the Kubo number R. This theory employs Corrsin's independence hypothesis, sometimes thought to be valid only at low R. However, the results demonstrate that it works well up to R = 10, both for a synthetic RMHD model and an RMHD simulation. The numerical results from the RMHD simulation are compared with and without phase randomization, demonstrating a clear effect of coherent structures on the field line random walk for a very low Kubo number.
Epidemic spreading on preferred degree adaptive networks.
Jolad, Shivakumar; Liu, Wenjia; Schmittmann, B; Zia, R K P
2012-01-01
We study the standard SIS model of epidemic spreading on networks where individuals have a fluctuating number of connections around a preferred degree κ. Using very simple rules for forming such preferred degree networks, we find some unusual statistical properties not found in familiar Erdős-Rényi or scale-free networks. By letting κ depend on the fraction of infected individuals, we model the behavioral changes in response to how the extent of the epidemic is perceived. In our models, the behavioral adaptations can be either 'blind' or 'selective', depending on whether a node adapts by cutting or adding links to randomly chosen partners or selectively, based on the state of the partner. For a frozen preferred network, we find that the infection threshold follows the heterogeneous mean field result λ_c/μ = ⟨κ⟩/⟨κ²⟩ and the phase diagram matches the predictions of the annealed adjacency matrix (AAM) approach. With 'blind' adaptations, although the epidemic threshold remains unchanged, the infection level is substantially affected, depending on the details of the adaptation. The 'selective' adaptive SIS models are most interesting. Both the threshold and the level of infection change, controlled not only by how the adaptations are implemented but also by how often the nodes cut/add links (compared to the time scales of the epidemic spreading). A simple mean field theory is presented for the selective adaptations which captures the qualitative and some of the quantitative features of the infection phase diagram.
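The heterogeneous mean-field threshold is straightforward to evaluate for any degree sequence. The sketch below uses a Poisson degree distribution around a preferred degree κ = 20 as a stand-in; the paper's preferred-degree network differs in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Degrees fluctuating around a preferred degree kappa = 20
# (an illustrative stand-in for the preferred-degree network).
degrees = rng.poisson(20, size=10_000)

k_mean = degrees.mean()
k2_mean = (degrees.astype(float) ** 2).mean()

# Heterogeneous mean-field epidemic threshold: lambda_c / mu = <k> / <k^2>.
threshold = k_mean / k2_mean
print(threshold)  # for Poisson(20): 20 / (20 + 400), roughly 0.048
```

Because ⟨κ²⟩ grows faster than ⟨κ⟩, broader degree fluctuations push the threshold down, which is the usual heterogeneous mean-field effect.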
Subcritical Multiplicative Chaos for Regularized Counting Statistics from Random Matrix Theory
NASA Astrophysics Data System (ADS)
Lambert, Gaultier; Ostrovsky, Dmitry; Simm, Nick
2018-05-01
For an N × N Haar-distributed random unitary matrix U_N, we consider the random field defined by counting the number of eigenvalues of U_N in a mesoscopic arc centered at the point u on the unit circle. We prove that after regularizing at a small scale ε_N > 0, the renormalized exponential of this field converges as N → ∞ to a Gaussian multiplicative chaos measure in the whole subcritical phase. We discuss implications of this result for obtaining a lower bound on the maximum of the field. We also show that the moments of the total mass converge to a Selberg-like integral and by taking a further limit as the size of the arc diverges, we establish part of the conjectures in Ostrovsky (Nonlinearity 29(2):426-464, 2016). By an analogous construction, we prove that the multiplicative chaos measure coming from the sine process has the same distribution, which strongly suggests that this limiting object should be universal. Our approach to the L¹ phase is based on a generalization of the construction in Berestycki (Electron Commun Probab 22(27):12, 2017) to random fields which are only asymptotically Gaussian. In particular, our method could have applications to other random fields coming from either random matrix theory or a different context.
DeVoe, Jennifer E; Marino, Miguel; Angier, Heather; O'Malley, Jean P; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J; Bailey, Steffani R; Gallia, Charles; Gold, Rachel
2015-01-01
In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children. This association persisted even after expansions in child-only public health insurance. Oregon's randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. The objective was to estimate the effect on a child's health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. The Oregon Experiment was a randomized natural experiment assessing the results of Oregon's 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent being randomly selected to apply for Medicaid on their child's Medicaid or Children's Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children's coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14,409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. Children's Medicaid or CHIP coverage was assessed monthly and in 6-month intervals relative to the parent's selection date. In the immediate period after selection, coverage among children whose parents were selected to apply increased significantly from 3830 (61.4%) to 4152 (66.6%), compared with a nonsignificant change from 5049 (61.8%) to 5044 (61.7%) among children whose parents were not selected to apply.
Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after parent's selection compared with children whose parents were not selected (adjusted odds ratio [AOR]=1.18; 95% CI, 1.10-1.27). The effect remained significant during months 7 to 12 (AOR=1.11; 95% CI, 1.03-1.19); months 13 to 18 showed a positive but not significant effect (AOR=1.07; 95% CI, 0.99-1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR=2.37; 95% CI, 2.14-2.64). Children's odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents' access to Medicaid coverage and their children's coverage.
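As a sketch of how such odds ratios are formed: the study's AORs come from covariate-adjusted GEE models, but the crude calculation below, on a hypothetical 2x2 table with illustrative counts (not the study's data), shows the basic arithmetic and a Woolf-type confidence interval.

```python
import math

# Hypothetical 2x2 table: child covered (yes/no) by whether the parent
# was randomly selected to apply for Medicaid. Counts are illustrative.
selected     = {"covered": 4152, "not_covered": 2083}
not_selected = {"covered": 5044, "not_covered": 3127}

def odds(d):
    return d["covered"] / d["not_covered"]

or_ = odds(selected) / odds(not_selected)   # crude (unadjusted) odds ratio

# 95% CI on the log odds ratio (Woolf method): SE is the root of the
# summed reciprocal cell counts.
se = math.sqrt(sum(1 / v for d in (selected, not_selected) for v in d.values()))
lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
print(round(or_, 2), (round(lo, 2), round(hi, 2)))
```

An interval excluding 1.0, as here, corresponds to the "significant" effects reported in the abstract; the GEE machinery additionally handles repeated monthly measurements per child.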
Li, Zhigang; Hu, Songnian; Yao, Nan; Dean, Ralph A.; Zhao, Wensheng; Shen, Mi; Zhang, Haiwang; Li, Chao; Liu, Liyuan; Cao, Lei; Xu, Xiaowen; Xing, Yunfei; Hsiang, Tom; Zhang, Ziding; Xu, Jin-Rong; Peng, You-Liang
2012-01-01
Rice blast caused by Magnaporthe oryzae is one of the most destructive diseases of rice worldwide. The fungal pathogen is notorious for its ability to overcome host resistance. To better understand its genetic variation in nature, we sequenced the genomes of two field isolates, Y34 and P131. In comparison with the previously sequenced laboratory strain 70-15, both field isolates had a similar genome size but slightly more genes. Sequences from the field isolates were used to improve genome assembly and gene prediction of 70-15. Although the overall genome structure is similar, a number of gene families that are likely involved in plant-fungal interactions are expanded in the field isolates. Genome-wide analysis of nonsynonymous to synonymous nucleotide substitution rates revealed that many infection-related genes underwent diversifying selection. The field isolates also have hundreds of isolate-specific genes and a number of isolate-specific gene duplication events. Functional characterization of randomly selected isolate-specific genes revealed that they play diverse roles, some of which affect virulence. Furthermore, each genome contains thousands of loci of transposon-like elements, but less than 30% of them are conserved among different isolates, suggesting active transposition events in M. oryzae. A total of approximately 200 genes were disrupted in these three strains by transposable elements. Interestingly, transposon-like elements tend to be associated with isolate-specific or duplicated sequences. Overall, our results indicate that gain or loss of unique genes, DNA duplication, gene family expansion, and frequent translocation of transposon-like elements are important factors in genome variation of the rice blast fungus. PMID:22876203
NASA Technical Reports Server (NTRS)
Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.
1992-01-01
The current controversy existing in reference to the regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models and that pseudorandom-number-generated models bias the observed nearest-neighbor statistics towards regularity. Interpretation of this nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
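Nearest-neighbor statistics of the kind analyzed here can be illustrated with synthetic point fields. The sketch below contrasts a Poisson (completely spatially random) field with a strongly clustered one; point counts and cluster scale are arbitrary choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_nn_distance(pts):
    """Mean nearest-neighbor distance of a 2D point set."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)      # exclude each point's zero self-distance
    return d.min(axis=1).mean()

n = 400
poisson = rng.random((n, 2))                       # CSR: uniform in unit square
centers = rng.random((20, 2))                      # a strongly clustered field
clustered = centers[rng.integers(20, size=n)] + 0.01 * rng.standard_normal((n, 2))

p_mean, c_mean = mean_nn_distance(poisson), mean_nn_distance(clustered)
expected = 1 / (2 * np.sqrt(n))   # CSR expectation, ignoring edge effects
print(p_mean, c_mean, expected)
```

A mean nearest-neighbor distance well below the CSR expectation signals clustering, and one above it signals regularity, which is the diagnostic the cloud-field analysis applies to the satellite data.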
NASA Astrophysics Data System (ADS)
Hadjiagapiou, Ioannis A.; Velonakis, Ioannis N.
2018-07-01
The Sherrington-Kirkpatrick Ising spin glass model, in the presence of a random magnetic field, is investigated within the framework of the one-step replica symmetry breaking. The two random variables (exchange integral interaction Jij and random magnetic field hi) are drawn from a joint Gaussian probability density function characterized by a correlation coefficient ρ, assuming positive and negative values. The thermodynamic properties, the three different phase diagrams and system's parameters are computed with respect to the natural parameters of the joint Gaussian probability density function at non-zero and zero temperatures. The low temperature negative entropy controversy, a result of the replica symmetry approach, has been partly remedied in the current study, leading to a less negative result. In addition, the present system possesses two successive spin glass phase transitions with characteristic temperatures.
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
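The regulation's selection step can be sketched as follows; the portion labels, the assumed ten-portion subsample, and the seed are illustrative, not part of the rule.

```python
import random

# Sketch of the third-level selection in 40 CFR 761.355: divide the
# subsample into 100 gram portions, then use a random number generator
# to pick one portion for the leachate-simulation procedure.
portions = [f"portion_{i:02d}" for i in range(1, 11)]  # e.g. 1 kg -> ten 100 g portions

rng = random.Random(761)          # seeded here only for reproducibility
selected = rng.choice(portions)
print(selected)
```

A printed random number table, as the rule also permits, serves the same purpose: each portion must have an equal chance of selection.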
Classical and quantum stability in putative landscapes
Dine, Michael
2017-01-18
Landscape analyses often assume the existence of large numbers of fields, N, with all of the many couplings among these fields (subject to constraints such as local supersymmetry) selected independently and randomly from simple (say Gaussian) distributions. We point out that unitarity and perturbativity place significant constraints on behavior of couplings with N, eliminating otherwise puzzling results. In would-be flux compactifications of string theory, we point out that in order that there be large numbers of light fields, the compactification radii must scale as a positive power of N; scaling of couplings with N may also be necessary for perturbativity. We show that in some simple string theory settings with large numbers of fields, for fixed R and string coupling, one can bound certain sums of squares of couplings by order one numbers. This may argue for strong correlations, possibly calling into question the assumption of uncorrelated distributions. Finally, we consider implications of these considerations for classical and quantum stability of states without supersymmetry, with low energy supersymmetry arising from tuning of parameters, and with dynamical breaking of supersymmetry.
Radon, K; Parera, D; Rose, D M; Jung, D; Vollrath, L
2001-05-01
There is growing public concern that radio frequency electromagnetic fields may have adverse biological effects. In the present study eight healthy male students were tested to see whether or not radio frequency electromagnetic fields as used in modern digital wireless telecommunication (GSM standard) have noticeable effects on salivary melatonin, cortisol, neopterin, and immunoglobulin A (sIgA) levels during and several hours after exposure. In a specifically designed, shielded experimental chamber, the circularly polarized electromagnetic field applied was transmitted by an antenna positioned 10 cm behind the head of upright sitting test persons. The carrier frequency of 900 MHz was pulsed with 217 Hz (average power flux density 1 W/m2). In double blind trials, each test person underwent a total of 20 randomly allotted 4 hour periods of exposure and sham exposure, equally distributed at day and night. The results obtained show that the salivary concentrations of melatonin, cortisol, neopterin and sIgA did not differ significantly between exposure and sham exposure.
Era, P; Pärssinen, O; Pykälä, P; Jokela, J; Suominen, H
1994-10-01
The sensitivity of the central visual field (0 degree-30 degrees) was studied using an automatic Octopus 500E perimeter in elderly male athletes and in a population sample of men of corresponding age. The athletes (N = 96) were endurance and power athletes, who were still active in competitive sports with training histories spanning tens of years. The athletes' results were compared with those of a sample of men of the same age (70-81 years, N = 41) randomly selected from the local population register. The sensitivity values of the athletes, and the endurance athletes in particular, were significantly better than those of the controls, with differences varying from 1 to 2.5 dB in the different areas of the central visual field. Multivariate analyses of the background factors of visual field sensitivity showed that the most important were age, amount of annual training, number of chronic diseases, HDL-cholesterol level, and vital capacity. The results suggest that a long training history, especially of the aerobic type, may be beneficial with respect to the sensitivity of the visual system.
Disruptive ecological selection on a mating cue.
Merrill, Richard M; Wallbank, Richard W R; Bull, Vanessa; Salazar, Patricio C A; Mallet, James; Stevens, Martin; Jiggins, Chris D
2012-12-22
Adaptation to divergent ecological niches can result in speciation. Traits subject to disruptive selection that also contribute to non-random mating will facilitate speciation with gene flow. Such 'magic' or 'multiple-effect' traits may be widespread and important for generating biodiversity, but strong empirical evidence is still lacking. Although there is evidence that putative ecological traits are indeed involved in assortative mating, evidence that these same traits are under divergent selection is considerably weaker. Heliconius butterfly wing patterns are subject to positive frequency-dependent selection by predators, owing to aposematism and Müllerian mimicry, and divergent colour patterns are used by closely related species to recognize potential mates. The amenability of colour patterns to experimental manipulation, independent of other traits, presents an excellent opportunity to test their role during speciation. We conducted field experiments with artificial butterflies, designed to match natural butterflies with respect to avian vision. These were complemented with enclosure trials with live birds and real butterflies. Our experiments showed that hybrid colour-pattern phenotypes are attacked more frequently than parental forms. For the first time, we demonstrate disruptive ecological selection on a trait that also acts as a mating cue.
NASA Astrophysics Data System (ADS)
Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.
2018-04-01
We consider the practical realization of a new optical probing method for random media, defined as reference-free path-length interferometry with intensity-moment analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. A water solution of Rhodamine 6G was applied as the doping fluorescent agent for the ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dharmarajan, Kavita V.; Friedman, Debra L.; Schwartz, Cindy L.
2015-05-01
Purpose: The study was designed to determine whether response-based therapy improves outcomes in intermediate-risk Hodgkin lymphoma. We examined patterns of first relapse in the study. Patients and Methods: From September 2002 to July 2010, 1712 patients <22 years old with stage I-IIA with bulk, I-IIAE, I-IIB, or IIIA-IVA disease were enrolled and treated with doxorubicin, bleomycin, vincristine, etoposide, prednisone, and cyclophosphamide (ABVE-PC). Patients were categorized as rapid (RER) or slow early responders (SER) after 2 cycles of ABVE-PC. The SER patients were randomized to 2 additional ABVE-PC cycles or augmented chemotherapy with 21 Gy involved field radiation therapy (IFRT). RER patients were stipulated to undergo 2 additional ABVE-PC cycles and were then randomized to 21 Gy IFRT or no further treatment if complete response (CR) was achieved. RER without CR patients were non-randomly assigned to 21 Gy IFRT. Relapses were characterized with respect to site (initial, new, or both; and initial bulk or initial nonbulk) and IFRT field (in-field, out-of-field, or both). Patients were grouped by treatment assignment (SER; RER/no CR; RER/CR/IFRT; and RER/CR/no IFRT). Summary statistics were reported. Results: At 4-year median follow-up, 244 patients had experienced relapse, 198 of whom were fully evaluable for review. Those who progressed during treatment (n=30) or lacked relapse imaging (n=16) were excluded. The median time to relapse was 12.8 months. Of the 198 evaluable patients, 30% were RER/no CR, 26% were SER, 26% were RER/CR/no IFRT, 16% were RER/CR/IFRT, and 2% remained uncategorized. Relapses involved initially bulky sites in 74% of cases and initially nonbulky sites in 75%. First relapses rarely occurred at exclusively new or out-of-field sites. By contrast, relapses usually occurred at nodal sites of initial bulky and nonbulky disease.
Conclusion: Although response-based therapy has helped define treatment for selected RER patients, it has not improved outcome for SER patients or facilitated refinement of IFRT volumes or doses.
Enhancement of Electron Acceleration in Laser Wakefields by Random Fields
NASA Astrophysics Data System (ADS)
Tataronis, J. A.; Petržílka, V.
1999-11-01
There is increasing evidence that intense laser pulses can accelerate electrons to high energies. The energy appears to increase with the distance over which the electrons are accelerated. This is difficult to explain by electron trapping in a single wakefield wave [1]. We demonstrate that enhanced electron acceleration can arise in inhomogeneous laser wakefields through the effects of spontaneously excited random fields. This acceleration mechanism is analogous to fast electron production by random fields near rf antennae in fusion devices and helicon plasma sources [2]. Electron acceleration in a transverse laser wave due to random field effects was recently found [3]. In the present study we solve numerically the governing equations of an ensemble of test electrons in a longitudinal electric wakefield perturbed by random fields. Supported by the Czech grant IGA A1043701 and the U.S. DOE under grant No. DE-FG02-97ER54398. 1. A. Pukhov and J. Meyer-ter-Vehn, in Superstrong Fields in Plasmas, AIP Conf. Proc. 426, p. 93 (1997). 2. V. Petržílka, J. A. Tataronis, et al., in Proc. Varenna-Lausanne Fusion Theory Workshop, p. 95 (1998). 3. J. Meyer-ter-Vehn and Z. M. Sheng, Phys. Plasmas 6, 641 (1999).
Ma, Li; Fan, Suohai
2017-03-14
The random forests algorithm is a widely used classifier with broad applicability and robustness against overfitting, but it still has some drawbacks. Therefore, to improve the performance of random forests, this paper seeks to improve imbalanced data processing, feature selection, and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that combining Clustering Using Representatives (CURE) with the original synthetic minority oversampling technique (SMOTE) is effective compared with the classification results on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm has been proposed for feature selection and parameter optimization, which uses the minimum out-of-bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms (the hybrid genetic-random forests algorithm, the hybrid particle swarm-random forests algorithm, and the hybrid fish swarm-random forests algorithm) can achieve the minimum OOB error and show the best generalization ability. The training set produced by the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise. Thus, better classification results are produced by this feasible and effective algorithm. Moreover, the hybrid algorithms' F-value, G-mean, AUC, and OOB scores surpass those of the original RF algorithm. Hence, these hybrid algorithms provide a new way to perform feature selection and parameter optimization.
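A minimal sketch of the SMOTE interpolation step that CURE-SMOTE builds on: new minority points are drawn on segments between a minority sample and one of its k nearest minority neighbors. The CURE clustering stage and the random-forest stages are omitted, and the data and parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(7)

def smote(minority, n_new, k=5):
    """Generate n_new synthetic minority samples by interpolating between
    each chosen sample and one of its k nearest minority neighbors.
    (CURE-SMOTE additionally clusters with CURE first to drop noise
    and outliers before oversampling.)"""
    out = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        d = np.linalg.norm(minority - minority[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # skip the point itself
        j = rng.choice(nbrs)
        gap = rng.random()                     # position along the segment
        out.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(out)

minority = rng.normal(0, 1, size=(20, 2))      # the under-represented class
synthetic = smote(minority, n_new=80)
print(synthetic.shape)   # class sizes can now be balanced before training
```

Because every synthetic point lies on a segment between two real minority points, the oversampled set stays inside the minority region, unlike naive random duplication.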
Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z² have been extensively used in image analysis in a Bayesian framework as a priori models for the ... of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics, and Lattice-based Euclidean Quantum Field Theory.
Andersen, Søren K; Müller, Matthias M; Hillyard, Steven A
2015-07-08
Experiments that study feature-based attention have often examined situations in which selection is based on a single feature (e.g., the color red). However, in more complex situations relevant stimuli may not be set apart from other stimuli by a single defining property but by a specific combination of features. Here, we examined sustained attentional selection of stimuli defined by conjunctions of color and orientation. Human observers attended to one out of four concurrently presented superimposed fields of randomly moving horizontal or vertical bars of red or blue color to detect brief intervals of coherent motion. Selective stimulus processing in early visual cortex was assessed by recordings of steady-state visual evoked potentials (SSVEPs) elicited by each of the flickering fields of stimuli. We directly contrasted attentional selection of single features and feature conjunctions and found that SSVEP amplitudes on conditions in which selection was based on a single feature only (color or orientation) exactly predicted the magnitude of attentional enhancement of SSVEPs when attending to a conjunction of both features. Furthermore, enhanced SSVEP amplitudes elicited by attended stimuli were accompanied by equivalent reductions of SSVEP amplitudes elicited by unattended stimuli in all cases. We conclude that attentional selection of a feature-conjunction stimulus is accomplished by the parallel and independent facilitation of its constituent feature dimensions in early visual cortex. The ability to perceive the world is limited by the brain's processing capacity. Attention affords adaptive behavior by selectively prioritizing processing of relevant stimuli based on their features (location, color, orientation, etc.). 
We found that attentional mechanisms for selection of different features belonging to the same object operate independently and in parallel: concurrent attentional selection of two stimulus features is simply the sum of attending to each of those features separately. This result is key to understanding attentional selection in complex (natural) scenes, where relevant stimuli are likely to be defined by a combination of stimulus features.
Random isotropic one-dimensional XY-model
NASA Astrophysics Data System (ADS)
Gonçalves, L. L.; Vieira, A. P.
1998-01-01
The 1D isotropic s = ½ XY model (N sites), with random exchange interactions in a transverse random field, is considered. The random variables satisfy bimodal quenched distributions. The solution is obtained by using the Jordan-Wigner fermionization and a canonical transformation, reducing the problem to diagonalizing an N × N matrix, corresponding to a system of N noninteracting fermions. The calculations are performed numerically for N = 1000, and the field-induced magnetization at T = 0 is obtained by averaging the results for the different samples. For the dilute case, in the uniform field limit, the magnetization exhibits various discontinuities, which are the consequence of the existence of disconnected finite clusters distributed along the chain. Also in this limit, for finite exchange constants J_A and J_B, as the probability of J_A varies from one to zero, the saturation field is seen to vary from Γ_A to Γ_B, where Γ_A (Γ_B) is the value of the saturation field for the pure case with exchange constant equal to J_A (J_B).
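Under the Jordan-Wigner mapping the whole computation reduces to diagonalizing an N × N tridiagonal one-particle matrix. The sketch below uses an assumed convention for that matrix (signs and normalization vary between papers) and bimodal disorder values chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000

# Bimodal quenched disorder: couplings J_A or J_B, fields Gamma_A or Gamma_B
# (values below are illustrative, not taken from the paper).
J = rng.choice([1.0, 0.5], size=n - 1)
h = rng.choice([0.8, 0.2], size=n)

# Assumed free-fermion one-particle Hamiltonian for the isotropic XY chain
# in a transverse field: fields on the diagonal, J/2 hopping off-diagonal.
A = np.diag(h) + np.diag(J / 2, 1) + np.diag(J / 2, -1)
modes = np.linalg.eigvalsh(A)          # single-particle mode energies, sorted

# T = 0: all negative-energy modes are occupied; the transverse
# magnetization follows from the occupied-mode count.
ground_energy = modes[modes < 0].sum()
mz = 0.5 - (modes < 0).sum() / n
print(ground_energy, mz)
```

Averaging such per-sample results over many disorder realizations gives the quenched-averaged magnetization curves described in the abstract.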
The Selection of a National Random Sample of Teachers for Experimental Curriculum Evaluation.
ERIC Educational Resources Information Center
Welch, Wayne W.; And Others
Members of the evaluation section of Harvard Project Physics, describing what is said to be the first attempt to select a national random sample of (high school physics) teachers, list the steps as (1) purchase of a list of physics teachers from the National Science Teachers Association (most complete available), (2) selection of 136 names by a…
Critical review of immediate implant loading.
Gapski, Ricardo; Wang, Hom-Lay; Mascarenhas, Paulo; Lang, Niklaus P
2003-10-01
Implant dentistry has become successful with the discovery of the biological properties of titanium. The original protocol advocated a 2-stage surgical procedure with load-free, submerged healing to ensure predictable osseointegration. However, the discomfort, inconvenience, and anxiety associated with the waiting period remain a challenge to both patients and clinicians. Hence, loading the implant immediately after placement has been attempted and has gained popularity among clinicians, although issues related to this approach remain unanswered. Therefore, the purpose of this review article is to (1) critically review and analyze the currently available literature in the field of immediate implant loading and (2) discuss, based on scientific evidence, factors that may influence this treatment modality. Literature published over the past 20 years was selected and reviewed. Findings from these studies were discussed and summarized in tables. The advantages and disadvantages associated with immediate implant loading were analyzed. Factors that may influence the success of immediate implant loading, including patient selection, bone quality, required implant length, micro- and macrostructure of the implant, surgical skill, the need to achieve primary stability and control occlusal force, and prosthesis guidelines, were thoroughly reviewed and discussed. Various studies have demonstrated the feasibility and predictability of this technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials report primarily short-term results, and long-term follow-ups are still scarce in this field. Nonetheless, from the available literature, it may be concluded that anatomic location, implant design, and restricted prosthetic guidelines are key to ensuring successful outcomes.
Future studies, preferably randomized, prospective longitudinal studies, are certainly needed before this approach can be widely used.
Unbiased feature selection in learning random forests for high-dimensional data.
Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi
2015-01-01
Random forests (RFs) have been widely used as a powerful classification method. However, with randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting, which gives RFs poor accuracy on high-dimensional data. RFs also exhibit bias in the feature selection process, favoring multivalued features. To debias feature selection in RFs, we propose a new RF algorithm, called xRF, that selects good features for learning RFs on high-dimensional data. We first remove uninformative features using p-value assessment, and a subset of unbiased features is then selected based on statistical measures. This feature subset is partitioned into two subsets, and a feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach yields more accurate trees while reducing dimensionality and the amount of data needed for learning RFs. An extensive set of experiments was conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperformed existing random forests in both accuracy and AUC.
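The pipeline described (filter uninformative features, partition the survivors into two subsets, sample a per-tree feature subspace with weights) can be caricatured in a few lines. The sketch below is a loose, simplified stand-in, not the authors' xRF: it scores features by a standardized mean difference instead of a formal p-value test, and the function name, thresholds, and the 80/20 strong/weak split are invented for illustration.

```python
import numpy as np

def weighted_feature_sample(X, y, n_keep, frac_strong=0.5, seed=0):
    """Caricature of xRF-style feature weighting sampling (two classes).

    Scores each feature by an absolute standardized mean difference (a
    stand-in for the paper's p-value assessment), drops the weakest half,
    splits the survivors into 'strong' and 'weak' subsets, and draws a
    per-tree feature subspace mostly from the strong subset.
    """
    rng = np.random.default_rng(seed)
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    score = np.abs(m1 - m0) / (X.std(axis=0) + 1e-12)
    idx = np.argsort(score)[::-1]            # features, most informative first
    keep = idx[: len(idx) // 2]              # discard the uninformative half
    cut = max(1, int(len(keep) * frac_strong))
    strong, weak = keep[:cut], keep[cut:]
    n_strong = min(len(strong), int(np.ceil(0.8 * n_keep)))  # 80% from strong
    pick = list(rng.choice(strong, size=n_strong, replace=False))
    rest = n_keep - n_strong
    if rest > 0 and len(weak) > 0:
        pick += list(rng.choice(weak, size=min(len(weak), rest), replace=False))
    return np.array(pick)

# Toy data: features 0 and 1 informative among 22 total.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 22))
y = (rng.random(200) < 0.5).astype(int)
X[y == 1, 0] += 2.0
X[y == 1, 1] += 1.5
subspace = weighted_feature_sample(X, y, n_keep=5)
```

Each tree in the forest would be grown on its own such subspace, biasing node splits toward informative features.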
Applications of random forest feature selection for fine-scale genetic population assignment.
Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G
2018-02-01
Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest, and guided regularized random forest) compared with F_ST ranking for selecting single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for the two data sets, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
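A minimal sketch of the comparison baseline: rank markers by an F_ST-like between/within variance ratio, then score a panel with a crude nearest-centroid self-assignment test. Both functions, their names, and the toy two-population data are assumptions for illustration; real F_ST estimators (e.g. Weir-Cockerham) and the paper's assignment tests are considerably more involved.

```python
import numpy as np

def fst_like_rank(G, pop):
    """Rank SNPs by a crude between/within variance ratio of allele
    frequencies (an F_ST-flavored score, not a proper Weir-Cockerham
    estimator).  G: (individuals x SNPs) genotypes coded 0/1/2."""
    pops = np.unique(pop)
    p = np.array([G[pop == k].mean(axis=0) / 2 for k in pops])  # allele freqs
    pbar = p.mean(axis=0)
    ratio = p.var(axis=0) / (pbar * (1 - pbar) + 1e-12)
    return np.argsort(ratio)[::-1]           # SNPs, most differentiated first

def self_assign_accuracy(G, pop, panel):
    """Nearest-centroid self-assignment on a marker panel -- a crude
    proxy for the sensitive assignment tests used in practice."""
    pops = np.unique(pop)
    cents = np.array([G[pop == k][:, panel].mean(axis=0) for k in pops])
    d = ((G[:, panel][:, :, None] - cents.T[None, :, :]) ** 2).sum(axis=1)
    return (pops[d.argmin(axis=1)] == pop).mean()

# Toy data: two populations, SNPs 0-4 strongly differentiated.
rng = np.random.default_rng(2)
fA = np.full(30, 0.5); fA[:5] = 0.1
fB = np.full(30, 0.5); fB[:5] = 0.9
G = np.vstack([rng.binomial(2, fA, size=(100, 30)),
               rng.binomial(2, fB, size=(100, 30))])
pop = np.array([0] * 100 + [1] * 100)
panel = fst_like_rank(G, pop)[:5]
acc = self_assign_accuracy(G, pop, panel)
```

Swapping the ranking function for a random-forest importance measure reproduces the paper's comparison in miniature.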
High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes
NASA Astrophysics Data System (ADS)
Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew
Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes were investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.
Prospecting of popcorn hybrids for resistance to fall armyworm.
Crubelati-Mulati, N C S; Scapim, C A; Albuquerque, F A; Amaral Junior, A T; Vivas, M; Rodovalho, M A
2014-08-26
The fall armyworm, Spodoptera frugiperda, is the pest that causes the greatest economic losses for both common corn and popcorn crops, and the use of resistant plant genotypes is an important tool for integrated pest management. The goal of the present study was to evaluate the damage caused by S. frugiperda on single-cross popcorn hybrids under field conditions with natural infestation as well as to study the effect of 11 popcorn hybrids on the S. frugiperda life cycle under laboratory conditions. A completely randomized block design with 4 replicates was used for the field experiment, and a completely randomized design with 10 replicates was used for the laboratory experiment. In the field experiment, the damage caused by fall armyworm, grain yield, and popping expansion were quantified, and a diallel analysis was performed to select the best hybrids. For the laboratory experiment, caterpillars were obtained from laboratory cultures kept on an artificial diet and were fed with leaves from the 11 hybrids. Hybrids P7.0 x P9.4, P7.1 x P9.6, P7.2.0 x P9.3, P7.4.0 x P9.1 and P7.4.1 x P9.4 exhibited negative specific combining ability for injury by fall armyworm and positive specific combining ability for yield and popping expansion. In the laboratory experiment, the hybrids influenced the mean larval stage duration, mean larval mass, final larval mass, pupal stage duration, mean pupal mass, and adult longevity.
Robust portfolio selection based on asymmetric measures of variability of stock returns
NASA Astrophysics Data System (ADS)
Chen, Wei; Tan, Shaohua
2009-10-01
This paper addresses a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.
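The key ingredient, asymmetric deviation measures, is easy to illustrate: downside and upside semideviations split the dispersion of returns around the mean. The sketch below computes them and applies a toy score-and-normalize weighting; it is not the paper's interval random chance-constrained program, and the function names and penalty parameter `lam` are invented for illustration.

```python
import numpy as np

def semideviations(R):
    """Downside/upside semideviations per asset; R: (time x assets)."""
    d = R - R.mean(axis=0)
    down = np.sqrt(np.mean(np.minimum(d, 0) ** 2, axis=0))  # below-mean spread
    up = np.sqrt(np.mean(np.maximum(d, 0) ** 2, axis=0))    # above-mean spread
    return R.mean(axis=0), down, up

def naive_asymmetric_weights(R, lam=1.0):
    """Score assets by mean return minus a downside-deviation penalty and
    normalize nonnegative scores into weights -- a toy stand-in for the
    paper's program, illustrating only the asymmetry idea."""
    mu, down, _ = semideviations(R)
    score = np.maximum(mu - lam * down, 0.0)
    if score.sum() == 0.0:
        return np.full(len(mu), 1.0 / len(mu))   # fall back to equal weights
    return score / score.sum()

# Toy returns: asset 0 steady gains, asset 1 volatile, asset 2 flat.
R = np.array([[0.02,  0.05, 0.0],
              [0.01, -0.06, 0.0],
              [0.03,  0.04, 0.0],
              [0.02, -0.03, 0.0]])
w = naive_asymmetric_weights(R)
```

Penalizing only the downside semideviation is what distinguishes this from a symmetric mean-variance score.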
Engineering RNA phage MS2 virus-like particles for peptide display
NASA Astrophysics Data System (ADS)
Jordan, Sheldon Keith
Phage display is a powerful and versatile technology that enables the selection of novel binding functions from large populations of randomly generated peptide sequences. Random sequences are genetically fused to a viral structural protein to produce complex peptide libraries. From a sufficiently complex library, phage bearing peptides with practically any desired binding activity can be physically isolated by affinity selection, and, since each particle carries in its genome the genetic information for its own replication, the selectants can be amplified by infection of bacteria. For certain applications, however, existing phage display platforms have limitations. One such area is vaccine development, where the goal is to identify relevant epitopes by affinity selection against an antibody target and then to utilize them as immunogens to elicit a desired antibody response. Today, affinity selection is usually conducted using display on filamentous phages like M13. This technology provides an efficient means for epitope identification but, because filamentous phages do not display peptides in the high-density, multivalent arrays the immune system prefers to recognize, they generally make poor immunogens and are typically useless as vaccines. This makes it necessary to confer immunogenicity by conjugating synthetic versions of the peptides to more immunogenic carriers. Unfortunately, when introduced into these new structural environments, the epitopes often fail to elicit relevant antibody responses. Thus, it would be advantageous to combine the epitope selection and immunogen functions into a single platform in which the structural constraints present during affinity selection are preserved during immunization. This dissertation describes efforts to develop a peptide display system based on the virus-like particles (VLPs) of bacteriophage MS2.
Phage display technologies rely on (1) the identification of a site in a viral structural protein that is present on the surface of the virus particle and can accept foreign sequence insertions without disruption of protein folding and viral particle assembly, and (2) the encapsidation of nucleic acid sequences encoding both the VLP and the peptide it displays. The experiments described here are aimed at satisfying the first of these two requirements by engineering efficient peptide display at two different sites in the MS2 coat protein. First, we evaluated the suitability of the N-terminus of MS2 coat protein for peptide insertions. We observed that random N-terminal 10-mer fusions generally disrupted protein folding and VLP assembly, but that bracketing the foreign sequences with certain specific dipeptides suppressed these defects. Next, the suitability of a coat protein surface loop for foreign sequence insertion was tested. Specifically, random sequence peptides were inserted into the N-terminal-most AB-loop of a coat protein single-chain dimer. Again, we found that efficient display required the presence of appropriate dipeptides bracketing the peptide insertion. Finally, it was shown that an N-terminal fusion that tended to interfere specifically with capsid assembly could be efficiently incorporated into mosaic particles when co-expressed with wild-type coat protein.
Impact of an educational intervention on medical records documentation.
Vahedi, Hojat Sheikhmotahar; Mirfakhrai, Minasadat; Vahidi, Elnaz; Saeedi, Morteza
2018-01-01
Inaccurate and incomplete documentation can lead to poor treatment and medico-legal consequences. Studies indicate that teaching programs in this field can improve the documentation of medical records. This study aimed to evaluate the effect of an educational workshop on medical record documentation by emergency medicine residents in the emergency department. An interventional study was performed on 30 residents in their first year of emergency medicine training (PGY1) in three tertiary referral hospitals of Tehran University of Medical Sciences. The essential information that should be documented in a medical record was taught in a 3-day workshop. Medical records completed by these residents before the training workshop were randomly selected and scored (300 records), as was a random selection of the records they completed one month (300 records) and six months (300 records) after the workshop. Documentation of the majority of the essential items of information improved significantly after the workshop, in particular documentation of the patients' date and time of admission and past medical and social history. Documentation of patient identity, requests for consultations by other specialties, and first and final diagnoses was 100% complete and accurate up to 6 months after the workshop. This study confirms that an educational workshop improves medical record documentation by physicians in training.
Adaptive consensus of scale-free multi-agent system by randomly selecting links
NASA Astrophysics Data System (ADS)
Mou, Jinping; Ge, Huafeng
2016-06-01
This paper investigates an adaptive consensus problem for distributed scale-free multi-agent systems (SFMASs) with randomly selected links, where the degree of each node follows a power-law distribution. The random link selection is based on the assumption that every agent decides, with a certain probability, which links to its neighbours to select according to the received data. Accordingly, a novel consensus protocol based on the range of the received data is developed, and each node updates its state according to the protocol. Using an iterative method and the Cauchy inequality, the theoretical analysis shows that all errors among agents converge to zero, and several criteria for consensus are obtained. A numerical example shows the reliability of the proposed methods.
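The flavor of consensus over randomly selected links can be reproduced with a small simulation. This is a generic gossip-style sketch under assumed dynamics (each link activates independently with probability p, each agent moves a step eps toward the mean of the states it receives), not the paper's adaptive protocol; the graph, parameters, and function name are illustrative.

```python
import numpy as np

def random_link_consensus(A, x0, steps=200, p=0.6, eps=0.3, seed=0):
    """Gossip-style sketch: each round, every agent activates each of its
    links independently with probability p and moves a step eps toward
    the average state received over its active links.  This illustrates
    consensus via randomly selected links, not the paper's protocol."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = len(x)
    for _ in range(steps):
        active = (rng.random((n, n)) < p) & (A > 0)   # per-round link draw
        new = x.copy()
        for i in range(n):
            nb = np.flatnonzero(active[i])
            if nb.size:
                new[i] = x[i] + eps * (x[nb].mean() - x[i])
        x = new
    return x

# Hub-and-spoke graph (a scale-free flavor): agent 0 linked to all others.
A = np.zeros((6, 6), dtype=int)
A[0, 1:] = A[1:, 0] = 1
x = random_link_consensus(A, np.array([5.0, -3.0, 2.0, 0.0, 7.0, -1.0]))
spread = x.max() - x.min()   # consensus error across agents
```

The spread of states shrinks geometrically toward zero, mirroring the paper's claim that all errors among agents converge.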
Robarts, Daniel W H; Wolfe, Andrea D
2014-07-01
In the past few decades, many investigations in the field of plant biology have employed selectively neutral, multilocus, dominant markers such as inter-simple sequence repeat (ISSR), random-amplified polymorphic DNA (RAPD), and amplified fragment length polymorphism (AFLP) to address hypotheses at lower taxonomic levels. More recently, sequence-related amplified polymorphism (SRAP) markers have been developed, which are used to amplify coding regions of DNA with primers targeting open reading frames. These markers have proven to be robust and highly variable, on par with AFLP, and are attained through a significantly less technically demanding process. SRAP markers have been used primarily for agronomic and horticultural purposes, developing quantitative trait loci in advanced hybrids and assessing genetic diversity of large germplasm collections. Here, we suggest that SRAP markers should be employed for research addressing hypotheses in plant systematics, biogeography, conservation, ecology, and beyond. We provide an overview of the SRAP literature to date, review descriptive statistics of SRAP markers in a subset of 171 publications, and present relevant case studies to demonstrate the applicability of SRAP markers to the diverse field of plant biology. Results of these selected works indicate that SRAP markers have the potential to enhance the current suite of molecular tools in a diversity of fields by providing an easy-to-use, highly variable marker with inherent biological significance.
AGES: THE AGN AND GALAXY EVOLUTION SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kochanek, C. S.; Eisenstein, D. J.; Caldwell, N.
2012-05-01
The AGN and Galaxy Evolution Survey (AGES) is a redshift survey covering, in its standard fields, 7.7 deg^2 of the Boötes field of the NOAO Deep Wide-Field Survey. The final sample consists of 23,745 redshifts. There are well-defined galaxy samples in 10 bands (the B_W, R, I, J, K, IRAC 3.6, 4.5, 5.8, and 8.0 μm, and MIPS 24 μm bands) to a limiting magnitude of I < 20 mag for spectroscopy. For these galaxies, we obtained 18,163 redshifts from a sample of 35,200 galaxies, where random sparse sampling was used to define statistically complete sub-samples in all 10 photometric bands. The median galaxy redshift is 0.31, and 90% of the redshifts are in the range 0.085 < z < 0.66. Active galactic nuclei (AGNs) were selected as radio, X-ray, IRAC mid-IR, and MIPS 24 μm sources to fainter limiting magnitudes (I < 22.5 mag for point sources). Redshifts were obtained for 4764 quasars and galaxies with AGN signatures, with 2926, 1718, 605, 119, and 13 above redshifts of 0.5, 1, 2, 3, and 4, respectively. We detail all the AGES selection procedures and present the complete spectroscopic redshift catalogs and spectral energy distribution decompositions. Photometric redshift estimates are provided for all sources in the AGES samples.
Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas
2017-04-15
The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
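The split criterion at the heart of the method, a maximally selected rank statistic, can be sketched directly: order observations by the candidate covariate, standardize the left-group rank sum at every cut point under the permutation null, and take the maximum. The code below is an illustrative simplification (plain outcome ranks instead of log-rank scores, and no p-value approximation for the maximum); the function name is invented.

```python
import numpy as np

def maximally_selected_rank_stat(x, r):
    """Maximally selected standardized rank statistic for splitting on x.

    r holds outcome ranks (in random survival forests these would be
    log-rank scores).  For every candidate cut point the left group's
    rank sum is standardized with its permutation-null mean and variance
    (sampling without replacement), and the maximum absolute value over
    cut points is returned together with the corresponding cut.
    """
    n = len(x)
    order = np.argsort(x)
    rs = r[order].astype(float)
    csum = np.cumsum(rs)[:-1]                  # left-group rank sums
    m = np.arange(1, n)                        # left-group sizes
    mu = m * rs.mean()                         # null mean of the sum
    var = m * (n - m) / (n - 1) * rs.var()     # null variance (no replacement)
    z = np.abs(csum - mu) / np.sqrt(var + 1e-12)
    k = int(z.argmax())
    cut = (x[order][k] + x[order][k + 1]) / 2  # midpoint between neighbors
    return z[k], cut

# Monotone toy data: outcome rank increases with x, so the best split
# sits in the middle of the covariate range.
x = np.arange(20.0)
r = np.arange(1, 21)
z, cut = maximally_selected_rank_stat(x, r)
```

Because every cut point is standardized to the same null scale, variables with many split points gain no automatic advantage, which is the debiasing idea the abstract describes.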
Dynamic Grover search: applications in recommendation systems and optimization problems
NASA Astrophysics Data System (ADS)
Chakrabarty, Indranil; Khan, Shahzor; Singh, Vanshdeep
2017-06-01
In recent years, the Grover search algorithm (Proceedings, 28th Annual ACM Symposium on the Theory of Computing, pp. 212-219, 1996), by using quantum parallelism, has revolutionized the solution of a huge class of NP problems in comparison to classical systems. In this work, we explore the idea of extending the Grover search algorithm to approximate algorithms. We analyze the applicability of Grover search to processing an unstructured database with a dynamic selection function, in contrast to the static selection function used in the original work. We show that this alteration allows us to extend the application of Grover search to the field of randomized search algorithms. Further, we use the dynamic Grover search algorithm to define the goals for a recommendation system, based on which we propose a recommendation algorithm that uses a binomial similarity distribution space, giving a quadratic speedup over traditional classical unstructured recommendation systems. Finally, we show how dynamic Grover search can be used to tackle a wide range of optimization problems, improving complexity over existing optimization algorithms.
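A classical statevector simulation makes the Grover iteration concrete: the oracle phase-flips the marked items and the diffusion step inverts all amplitudes about their mean, so O(√N) iterations suffice to find a marked item with high probability. This is a generic Grover sketch, not the paper's dynamic variant; making the selection function "dynamic" would amount to recomputing the marked set between iterations, as noted in the docstring.

```python
import numpy as np

def grover_probability(n_items, marked, iterations):
    """Statevector simulation of Grover search over an unstructured list:
    the oracle phase-flips the marked items and the diffusion operator
    inverts all amplitudes about their mean.  A 'dynamic' selection
    function would recompute `marked` between iterations; here the
    marked set is static, as in the original algorithm."""
    amp = np.full(n_items, 1.0 / np.sqrt(n_items))  # uniform superposition
    for _ in range(iterations):
        amp[marked] *= -1.0                  # oracle call: phase flip
        amp = 2.0 * amp.mean() - amp         # diffusion: invert about mean
    return float((amp[marked] ** 2).sum())   # prob. of a marked outcome

N = 64
k_opt = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~optimal iteration count
p_hit = grover_probability(N, marked=[13], iterations=k_opt)
```

After about (π/4)√N iterations the success probability is close to 1, versus 1/N for a single uniform classical guess; this gap is the quadratic speedup the abstract invokes.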
Application of GIS-based Procedure on Slopeland Use Classification and Identification
NASA Astrophysics Data System (ADS)
KU, L. C.; LI, M. C.
2016-12-01
In Taiwan, the "Slopeland Conservation and Utilization Act" regulates the management of slopelands. It categorizes slopeland into land suitable for agriculture or animal husbandry, land suitable for forestry, and land requiring enhanced conservation, according to the environmental factors of average slope, effective soil depth, soil erosion, and parent rock. Traditionally, investigation of these environmental factors requires extensive field work and has faced many practical issues, such as non-evaluated cadastral parcels, evaluation results that depend on expert opinion, difficulties in field measurement and judgment, and time consumption. This study aimed to develop a GIS-based procedure to accelerate slopeland use classification and improve its quality. First, the environmental factors of the slopelands were analyzed with GIS and SPSS software, drawing on the digital elevation model (DEM), soil depth map, land use map, and satellite images. Second, 5% of the analyzed slopelands were selected for site investigation to correct the classification results. Finally, a second examination was performed on a randomly selected 2% of the analyzed slopelands to evaluate accuracy. The results showed that the developed procedure is effective for slopeland use classification and identification. Keywords: Slopeland Use Classification, GIS, Management
(Electro)Mechanical Properties of Olefinic Block Copolymers
NASA Astrophysics Data System (ADS)
Spontak, Richard
2014-03-01
Conventional styrenic triblock copolymers (SBCs) swollen with a midblock-selective oil have been previously shown to exhibit excellent electromechanical properties as dielectric elastomers. In this class of electroactive polymers, compliant electrodes applied as active areas to opposing surfaces of an elastomer attract each other, and thus compress the elastomer due to the onset of a Maxwell stress, upon application of an external electric field. This isochoric process is accompanied by an increase in lateral area, which yields the electroactuation strain (measuring beyond 300% in SBC systems). Performance parameters such as the Maxwell stress, transverse strain, dielectric breakdown, energy density and electromechanical efficiency are determined directly from the applied electric field and resulting electroactuation strain. In this study, the same principle used to evaluate SBC systems is extended to olefinic block copolymers (OBCs), which can be described as randomly-coupled multiblock copolymers that consist of crystallizable polyethylene hard segments and rubbery poly(ethylene-co-octene) soft segments. Considerations governing the development of a methodology to fabricate electroresponsive OBC systems are first discussed for several OBCs differing in composition and bulk properties. Evidence of electroactuation in selectively-solvated OBC systems is presented and performance metrics measured therefrom are quantitatively compared with dielectric elastomers derived from SBC and related materials.
Effect of Expanding Medicaid for Parents on Children’s Health Insurance Coverage
DeVoe, Jennifer E.; Marino, Miguel; Angier, Heather; O’Malley, Jean P.; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J.; Bailey, Steffani R.; Gallia, Charles; Gold, Rachel
2016-01-01
IMPORTANCE In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children, which persisted even after expansions in child-only public health insurance. Oregon's randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. OBJECTIVE To estimate the effect on a child's health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. DESIGN, SETTING, AND PARTICIPANTS Randomized natural experiment assessing the results of Oregon's 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent randomly selected to apply for Medicaid on their child's Medicaid or Children's Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children's coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14,409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. EXPOSURES For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. MAIN OUTCOMES AND MEASURES Children's Medicaid or CHIP coverage, assessed monthly and in 6-month intervals relative to their parent's selection date. RESULTS In the immediate period after selection, the number of covered children whose parents were selected to apply increased significantly from 3830 (61.4%) to 4152 (66.6%), compared with a nonsignificant change from 5049 (61.8%) to 5044 (61.7%) for children whose parents were not selected to apply.
Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after parent’s selection compared with children whose parents were not selected (adjusted odds ratio [AOR] = 1.18; 95% CI, 1.10–1.27). The effect remained significant during months 7 to 12 (AOR = 1.11; 95% CI, 1.03–1.19); months 13 to 18 showed a positive but not significant effect (AOR = 1.07; 95% CI, 0.99–1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR = 2.37; 95% CI, 2.14–2.64). CONCLUSIONS AND RELEVANCE Children’s odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents’ access to Medicaid coverage and their children’s coverage. PMID:25561041
Oligonucleotide probes functionalization of nanogap electrodes.
Zaffino, Rosa Letizia; Mir, Mònica; Samitier, Josep
2017-11-01
Nanogap electrodes have attracted much consideration as a promising platform for molecular electronics and biomolecule detection, mainly because of their high aspect ratio and because their electrical properties are easily accessed by current-voltage measurements. Nevertheless, applying the standard current-voltage measurements used to characterize the nanogap response, and/or to modify specific nanogap electrode properties, is an issue, since the strength of the electric field in nanoscale devices can reach high values even at low voltages. Here, we analyzed the effects induced by different methods of surface modification of nanogap electrodes, under test-voltage application, employed for the electrical detection of a deoxyribonucleic acid (DNA) target. Nanogap electrodes were functionalized with two antisymmetric oligo-probes designed to have 20 terminal bases complementary to the edges of the target, which after hybridization bridges the nanogap, closing the electrical circuit. Two methods of functionalization were studied for this purpose: random self-assembly of a mixture of the two oligo-probes (OPs) used in the platform, and a selective method that controls the position of each OP at a selected side of the nanogap electrodes. For the latter, we used the electrophoretic effect induced on negatively charged probes by the application of an external direct-current voltage. The results obtained with both functionalization methods were characterized and compared in terms of electrode surface coverage, calculated using voltammetry analysis. Moreover, we contrasted the electrical detection of a DNA target in the nanogap platform in both site-selective and randomly assembled nanogaps. According to our results, a denser, although not selective, surface functionalization is advantageous for such applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ultrafast random-access scanning in two-photon microscopy using acousto-optic deflectors.
Salomé, R; Kremer, Y; Dieudonné, S; Léger, J-F; Krichevsky, O; Wyart, C; Chatenay, D; Bourdieu, L
2006-06-30
Two-photon scanning microscopy (TPSM) is a powerful tool for imaging deep inside living tissues with sub-cellular resolution. The temporal resolution of TPSM is, however, strongly limited by the galvanometric mirrors used to steer the laser beam. Fast physiological events can therefore only be followed by repeatedly scanning a single line within the field of view. Because acousto-optic deflectors (AODs) are non-mechanical devices, they allow access to any point within the field of view on a microsecond time scale and are therefore excellent candidates to improve the temporal resolution of TPSM. However, the use of AOD-based scanners with femtosecond pulses raises several technical difficulties. In this paper, we describe an all-digital TPSM setup based on two crossed AODs. It includes in particular an acousto-optic modulator (AOM) placed at 45 degrees with respect to the AODs to pre-compensate for the large spatial distortions that femtosecond pulses undergo in the AODs, in order to optimize the spatial resolution and the fluorescence excitation. Our setup allows recording from freely selectable points of interest at high speed (1 kHz). By maximizing the time spent on points of interest, random-access TPSM (RA-TPSM) constitutes a promising method for multiunit recordings with millisecond resolution in biological tissues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, Allen M.
The goal of this program was to study new physical phenomena that might be relevant to the performance of conductive devices and circuits of the smallest realizable feature sizes possible using physical rather than biological techniques. Although the initial scientific work supported involved the use of scanning tunneling microscopy and spectroscopy to ascertain the statistics of the energy level distribution of randomly sized and randomly shaped quantum dots, or nano-crystals, the main focus was on the investigation of selected properties, including superconductivity, of conducting and superconducting nanowires prepared using electron-beam lithography. We discovered a magnetic-field restoration of superconductivity in out-of-equilibrium nanowires driven resistive by current. This phenomenon was explained by the existence of a state in which dissipation coexisted with nonvanishing superconducting order. We also produced ultra-small superconducting loops to study a predicted anomalous fluxoid quantization, but instead found a magnetic-field-dependent, high-resistance state, rather than superconductivity. Finally, we developed a simple and controllable nanowire in an induced charged layer near the surface of a masked single-crystal insulator, SrTiO3. The layer was induced using an electric double layer transistor employing an ionic liquid (IL). The transport properties of the induced nanowire resembled those of collective electronic transport through an array of quantum dots.
Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.
2014-01-01
Background Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504
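The instrumental-variables logic of this study can be sketched numerically. The simulation below is a hypothetical illustration (variable names, effect sizes, and sample size are invented, not taken from the trials): randomization serves as the instrument, and the Wald/IV ratio recovers the causal effect of attendance even though a hidden "motivation" confounder biases the naive regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unobserved "motivation" confounder: raises both AA attendance and abstinence.
u = rng.normal(size=n)
# Randomized AA-facilitation assignment (the instrument).
z = rng.integers(0, 2, size=n)
# Attendance depends on assignment and on the confounder (self-selection).
attendance = 1.0 * z + 0.8 * u + rng.normal(size=n)
# True causal effect of attendance on days abstinent is 0.4 in this toy model.
abstinence = 0.4 * attendance + 0.8 * u + rng.normal(size=n)

# Naive OLS slope is confounded (biased upward by self-selection).
ols = np.cov(attendance, abstinence)[0, 1] / np.var(attendance)
# Wald/IV estimator: ratio of reduced-form to first-stage covariances.
iv = np.cov(z, abstinence)[0, 1] / np.cov(z, attendance)[0, 1]

print(f"OLS (confounded): {ols:.2f}, IV (selection-free): {iv:.2f}")
```

Randomization is a valid instrument here because it shifts attendance but can affect the outcome only through attendance, which is exactly the condition the study exploits.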
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-02-25
It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. We plan to perform a meta-epidemiological study, in which a sample of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table, and will be considered for inclusion if they include a minimum of five RCTs and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students.
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer, such as key dental conferences, will also be pursued. Finally, the results will be published as a scientific report in a dental peer-reviewed journal.
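A random-effects pooling step of the kind this protocol's meta-meta-analysis builds on can be sketched as follows. This is a generic DerSimonian-Laird estimator with invented effect sizes and variances, not data from the planned study.

```python
import math

# Hypothetical per-trial effect sizes (e.g. standardized mean differences)
# and their within-trial variances -- illustrative numbers only.
effects = [0.10, 0.60, 0.20, 0.80, 0.30]
variances = [0.02, 0.05, 0.03, 0.04, 0.02]

# Fixed-effect (inverse-variance) weights and Cochran's Q statistic.
w = [1.0 / v for v in variances]
fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))

# DerSimonian-Laird estimate of the between-trial variance tau^2.
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights fold in tau^2, allowing for heterogeneity.
w_re = [1.0 / (v + tau2) for v in variances]
pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))

print(f"pooled={pooled:.3f} tau2={tau2:.3f} se={se:.3f}")
```

The "two-level" analysis in the protocol repeats such a pooling step across MAs, with an extra variance component for inter-MA heterogeneity.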
Spielman, Stephanie J; Wilke, Claus O
2016-11-01
The mutation-selection model of coding sequence evolution has received renewed attention for its use in estimating site-specific amino acid propensities and selection coefficient distributions. Two computationally tractable mutation-selection inference frameworks have been introduced: One framework employs a fixed-effects, highly parameterized maximum likelihood approach, whereas the other employs a random-effects Bayesian Dirichlet Process approach. While both implementations follow the same model, they appear to make distinct predictions about the distribution of selection coefficients. The fixed-effects framework estimates a large proportion of highly deleterious substitutions, whereas the random-effects framework estimates that all substitutions are either nearly neutral or weakly deleterious. It remains unknown, however, how accurately each method infers evolutionary constraints at individual sites. Indeed, selection coefficient distributions pool all site-specific inferences, thereby obscuring a precise assessment of site-specific estimates. Therefore, in this study, we use a simulation-based strategy to determine how accurately each approach recapitulates the selective constraint at individual sites. We find that the fixed-effects approach, despite its extensive parameterization, consistently and accurately estimates site-specific evolutionary constraint. By contrast, the random-effects Bayesian approach systematically underestimates the strength of natural selection, particularly for slowly evolving sites. We also find that, despite the strong differences between their inferred selection coefficient distributions, the fixed- and random-effects approaches yield surprisingly similar inferences of site-specific selective constraint. We conclude that the fixed-effects mutation-selection framework provides the more reliable software platform for model application and future development. © The Author 2016. 
Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Ssekandi, W; Mulumba, J W; Colangelo, P; Nankya, R; Fadda, C; Karungi, J; Otim, M; De Santis, P; Jarvis, D I
The bean fly (Ophiomyia spp.) is considered the most economically damaging field insect pest of common beans in Uganda. Despite the use of existing pest management approaches, reported damage has remained high. Forty-eight traditional and improved common bean varieties currently grown in farmers' fields were evaluated for resistance against bean fly. Data on bean fly incidence, severity and root damage from bean stem maggot were collected. A generalized linear mixed model (GLMM) revealed significant resistance to bean fly in the Ugandan traditional varieties. A popular resistant traditional variety and a popular susceptible commercial variety were selected from the 48 varieties and evaluated in pure and mixed stands. The incidence of bean fly infestation on both varieties in mixtures with different arrangements (systematic random versus rows), and different proportions within each of the two arrangements, was measured and analysed using GLMMs. The proportion of resistant varieties in a mixture and the arrangement type significantly decreased bean fly damage compared to pure stands, with the highest decrease in damage registered in the systematic random mixture with at least 50 % of the resistant variety. The highest reduction in root damage, evident 21 days after planting, was found in systematic random mixtures with at least 50 % of the resistant variety. Smallholder farmers in East Africa and elsewhere in the world have local preferences for growing bean varieties in genetic mixtures. These mixtures can be enhanced by the use of resistant varieties in the mixtures to reduce bean fly damage on susceptible popular varieties.
The non-equilibrium allele frequency spectrum in a Poisson random field framework.
Kaj, Ingemar; Mugal, Carina F
2016-10-01
In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.
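For orientation, the equilibrium AFS in the Poisson random field framework (the Sawyer-Hartl result that the non-equilibrium representations above generalize) takes the familiar form below; here θ denotes the scaled mutation rate and γ the scaled selection coefficient, and the exact prefactor depends on the scaling convention used.

```latex
% Expected density f(x) of derived-allele frequency x at equilibrium
% under the Poisson random field model (Sawyer & Hartl 1992):
f(x)\,\mathrm{d}x \;=\; \theta\,
  \frac{1 - e^{-2\gamma(1-x)}}{1 - e^{-2\gamma}}
  \cdot \frac{2}{x(1-x)}\,\mathrm{d}x,
\qquad 0 < x < 1.
```

In the neutral limit γ → 0 the selective factor tends to 1 − x, and the density reduces to f(x) = 2θ/x, the classic neutral frequency spectrum.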
Faggion, Clovis Mariano; Wu, Yun-Chun; Scheidgen, Moritz; Tu, Yu-Kang
2015-01-01
Risk of bias (ROB) may threaten the internal validity of a clinical trial by distorting the magnitude of treatment effect estimates, although some conflicting information on this assumption exists. The objective of this study was to evaluate the effect of ROB on the magnitude of treatment effect estimates in randomized controlled trials (RCTs) in periodontology and implant dentistry. A search for Cochrane systematic reviews (SRs), including meta-analyses of RCTs published in the periodontology and implant dentistry fields, was performed in the Cochrane Library in September 2014. Random-effect meta-analyses were performed by grouping RCTs with different levels of ROB in three domains (sequence generation, allocation concealment, and blinding of outcome assessment). To increase power and precision, only SRs with meta-analyses including at least 10 RCTs were included. Meta-regression was performed to investigate the association between ROB characteristics and the magnitudes of intervention effects in the meta-analyses. Of the 24 initially screened SRs, 21 SRs were excluded because they did not include at least 10 RCTs in the meta-analyses. Three SRs (two from the periodontology field) generated information for conducting 27 meta-analyses. Meta-regression did not reveal significant differences in the relationship of the ROB level with the size of treatment effect estimates, although a trend for inflated estimates was observed in domains with unclear ROBs. In this sample of RCTs, high and (mainly) unclear risks of selection and detection biases did not seem to influence the size of treatment effect estimates, although several confounders might have influenced the strength of the association.
Turbulent, Extreme Multi-zone Model for Simulating Flux and Polarization Variability in Blazars
NASA Astrophysics Data System (ADS)
Marscher, Alan P.
2014-01-01
The author presents a model for variability of the flux and polarization of blazars in which turbulent plasma flowing at a relativistic speed down a jet crosses a standing conical shock. The shock compresses the plasma and accelerates electrons to energies up to γmax ≳ 10^4 times their rest-mass energy, with the value of γmax determined by the direction of the magnetic field relative to the shock front. The turbulence is approximated in a computer code as many cells, each with a uniform magnetic field whose direction is selected randomly. The density of high-energy electrons in the plasma changes randomly with time in a manner consistent with the power spectral density of flux variations derived from observations of blazars. The variations in flux and polarization are therefore caused by continuous noise processes rather than by singular events such as explosive injection of energy at the base of the jet. Sample simulations illustrate the behavior of flux and linear polarization versus time that such a model produces. The variations in γ-ray flux generated by the code are often, but not always, correlated with those at lower frequencies, and many of the flares are sharply peaked. The mean degree of polarization of synchrotron radiation is higher and its timescale of variability shorter toward higher frequencies, while the polarization electric vector sometimes randomly executes apparent rotations. The slope of the spectral energy distribution exhibits sharper breaks than can arise solely from energy losses. All of these results correspond to properties observed in blazars.
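The cell-based turbulence approximation can be illustrated with a minimal Monte Carlo sketch (the cell counts and intrinsic polarization p_max are illustrative, and this toy ignores the shock compression and the prescribed power spectral density): each cell contributes Stokes parameters with a random polarization angle, so the net polarization declines roughly as 1/√N.

```python
import math
import random

def net_polarization(n_cells, p_max=0.75, rng=random):
    """Net fractional linear polarization of n_cells equal-intensity cells,
    each with a uniformly random magnetic-field position angle."""
    q = u = 0.0
    for _ in range(n_cells):
        chi = rng.uniform(0.0, math.pi)      # polarization angle of one cell
        q += p_max * math.cos(2 * chi)       # Stokes Q contribution
        u += p_max * math.sin(2 * chi)       # Stokes U contribution
    return math.sqrt(q * q + u * u) / n_cells

random.seed(1)
trials = 2000
mean_p = {n: sum(net_polarization(n) for _ in range(trials)) / trials
          for n in (25, 100, 400)}
print(mean_p)  # mean polarization falls as the number of cells grows
```

Fewer (or larger) cells therefore produce both higher mean polarization and larger random swings, consistent with the frequency dependence the model reproduces.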
Random Interchange of Magnetic Connectivity
NASA Astrophysics Data System (ADS)
Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.
2015-12-01
Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random-walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to the standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and for anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)
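The diffusive character of the field line random walk underlying the Fokker-Planck description can be illustrated with a toy simulation (amplitudes and units are illustrative, not taken from the abstract): each field line takes an independent transverse step per correlation length, so the transverse variance grows linearly with distance along the mean field.

```python
import random

random.seed(7)
b_over_B0 = 0.3      # rms transverse fluctuation amplitude (illustrative)
dz = 1.0             # step = one correlation length (illustrative units)
n_lines, n_steps = 5000, 400

# Each field line performs a transverse random walk along the mean field.
final_half, final_full = [], []
for _ in range(n_lines):
    x = 0.0
    x_half = 0.0
    for k in range(n_steps):
        x += b_over_B0 * random.gauss(0.0, 1.0) * dz
        if k == n_steps // 2 - 1:
            x_half = x               # record spread at half the distance
    final_half.append(x_half)
    final_full.append(x)

var_half = sum(v * v for v in final_half) / n_lines
var_full = sum(v * v for v in final_full) / n_lines
print(var_half, var_full)  # diffusive spreading: variance grows linearly in z
```

The ratio var_full/var_half ≈ 2 is the signature of diffusive (Fokker-Planck-like) spreading of connectivity.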
Evolving artificial metalloenzymes via random mutagenesis
NASA Astrophysics Data System (ADS)
Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.
2018-03-01
Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.
Defining an essence of structure determining residue contacts in proteins.
Sathyapriya, R; Duarte, Jose M; Stehr, Henning; Filippis, Ioannis; Lappe, Michael
2009-12-01
The network of native non-covalent residue contacts determines the three-dimensional structure of a protein. However, not all contacts are of equal structural significance, and little knowledge exists about a minimal, yet sufficient, subset required to define the global features of a protein. Characterisation of this "structural essence" has remained elusive so far: no algorithmic strategy has been devised to-date that could outperform a random selection in terms of 3D reconstruction accuracy (measured as the Ca RMSD). It is not only of theoretical interest (i.e., for design of advanced statistical potentials) to identify the number and nature of essential native contacts-such a subset of spatial constraints is very useful in a number of novel experimental methods (like EPR) which rely heavily on constraint-based protein modelling. To derive accurate three-dimensional models from distance constraints, we implemented a reconstruction pipeline using distance geometry. We selected a test-set of 12 protein structures from the four major SCOP fold classes and performed our reconstruction analysis. As a reference set, series of random subsets (ranging from 10% to 90% of native contacts) are generated for each protein, and the reconstruction accuracy is computed for each subset. We have developed a rational strategy, termed "cone-peeling" that combines sequence features and network descriptors to select minimal subsets that outperform the reference sets. We present, for the first time, a rational strategy to derive a structural essence of residue contacts and provide an estimate of the size of this minimal subset. Our algorithm computes sparse subsets capable of determining the tertiary structure at approximately 4.8 Å Ca RMSD with as little as 8% of the native contacts (Ca-Ca and Cb-Cb). At the same time, a randomly chosen subset of native contacts needs about twice as many contacts to reach the same level of accuracy.
This "structural essence" opens new avenues in the fields of structure prediction, empirical potentials and docking.
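The reference-set construction described above (series of random subsets of native contacts at fixed fractions) can be sketched as follows; the contact list, fractions, and series count are toy values, not the paper's data.

```python
import random

def random_contact_subsets(contacts, fractions, n_series=3, seed=0):
    """Reference sets: for each fraction, draw several random subsets of the
    native contact list (sampling without replacement), as a baseline against
    which a selection strategy such as cone-peeling can be compared."""
    rng = random.Random(seed)
    subsets = {}
    for f in fractions:
        k = max(1, round(f * len(contacts)))
        subsets[f] = [rng.sample(contacts, k) for _ in range(n_series)]
    return subsets

# Toy contact list: (residue_i, residue_j) pairs with |i - j| >= 4,
# mimicking non-local contacts -- illustrative only.
native = [(i, j) for i in range(20) for j in range(i + 4, 20)]
ref = random_contact_subsets(native, [0.1, 0.5, 0.9])
print({f: len(s[0]) for f, s in ref.items()})
```

Reconstruction accuracy would then be computed per subset (e.g. via distance geometry) and averaged per fraction to trace the random baseline curve.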
2011-01-01
Background After many years of sanctions and conflict, Iraq is rebuilding its health system, with a strong emphasis on the traditional hospital-based services. A network exists of public sector hospitals and clinics, as well as private clinics and a few private hospitals. Little data are available about the approximately 1400 Primary Health Care clinics (PHCCs) staffed with doctors. How do Iraqis utilize primary health care services? What are their preferences and perceptions of public primary health care clinics and private primary care services in general? How does household wealth affect choice of services? Methods A 1256-household national survey was conducted in the catchment areas of randomly selected PHCCs in Iraq. A cluster of 10 households, beginning with a randomly selected start household, was interviewed in the service areas of seven public sector PHCC facilities in each of 17 of Iraq's 18 governorates. A questionnaire was developed using key informants. Teams of interviewers, including both males and females, were recruited and provided a week of training which included field practice. Teams then gathered data from households in the service areas of randomly selected clinics. Results Iraqi participants are generally satisfied with the quality of primary care services available in both the public and private sectors. Private clinics are generally the most popular source of primary care; however, the PHCCs are utilized more by poorer households. In spite of free services available at PHCCs, many households expressed difficulty in affording health care, especially in the purchase of medications. There is no evidence of informal payments to secure health services in the public sector. Conclusions There is widespread satisfaction reported with primary health care services, and levels did not differ appreciably between public and private sectors. The public sector PHCCs are preferentially used by poorer populations where they are important providers.
PHCC services are indeed free, with little evidence of informal payments to providers. PMID:22176866
NASA Astrophysics Data System (ADS)
Debats, Stephanie Renee
Smallholder farms dominate in many parts of the world, including Sub-Saharan Africa. These systems are characterized by small, heterogeneous, and often indistinct field patterns, requiring a specialized methodology to map agricultural landcover. In this thesis, we developed a benchmark labeled data set of high-resolution satellite imagery of agricultural fields in South Africa. We presented a new approach to mapping agricultural fields, based on efficient extraction of a vast set of simple, highly correlated, and interdependent features, followed by a random forest classifier. The algorithm achieved similar high performance across agricultural types, including spectrally indistinct smallholder fields, and demonstrated the ability to generalize across large geographic areas. In sensitivity analyses, we determined that multi-temporal images provided greater performance gains than the addition of multi-spectral bands. We also demonstrated how active learning can be incorporated in the algorithm to create smaller, more efficient training data sets, which reduced computational resources, minimized the need for humans to hand-label data, and boosted performance. We designed a patch-based uncertainty metric to drive the active learning framework, based on the regular grid of a crowdsourcing platform, and demonstrated how subject matter experts can be replaced with fleets of crowdsourcing workers. Our active learning algorithm achieved similar performance as an algorithm trained with randomly selected data, but with 62% fewer data samples. This thesis furthers the goal of providing accurate agricultural landcover maps, at a scale that is relevant for the dominant smallholder class. Accurate maps are crucial for monitoring and promoting agricultural production.
Furthermore, improved agricultural landcover maps will aid a host of other applications, including landcover change assessments, cadastral surveys to strengthen smallholder land rights, and constraints for crop modeling and famine prediction.
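A patch-level uncertainty metric driving active learning, of the general kind this thesis describes, can be sketched as follows. The entropy-based score and the toy per-pixel probabilities are illustrative, not the thesis' exact metric.

```python
import math

def uncertainty(prob_field):
    """Mean per-pixel binary entropy of predicted field/non-field
    probabilities within one grid patch (higher = more uncertain)."""
    h = 0.0
    for p in prob_field:
        p = min(max(p, 1e-9), 1 - 1e-9)   # clip to avoid log(0)
        h += -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return h / len(prob_field)

def select_patches(patches, k):
    """Pick the k patches the classifier is least certain about,
    to be sent to crowdsourcing workers for labeling."""
    return sorted(patches, key=lambda kv: uncertainty(kv[1]), reverse=True)[:k]

# Toy patches: id -> per-pixel P(field). Illustrative numbers only.
patches = {
    "A": [0.95, 0.97, 0.99, 0.96],   # confidently field
    "B": [0.48, 0.55, 0.51, 0.45],   # ambiguous -- worth labeling
    "C": [0.02, 0.05, 0.01, 0.03],   # confidently non-field
    "D": [0.70, 0.35, 0.60, 0.40],   # mixed
}
chosen = [pid for pid, _ in select_patches(list(patches.items()), 2)]
print(chosen)
```

Iterating this loop (train, score, label the most uncertain patches, retrain) is what lets an active learner match a randomly trained one with far fewer labels.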
Analytical applications of aptamers
NASA Astrophysics Data System (ADS)
Tombelli, S.; Minunni, M.; Mascini, M.
2007-05-01
Aptamers are single-stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed as alternatives to antibodies as biorecognition elements in analytical devices with ever-increasing frequency, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.
NASA Astrophysics Data System (ADS)
Kim, Pilnam; Kang, Tae June
2017-12-01
We present a simple and scalable fluidic-assembly approach, in which bundles of single-walled carbon nanotubes (SWCNTs) are selectively aligned and deposited by directionally controlled dip-coating and solvent-evaporation processes. The patterned surface, with alternating regions of hydrophobic polydimethylsiloxane (PDMS) strips (height ~100 nm) and hydrophilic SiO2 substrate, was withdrawn vertically at a constant speed (~3 mm/min) from a solution bath containing SWCNTs (~0.1 mg/ml), allowing for directional evaporation and subsequent selective deposition of nanotube bundles along the edges of horizontally aligned PDMS strips. In addition, the fluidic assembly was applied to fabricate a field-effect transistor (FET) with highly oriented SWCNTs, which demonstrates significantly higher current density as well as a high turn-off ratio (T/O ratio ~100) compared to that with randomly distributed carbon nanotube bundles (T/O ratio <10).
Evolutionary engineering for industrial microbiology.
Vanee, Niti; Fisher, Adam B; Fong, Stephen S
2012-01-01
Superficially, evolutionary engineering is a paradoxical field that balances competing interests. In natural settings, evolution iteratively selects and enriches subpopulations that are best adapted to a particular ecological niche using random processes such as genetic mutation. In engineering, desired approaches utilize rational prospective design to address targeted problems. When considering the details of evolutionary and engineering processes, however, more commonality can be found. Engineering relies on detailed knowledge of the problem parameters and design properties in order to predict design outcomes that would be an optimized solution. When detailed knowledge of a system is lacking, engineers often employ algorithmic search strategies to identify empirical solutions. Evolution epitomizes this iterative optimization by continuously diversifying design options from a parental design, and then selecting the progeny designs that represent satisfactory solutions. In this chapter, the technique of applying the natural principles of evolution to engineer microbes for industrial applications is discussed to highlight the challenges and principles of evolutionary engineering.
Shalev, Nir; De Wandel, Linde; Dockree, Paul; Demeyere, Nele; Chechlacz, Magdalena
2017-10-03
The Theory of Visual Attention (TVA) provides a mathematical formalisation of the "biased competition" account of visual attention. Applying this model to individual performance in a free recall task allows the estimation of five independent attentional parameters: visual short-term memory (VSTM) capacity, speed of information processing, perceptual threshold of visual detection, attentional weights representing the spatial distribution of attention (spatial bias), and the top-down selectivity index. While the TVA focuses on selection in space, complementary accounts of attention describe how attention is maintained over time, and how temporal processes interact with selection. A growing body of evidence indicates that different facets of attention interact and share common neural substrates. The aim of the current study was to modulate a spatial attentional bias via transfer effects, based on a mechanistic understanding of the interplay between spatial, selective and temporal aspects of attention. Specifically, we examined here: (i) whether a single administration of a lateralized sustained attention task could prime spatial orienting and lead to transferable changes in attentional weights (assigned to the left vs right hemi-field) and/or other attentional parameters assessed within the framework of TVA (Experiment 1); (ii) whether the effects of such spatial-priming on TVA parameters could be further enhanced by bi-parietal high-frequency transcranial random noise stimulation (tRNS) (Experiment 2). Our results demonstrate that spatial attentional bias, as assessed within the TVA framework, was primed by sustaining attention towards the right hemi-field, but this spatial-priming effect did not occur when sustaining attention towards the left. Furthermore, we show that bi-parietal high-frequency tRNS combined with the rightward spatial-priming resulted in increased attentional selectivity. 
To conclude, we present a novel, theory-driven method for attentional modulation providing important insights into how the spatial and temporal processes in attention interact with attentional selection. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reference Conditions for Streams in the Grand Prairie Natural Division of Illinois
NASA Astrophysics Data System (ADS)
Sangunett, B.; Dewalt, R.
2005-05-01
As part of the Critical Trends Assessment Program (CTAP) of the Illinois Department of Natural Resources (IDNR), 12 potential reference quality stream sites in the Grand Prairie Natural Division were evaluated in May 2004. This agriculturally dominated region, located in east central Illinois, is the most highly modified in the state. The quality of these sites was assessed using a modified Hilsenhoff Biotic Index and species richness of Ephemeroptera, Plecoptera, and Trichoptera (EPT) insect orders and a 12 parameter Habitat Quality Index (HQI). Illinois EPA high quality fish stations, Illinois Natural History Survey insect collection data, and best professional knowledge were used to choose which streams to evaluate. For analysis, reference quality streams were compared to 37 randomly selected meandering streams and 26 randomly selected channelized streams which were assessed by CTAP between 1997 and 2001. The results showed that the reference streams exceeded both taxa richness and habitat quality of randomly selected streams in the region. Both random meandering sites and reference quality sites increased in taxa richness and HQI as stream width increased. Randomly selected channelized streams had about the same taxa richness and HQI regardless of width.
Key Aspects of Nucleic Acid Library Design for in Vitro Selection
Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.
2018-01-01
Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748
Methods and analysis of realizing randomized grouping.
Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi
2011-07-01
Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
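The two aspects the abstract distinguishes can be sketched in a few lines. This is a minimal, illustrative stdlib-Python sketch (the paper itself uses SAS, and these function names are ours, not from the paper): random sampling draws subjects from a population, and complete randomization shuffles the sample and deals it into groups.

```python
import random

def random_sample(population, n, seed=None):
    """Randomly select n subjects from the population (random sampling)."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def complete_randomization(subjects, n_groups, seed=None):
    """Assign every subject to one of n_groups by shuffling and dealing
    them out in turn (complete randomization; stratified or dynamic
    schemes would add further structure on top of this)."""
    rng = random.Random(seed)
    shuffled = list(subjects)
    rng.shuffle(shuffled)
    return [shuffled[i::n_groups] for i in range(n_groups)]

population = [f"subject_{i:03d}" for i in range(100)]
sample = random_sample(population, 20, seed=42)       # random sampling
groups = complete_randomization(sample, 2, seed=42)   # randomized grouping
```

Fixing a seed, as above, makes the allocation reproducible for audit; in a real trial the seed would be concealed from recruiters.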
Bernardy, Nancy C; Friedman, Matthew J
2015-04-01
There have been significant advancements in the pharmacologic management of posttraumatic stress disorder (PTSD) in the past two decades. Multisite randomized clinical trials (RCTs) have noted the efficacy of selective serotonin reuptake inhibitors (SSRIs) and serotonin-norepinephrine reuptake inhibitors (SNRIs) for PTSD treatment. Unfortunately, there have been no new medications approved to treat PTSD in the past 10 years. Although there have been exciting new findings in our knowledge of the neurobiology of PTSD, clinical trials testing new medications have lagged. This review summarizes recent research that builds on the unique pathophysiology of PTSD and suggests ways to move the field forward.
ReaxFF Study of the Oxidation of Softwood Lignin in View of Carbon Fiber Production
Beste, Ariana
2014-10-06
We investigate the oxidative, thermal conversion of softwood lignin by performing molecular dynamics simulations based on a reactive force field (ReaxFF). The lignin samples are constructed from coniferyl alcohol units, which are connected through linkages that are randomly selected from a natural distribution of linkages in softwood. The goal of this work is to simulate the oxidative stabilization step during carbon fiber production from lignin precursor. We find that at simulation conditions where stabilization reactions occur, the lignin fragments have already undergone extensive degradation. The 5-5 linkage shows the highest reactivity towards cyclization and dehydrogenation.
Detecting duplicate biological entities using Shortest Path Edit Distance.
Rudniy, Alex; Song, Min; Geller, James
2010-01-01
Duplicate entity detection in biological data is an important research task. In this paper, we propose a novel and context-sensitive Shortest Path Edit Distance (SPED) extending and supplementing our previous work on Markov Random Field-based Edit Distance (MRFED). SPED transforms the edit distance computational problem to the calculation of the shortest path among two selected vertices of a graph. We produce several modifications of SPED by applying Levenshtein, arithmetic mean, histogram difference and TFIDF techniques to solve subtasks. We compare SPED performance to other well-known distance algorithms for biological entity matching. The experimental results show that SPED produces competitive outcomes.
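SPED itself is context-sensitive and specific to the paper, but the underlying idea of recasting edit distance as a shortest-path computation between two selected vertices of a graph can be illustrated with plain Levenshtein costs. In this hypothetical sketch (not the authors' algorithm), Dijkstra's algorithm is run over the alignment grid, where vertex (i, j) means "the first i characters of a aligned with the first j characters of b":

```python
import heapq

def edit_distance_shortest_path(a, b):
    """Levenshtein edit distance computed as the shortest path from
    (0, 0) to (len(a), len(b)) in the edit graph."""
    start, goal = (0, 0), (len(a), len(b))
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (i, j) = heapq.heappop(heap)
        if (i, j) == goal:
            return d
        if d > dist.get((i, j), float("inf")):
            continue  # stale heap entry
        moves = []
        if i < len(a):
            moves.append(((i + 1, j), 1))           # delete a[i]
        if j < len(b):
            moves.append(((i, j + 1), 1))           # insert b[j]
        if i < len(a) and j < len(b):
            cost = 0 if a[i] == b[j] else 1         # match or substitute
            moves.append(((i + 1, j + 1), cost))
        for v, w in moves:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist[goal]
```

The graph view is what makes variants like SPED possible: edge weights can be made context-sensitive without changing the shortest-path machinery.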
Lunar Infrared Spectrometer (LIS) for Luna-Resurs and Luna-Glob missions
NASA Astrophysics Data System (ADS)
Korablev, O.; Ivanov, A.; Mantsevich, S.; Kiselev, A.; Vyazovetskiy, N.; Fedorova, A.; Evdokimova, N.; Stepanov, A.; Titov, A.; Kalinnikov, Y.
2012-09-01
Lunar Infrared Spectrometer (LIS) is an experiment onboard the Luna-Glob (launch in 2015) and Luna-Resurs (launch in 2017) Russian surface missions. The experiment is dedicated to studies of the mineralogy of the lunar regolith in the vicinity of the lander. The instrument is mounted on the mechanical arm of the landing module, within the field of view (45°) of the stereo TV camera. LIS will provide measurements of a selected surface region in the spectral range of 1.15-3.3 μm. The electrically commanded acousto-optic filter scans sequentially at a desired sampling, with random access, over the entire spectral range.
The space transformation in the simulation of multidimensional random fields
Christakos, G.
1987-01-01
Space transformations are proposed as a mathematically meaningful and practically comprehensive approach to simulate multidimensional random fields. Within this context the turning bands method of simulation is reconsidered and improved in both the space and frequency domains.
Small-World Network Spectra in Mean-Field Theory
NASA Astrophysics Data System (ADS)
Grabow, Carsten; Grosskinsky, Stefan; Timme, Marc
2012-05-01
Collective dynamics on small-world networks emerge in a broad range of systems with their spectra characterizing fundamental asymptotic features. Here we derive analytic mean-field predictions for the spectra of small-world models that systematically interpolate between regular and random topologies by varying their randomness. These theoretical predictions agree well with the actual spectra (obtained by numerical diagonalization) for undirected and directed networks and from fully regular to strongly random topologies. These results may provide analytical insights to empirically found features of dynamics on small-world networks from various research fields, including biology, physics, engineering, and social science.
Random field assessment of nanoscopic inhomogeneity of bone
Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu
2010-01-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of the elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail. PMID:20817128
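As a toy illustration of the general approach (a 1-D simplification in pure Python, not the authors' code), a Gaussian random field with exponential covariance C(d) = σ² exp(−|d|/L) can be simulated by building the covariance matrix, factoring it with a textbook Cholesky decomposition, and coloring independent standard-normal draws:

```python
import math
import random

def exponential_covariance(points, correlation_length, variance=1.0):
    """Covariance matrix C[i][j] = variance * exp(-|x_i - x_j| / L)."""
    return [[variance * math.exp(-abs(xi - xj) / correlation_length)
             for xj in points] for xi in points]

def cholesky(C):
    """Lower-triangular L with L @ L.T == C (plain textbook algorithm)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def simulate_field(points, correlation_length, seed=None):
    """Draw one realization of a 1-D Gaussian random field at `points`."""
    rng = random.Random(seed)
    L = cholesky(exponential_covariance(points, correlation_length))
    z = [rng.gauss(0.0, 1.0) for _ in points]
    # field = L z, so that Cov(field) equals the target covariance matrix
    return [sum(L[i][k] * z[k] for k in range(i + 1))
            for i in range(len(points))]

points = [0.1 * i for i in range(50)]
field = simulate_field(points, correlation_length=0.5, seed=1)
```

A shorter correlation length yields a rougher realization, which is exactly the fluctuation behaviour the correlation-length parameter quantifies in the study.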
Directed self-assembly into low-density colloidal liquid crystal phases
NASA Astrophysics Data System (ADS)
Gao, Yongxiang; Romano, Flavio; Dullens, Roel P. A.; Doye, Jonathan K.; Aarts, Dirk G. A. L.
2018-01-01
Alignment of anisometric particles into liquid crystals (LCs) often results from an entropic competition between their rotational and translational degrees of freedom at dense packings. Here we show that by selectively functionalizing the heads of colloidal rods with magnetic nanoparticles this tendency can be broken to direct the particles into novel, low-density LC phases. Under an external magnetic field, the magnetic heads line up in columns whereas the nonmagnetic tails point out randomly in a plane perpendicular to the columns, forming bottle-brush-like objects; laterally, the bottle brushes are entropically stabilized against coalescence. Experiments and simulations show that upon increasing the particle density the system goes from a dilute gas to a dense two-dimensional liquid of bottle brushes with a density well below the zero-field nematic phase. Our findings offer a strategy for self-assembly into three-dimensional open phases that may find applications in switchable photonics, filtration, and light-weight materials.
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Richardson, W.; Pentland, A. P.
1976-01-01
The author has identified the following significant results. Fourteen different classification algorithms were tested for their ability to estimate the proportion of wheat in an area. For some algorithms, accuracy of classification in field centers was observed. The data base consisted of ground truth and LANDSAT data from 55 sections (1 x 1 mile) from five LACIE intensive test sites in Kansas and Texas. Signatures obtained from training fields selected at random from the ground truth were generally representative of the data distribution patterns. LIMMIX, an algorithm that chooses a pure signature when the data point is close enough to a signature mean and otherwise chooses the best mixture of a pair of signatures, reduced the average absolute error to 6.1% and the bias to 1.0%. QRULE run with a null test achieved a similar reduction.
NASA Astrophysics Data System (ADS)
Staroń, Waldemar; Herbowski, Leszek; Gurgul, Henryk
2007-04-01
The goal of the work was to determine the values of cumulative parameters characterising the cerebrospinal fluid, obtained by puncture from patients diagnosed for suspected normotensive hydrocephalus. The cerebrospinal fluid taken by puncture for routine examinations of these patients was analysed; the paper presents results for several dozen puncture samples from various patients. Each sample was examined under the microscope and photographed in 20 randomly chosen places. On the basis of analysis of the pictures, each showing an area of 100 × 100 μm, selected cumulative parameters such as count, numerical density, field area and field perimeter were determined for each sample, and their average values were then computed.
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...
Odegård, J; Klemetsdal, G; Heringstad, B
2005-04-01
Several selection criteria for reducing incidence of mastitis were developed from a random regression sire model for test-day somatic cell score (SCS). For comparison, sire transmitting abilities were also predicted based on a cross-sectional model for lactation mean SCS. Only first-crop daughters were used in genetic evaluation of SCS, and the different selection criteria were compared based on their correlation with incidence of clinical mastitis in second-crop daughters (measured as mean daughter deviations). Selection criteria were predicted based on both complete and reduced first-crop daughter groups (261 or 65 daughters per sire, respectively). For complete daughter groups, predicted transmitting abilities at around 30 d in milk showed the best predictive ability for incidence of clinical mastitis, closely followed by average predicted transmitting abilities over the entire lactation. Both of these criteria were derived from the random regression model. These selection criteria improved accuracy of selection by approximately 2% relative to a cross-sectional model. However, for reduced daughter groups, the cross-sectional model yielded increased predictive ability compared with the selection criteria based on the random regression model. This result may be explained by the cross-sectional model being more robust, i.e., less sensitive to precision of (co)variance components estimates and effects of data structure.
ERIC Educational Resources Information Center
Kariuki, Patrick N. K.; Bush, Elizabeth Danielle
2008-01-01
The purpose of this study was to examine the effects of Total Physical Response by Storytelling and the traditional teaching method on a foreign language in a selected high school. The sample consisted of 30 students who were randomly selected and randomly assigned to experimental and control groups. The experimental group was taught using Total…
Hindersin, Laura; Traulsen, Arne
2015-11-01
We analyze evolutionary dynamics on graphs, where the nodes represent individuals of a population. The links of a node describe which other individuals can be displaced by the offspring of the individual on that node. Amplifiers of selection are graphs for which the fixation probability is increased for advantageous mutants and decreased for disadvantageous mutants. A few examples of such amplifiers have been developed, but so far it is unclear how many such structures exist and how to construct them. Here, we show that almost any undirected random graph is an amplifier of selection for Birth-death updating, where an individual is selected to reproduce with probability proportional to its fitness and one of its neighbors is replaced by that offspring at random. If we instead focus on death-Birth updating, in which a random individual is removed and its neighbors compete for the empty spot, then the same ensemble of graphs consists of almost only suppressors of selection for which the fixation probability is decreased for advantageous mutants and increased for disadvantageous mutants. Thus, the impact of population structure on evolutionary dynamics is a subtle issue that will depend on seemingly minor details of the underlying evolutionary process.
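The Birth-death updating rule described above lends itself to a simple Monte Carlo estimate of the fixation probability. The following is an illustrative stdlib-Python toy (the graph, trial count and function names are our assumptions, not the authors' code), shown on a complete graph, where the classic Moran-process fixation probability (1 − 1/r)/(1 − 1/rᴺ) applies:

```python
import random

def fixation_probability(adjacency, fitness, trials=2000, seed=0):
    """Estimate the fixation probability of a single mutant with relative
    fitness `fitness` under Birth-death updating: a node reproduces with
    probability proportional to its fitness, and its offspring replaces a
    uniformly random neighbor."""
    rng = random.Random(seed)
    n = len(adjacency)
    fixed = 0
    for _ in range(trials):
        mutants = {rng.randrange(n)}                 # one random initial mutant
        while 0 < len(mutants) < n:                  # run to fixation/extinction
            weights = [fitness if v in mutants else 1.0 for v in range(n)]
            parent = rng.choices(range(n), weights=weights)[0]
            child = rng.choice(adjacency[parent])    # neighbor that is replaced
            if parent in mutants:
                mutants.add(child)
            else:
                mutants.discard(child)
        fixed += len(mutants) == n
    return fixed / trials

# Complete graph on 4 nodes as a baseline (isothermal) structure
K4 = [[j for j in range(4) if j != i] for i in range(4)]
p = fixation_probability(K4, fitness=2.0)
```

For N = 4 and r = 2 the Moran formula gives about 0.533; an amplifier of selection would push the estimate above this baseline for r > 1, a suppressor below it.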
A stochastic-geometric model of soil variation in Pleistocene patterned ground
NASA Astrophysics Data System (ADS)
Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc
2013-04-01
In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. 
We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.
The role of color and attention-to-color in mirror-symmetry perception.
Gheorghiu, Elena; Kingdom, Frederick A A; Remkes, Aaron; Li, Hyung-Chul O; Rainville, Stéphane
2016-07-11
The role of color in the visual perception of mirror-symmetry is controversial. Some reports support the existence of color-selective mirror-symmetry channels, others that mirror-symmetry perception is merely sensitive to color-correlations across the symmetry axis. Here we test between the two ideas. Stimuli consisted of colored Gaussian-blobs arranged either mirror-symmetrically or quasi-randomly. We used four arrangements: (1) 'segregated' - symmetric blobs were of one color, random blobs of the other color(s); (2) 'random-segregated' - as above but with the symmetric color randomly selected on each trial; (3) 'non-segregated' - symmetric blobs were of all colors in equal proportions, as were the random blobs; (4) 'anti-symmetric' - symmetric blobs were of opposite-color across the symmetry axis. We found: (a) near-chance levels for the anti-symmetric condition, suggesting that symmetry perception is sensitive to color-correlations across the symmetry axis; (b) similar performance for random-segregated and non-segregated conditions, giving no support to the idea that mirror-symmetry is color selective; (c) highest performance for the color-segregated condition, but only when the observer knew beforehand the symmetry color, suggesting that symmetry detection benefits from color-based attention. We conclude that mirror-symmetry detection mechanisms, while sensitive to color-correlations across the symmetry axis and subject to the benefits of attention-to-color, are not color selective.
Ensemble Feature Learning of Genomic Data Using Support Vector Machine
Anaissi, Ali; Goyal, Madhu; Catchpoole, Daniel R.; Braytee, Ali; Kennedy, Paul J.
2016-01-01
The identification of a subset of genes having the ability to capture the necessary information to distinguish classes of patients is crucial in bioinformatics applications. Ensemble and bagging methods have been shown to work effectively in the process of gene selection and classification. Testament to that is random forest, which combines random decision trees with bagging to improve overall feature selection and classification accuracy. Surprisingly, the adoption of these methods in support vector machines has only recently received attention, and mostly for classification rather than gene selection. This paper introduces an ensemble SVM-Recursive Feature Elimination (ESVM-RFE) method for gene selection that follows the concepts of ensemble and bagging used in random forest but adopts the backward elimination strategy which is the rationale of the RFE algorithm. The rationale behind this is that building ensemble SVM models using randomly drawn bootstrap samples from the training set will produce different feature rankings, which are subsequently aggregated into one feature ranking. As a result, the decision to eliminate features is based upon the rankings of multiple SVM models instead of one particular model. Moreover, this approach addresses the problem of imbalanced datasets by constructing a nearly balanced bootstrap sample. Our experiments show that ESVM-RFE for gene selection substantially increased the classification performance on five microarray datasets compared to state-of-the-art methods. Experiments on the childhood leukaemia dataset show that an average 9% better accuracy is achieved by ESVM-RFE over SVM-RFE, and 5% over a random forest based approach. The genes selected by the ESVM-RFE algorithm were further explored with Singular Value Decomposition (SVD), which reveals significant clusters within the selected data. PMID:27304923
Thermal ecology of the Australian agamid Pogona barbata.
Schäuble, Chloe S; Grigg, Gordon C
1998-05-01
This study compares the thermal ecology of male bearded dragon lizards (Pogona barbata) from south-east Queensland across two seasons: summer (1994-1995) and autumn (1995). Seasonal patterns of body temperature (Tb) were explored in terms of changes in the physical properties of the thermal environment and thermoregulatory effort. To quantify thermoregulatory effort, we compared behavioral and physiological variables recorded for observed lizards with those estimated for a thermoconforming lizard. The study lizards' field Tbs varied seasonally (summer: grand daily mean (GDM) 34.6 ± 0.6°C, autumn: GDM 27.5 ± 0.3°C), as did maximum and minimum available operative temperatures (summer: GDM Tmax 42.1 ± 1.7°C, Tmin 32.2 ± 1.0°C; autumn: GDM Tmax 31.7 ± 1.2°C, Tmin 26.4 ± 0.5°C). Interestingly, the range of temperatures that lizards selected in a gradient (selected range) did not change seasonally. However, P. barbata thermoregulated more extensively and more accurately in summer than in autumn; lizards generally displayed behaviors affecting heat load nonrandomly in summer and randomly in autumn, leading to the GDM of the mean deviations of lizards' field Tbs from their selected ranges being only 2.1 ± 0.5°C in summer, compared to 4.4 ± 0.5°C in autumn. This seasonal difference was not a consequence of different heat availability in the two seasons, because the seasonally available ranges of operative temperatures rarely precluded lizards from attaining field Tbs within their selected range, should that have been the goal. Rather, thermal microhabitat distribution and social behavior appear to have had an important influence on seasonal levels of thermoregulatory effort.
A Probabilistic Cell Tracking Algorithm
NASA Astrophysics Data System (ADS)
Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah
2013-04-01
The research described below was carried out during the EU project LoLight - development of a low-cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step the correlation coefficients between the first and the second density field are computed, with the first field shifted by every physically allowed shifting vector. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the difference in time between the two density fields. To eliminate ambiguities in the determination of directions and velocities, the so-called Random Walker of the Monte Carlo process is used. Using this method a grid point is selected at random, and one vector out of all predefined shifting vectors is suggested, also at random but with a probability related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted; otherwise the suggestion is discarded. This process is repeated until the change of the cost function falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. 
Knowing the direction and speed of thunderstorm cells is important for nowcasting. Therefore, the presented method is based on IC discharges which account for most lightning discharges and occur minutes before the first CG discharge. The cell tracking algorithm will be used as part of the integrated LoLight system. The research leading to these results has received funding from the European Union's Seventh Framework Programme managed by REA-Research Executive Agency http://ec.europa.eu/research/rea ([FP7/2007-2013] [FP7/2007-2011]) under grant agreement n° [262200].
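The first, correlation step of the tracking scheme described above can be sketched in a few lines (a minimal Python sketch on toy data; the grid size, shift limit, and noise level are illustrative, and the Monte Carlo random-walker refinement is omitted):

```python
import numpy as np

def shift_correlations(f1, f2, max_shift):
    """Correlation coefficient between the second density field and the
    first field shifted by every physically allowed vector."""
    corrs = {}
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(f1, (dy, dx), axis=(0, 1))
            corrs[(dy, dx)] = np.corrcoef(shifted.ravel(), f2.ravel())[0, 1]
    return corrs

# toy fields: the second field is the first one displaced by (2, 3) plus noise
rng = np.random.default_rng(0)
f1 = rng.random((32, 32))
f2 = np.roll(f1, (2, 3), axis=(0, 1)) + 0.05 * rng.random((32, 32))
corrs = shift_correlations(f1, f2, max_shift=5)
best_shift = max(corrs, key=corrs.get)   # displacement with maximal correlation
```

In the full algorithm these correlation coefficients weight the random proposals of per-gridpoint shift vectors, which are then accepted or rejected against the cost function.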
Randomization Methods in Emergency Setting Trials: A Descriptive Review
ERIC Educational Resources Information Center
Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William
2016-01-01
Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…
Middle Level Practices in European International and Department of Defense Schools.
ERIC Educational Resources Information Center
Waggoner, V. Christine; McEwin, C. Kenneth
1993-01-01
Discusses results of a 1989-90 survey of 70 randomly selected international schools and 70 randomly selected Department of Defense Schools in Europe. Programs and practices surveyed included enrollments, grade organization, curriculum and instructional plans, core subjects, grouping patterns, exploratory courses, advisory programs, and scheduling.…
Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...
Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex.
Lindsay, Grace W; Rigotti, Mattia; Warden, Melissa R; Miller, Earl K; Fusi, Stefano
2017-11-08
Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear "mixed" selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. 
How neurons in this area respond to stimuli-and in particular, to combinations of stimuli ("mixed selectivity")-is a topic of interest. Even though models with random feedforward connectivity are capable of creating computationally relevant mixed selectivity, such a model does not match the levels of mixed selectivity seen in the data analyzed in this study. Adding simple Hebbian learning to the model increases mixed selectivity to the correct level and makes the model match the data on several other relevant measures. This study thus offers predictions on how mixed selectivity and other properties evolve with training. Copyright © 2017 the authors 0270-6474/17/3711021-16$15.00/0.
Spatial-Temporal Data Collection with Compressive Sensing in Mobile Sensor Networks
Li, Jiayin; Guo, Wenzhong; Chen, Zhonghui; Xiong, Neal
2017-01-01
Compressive sensing (CS) provides an energy-efficient paradigm for data gathering in wireless sensor networks (WSNs). However, the existing work on spatial-temporal data gathering using compressive sensing only considers either multi-hop relaying based or multiple random walks based approaches. In this paper, we exploit the mobility pattern for spatial-temporal data collection and propose a novel mobile data gathering scheme by employing the Metropolis-Hastings algorithm with delayed acceptance, an improved random walk algorithm for a mobile collector to collect data from a sensing field. The proposed scheme exploits Kronecker compressive sensing (KCS) for spatial-temporal correlation of sensory data by allowing the mobile collector to gather temporal compressive measurements from a small subset of randomly selected nodes along a random routing path. More importantly, from the theoretical perspective we prove that the equivalent sensing matrix constructed from the proposed scheme for spatial-temporal compressible signal can satisfy the property of KCS models. The simulation results demonstrate that the proposed scheme can not only significantly reduce communication cost but also improve recovery accuracy for mobile data gathering compared to the other existing schemes. In particular, we also show that the proposed scheme is robust in unreliable wireless environment under various packet losses. All this indicates that the proposed scheme can be an efficient alternative for data gathering application in WSNs. PMID:29117152
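The Metropolis-Hastings random walk at the heart of the collector's route can be illustrated on a toy grid of sensor nodes (a sketch under the common choice of a uniform target distribution over nodes; the delayed-acceptance refinement and the KCS measurement model are omitted, and all names here are illustrative):

```python
import random

def mh_walk(neighbors, start, steps, seed=1):
    """Metropolis-Hastings random walk over a graph: propose a uniform
    neighbor and accept with min(1, deg(current)/deg(proposed)), which
    makes the stationary distribution uniform over nodes."""
    rng = random.Random(seed)
    cur, path = start, [start]
    for _ in range(steps):
        prop = rng.choice(neighbors[cur])
        if rng.random() < min(1.0, len(neighbors[cur]) / len(neighbors[prop])):
            cur = prop          # accept the move
        path.append(cur)        # a rejected move repeats the current node
    return path

# 4 x 4 grid of sensor nodes
n = 4
neighbors = {(x, y): [(x + dx, y + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= x + dx < n and 0 <= y + dy < n]
             for x in range(n) for y in range(n)}
path = mh_walk(neighbors, (0, 0), steps=200)
```

The degree correction matters for data gathering: a plain random walk oversamples well-connected nodes, whereas the corrected walk visits all nodes with equal long-run frequency.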
Spatial-Temporal Data Collection with Compressive Sensing in Mobile Sensor Networks.
Zheng, Haifeng; Li, Jiayin; Feng, Xinxin; Guo, Wenzhong; Chen, Zhonghui; Xiong, Neal
2017-11-08
Compressive sensing (CS) provides an energy-efficient paradigm for data gathering in wireless sensor networks (WSNs). However, the existing work on spatial-temporal data gathering using compressive sensing only considers either multi-hop relaying based or multiple random walks based approaches. In this paper, we exploit the mobility pattern for spatial-temporal data collection and propose a novel mobile data gathering scheme by employing the Metropolis-Hastings algorithm with delayed acceptance, an improved random walk algorithm for a mobile collector to collect data from a sensing field. The proposed scheme exploits Kronecker compressive sensing (KCS) for spatial-temporal correlation of sensory data by allowing the mobile collector to gather temporal compressive measurements from a small subset of randomly selected nodes along a random routing path. More importantly, from the theoretical perspective we prove that the equivalent sensing matrix constructed from the proposed scheme for spatial-temporal compressible signal can satisfy the property of KCS models. The simulation results demonstrate that the proposed scheme can not only significantly reduce communication cost but also improve recovery accuracy for mobile data gathering compared to the other existing schemes. In particular, we also show that the proposed scheme is robust in unreliable wireless environment under various packet losses. All this indicates that the proposed scheme can be an efficient alternative for data gathering application in WSNs.
Bashir, Muhammad Mustehsan; Qayyum, Rehan; Saleem, Muhammad Hammad; Siddique, Kashif; Khan, Farid Ahmad
2015-08-01
To determine the optimal time interval between tumescent local anesthesia infiltration and the start of hand surgery without a tourniquet for improved operative field visibility. Patients aged 16 to 60 years who needed contracture release and tendon repair in the hand were enrolled from the outpatient clinic. Patients were randomized to 10-, 15-, or 25-minute intervals between tumescent anesthetic solution infiltration (0.18% lidocaine and 1:221,000 epinephrine) and the start of surgery. The end point of tumescent anesthetic infiltration was pale and firm skin. The surgical team was blinded to the time of anesthetic infiltration. At the completion of the procedure, the surgeon and the first assistant rated the operative field visibility as excellent, fair, or poor. We used logistic regression models without and with adjustment for confounding variables. Of the 75 patients enrolled in the study, 59 (79%) were males, 7 were randomized to the 10-minute time interval (further randomization was stopped after interim analysis found consistently poor operative field visibility), and 34 were randomized to each of the 15- and 25-minute groups. Patients who were randomized to the 25-minute delay group had 29 times higher odds of having an excellent operative visual field than those randomized to the 15-minute delay group. After adjusting for age, sex, amount of tumescent solution infiltration, and duration of operation, the odds ratio remained highly significant. We found that an interval of 25 minutes provides vastly superior operative field visibility; the 10-minute delay had the poorest results. Therapeutic I. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
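An odds ratio of the kind reported here can be reproduced mechanically from a 2 x 2 table (a sketch with hypothetical counts, not the trial's actual data, using the standard log-odds-ratio normal approximation for the 95% confidence interval):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Wald 95% CI for the 2 x 2 table
              excellent  not excellent
    group A       a            b
    group B       c            d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical counts for a 25- vs 15-minute comparison
or_, lo, hi = odds_ratio(28, 6, 5, 29)
```

A confidence interval that excludes 1 corresponds to a statistically significant association at the 5% level.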
Cadogan, Beresford L; Scharbach, Roger D
2003-04-01
A field trial using true replicates was conducted successfully in a boreal forest in 1996 to evaluate the efficacy of two aerially applied Bacillus thuringiensis formulations, ABG 6429 and ABG 6430. A complete randomized design with four replicates per treatment was chosen. Twelve to 15 balsam fir (Abies balsamea [L.] Mill.) per plot were randomly selected as sample trees. Interplot buffer zones, ≥ 200 m wide, adequately prevented cross contamination from sprays that were atomized with four rotary atomizers (volume median diameters ranging from 64.6 to 139.4 µm) and released approximately 30 m above the ground. The B. thuringiensis formulations were not significantly different (P > 0.05) from each other in reducing spruce budworm (Choristoneura fumiferana [Clem.]) populations and protecting balsam fir trees from defoliation, but both formulations were significantly more efficacious than the controls. The results suggest that true replicates are a feasible alternative to pseudoreplication in experimental forest aerial applications.
Establishing the need for nutrition education: I. Methodology.
Vaden, A G; Newell, G K; Dayton, A D; Foley, C S
1983-10-01
Developmental and data collection phases for a comprehensive needs assessment project designed to provide baseline data for planning a statewide nutrition education and training project are summarized. To meet project objectives, 97 Kansas elementary schools were selected randomly as sampling units. A mail questionnaire was used to assess nutrition knowledge and attitudes and dietary and nutrition education practices of elementary teachers and food service personnel. Data from fifth grade students were collected on-site at each school. A written test was used to measure students' nutrition knowledge, attitudes, and practices. Students' nutritional status was partially assessed by measuring their height, weight, skinfold thickness, and upper arm circumference. As an additional assessment, 24-hour dietary recall interviews were conducted with a random sample of the students. In this article, each aspect of the data collection methodology is described in detail. As emphasized by authorities in the field, more complete information is needed in many nutrition survey reports to assist in useful interpretations and for comparisons among studies.
Sustained State-Independent Quantum Contextual Correlations from a Single Ion
NASA Astrophysics Data System (ADS)
Leupold, F. M.; Malinowski, M.; Zhang, C.; Negnevitsky, V.; Alonso, J.; Home, J. P.; Cabello, A.
2018-05-01
We use a single trapped-ion qutrit to demonstrate the quantum-state-independent violation of noncontextuality inequalities using a sequence of randomly chosen quantum nondemolition projective measurements. We concatenate 53 × 10⁶ sequential measurements of 13 observables, and unambiguously violate an optimal noncontextual bound. We use the same data set to characterize imperfections including signaling and repeatability of the measurements. The experimental sequence was generated in real time with a quantum random number generator integrated into our control system to select the subsequent observable with a latency below 50 μs, which can be used to constrain contextual hidden-variable models that might describe our results. The state-recycling experimental procedure is resilient to noise and independent of the qutrit state, substantiating the fact that the contextual nature of quantum physics is connected to measurements and not necessarily to designated states. The use of extended sequences of quantum nondemolition measurements finds applications in the fields of sensing and quantum information.
Environmental Health Practice: Statistically Based Performance Measurement
Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.
2007-01-01
Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
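The baseline-versus-postintervention comparison above rests on Fisher's exact test, which can be computed directly from the hypergeometric distribution (a self-contained one-sided sketch; the counts are illustrative and are not the study's data):

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided Fisher exact P value: the probability, given fixed
    margins, of a 2 x 2 table at least as extreme as [[a, b], [c, d]]
    in the 'greater' direction, from the hypergeometric distribution."""
    row1, col1, n = a + b, a + c, a + b + c + d
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a, min(row1, col1) + 1)) / denom

# illustrative compliance counts: 8/10 facilities compliant post vs 1/6 at baseline
p = fisher_exact_greater(8, 2, 1, 5)
```

For small counts this exact test is preferred over the chi-squared approximation, which is why the study reports it.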
Modeling and statistical analysis of non-Gaussian random fields with heavy-tailed distributions.
Nezhadhaghighi, Mohsen Ghasemi; Nakhlband, Abbas
2017-04-01
In this paper, we investigate and develop an alternative approach to the numerical analysis and characterization of random fluctuations with the heavy-tailed probability distribution function (PDF), such as turbulent heat flow and solar flare fluctuations. We identify the heavy-tailed random fluctuations based on the scaling properties of the tail exponent of the PDF, power-law growth of qth order correlation function, and the self-similar properties of the contour lines in two-dimensional random fields. Moreover, this work leads to a substitution for the fractional Edwards-Wilkinson (EW) equation that works in the presence of μ-stable Lévy noise. Our proposed model explains the configuration dynamics of the systems with heavy-tailed correlated random fluctuations. We also present an alternative solution to the fractional EW equation in the presence of μ-stable Lévy noise in the steady state, which is implemented numerically, using the μ-stable fractional Lévy motion. Based on the analysis of the self-similar properties of contour loops, we numerically show that the scaling properties of contour loop ensembles can qualitatively and quantitatively distinguish non-Gaussian random fields from Gaussian random fluctuations.
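The μ-stable (Lévy) noise entering such models can be drawn with the Chambers-Mallows-Stuck transform (a sketch for the symmetric case only; the stability indices and sample counts below are arbitrary choices, not values from the paper):

```python
import math, random

def symmetric_stable(alpha, rng):
    """One Chambers-Mallows-Stuck draw from a symmetric alpha-stable law;
    alpha = 2 recovers a Gaussian, smaller alpha gives heavier tails."""
    u = rng.uniform(-math.pi / 2, math.pi / 2)
    w = rng.expovariate(1.0)
    return (math.sin(alpha * u) / math.cos(u) ** (1 / alpha)
            ) * (math.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha)

rng = random.Random(42)
heavy = [symmetric_stable(1.2, rng) for _ in range(10000)]   # heavy-tailed
gauss = [symmetric_stable(2.0, rng) for _ in range(10000)]   # Gaussian limit
tail_heavy = sum(abs(x) > 10 for x in heavy)
tail_gauss = sum(abs(x) > 10 for x in gauss)
```

Comparing the tail counts makes the power-law excursions of the heavy-tailed field visible even in a small sample.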
Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J
2016-11-01
In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. We therefore performed a survey of RCTs to assess: (1) What is the quality of RCTs in the field of orthopaedics in Mainland China? (2) Is there a difference between the core journals of the Chinese department of orthopaedics and Orthopaedics Traumatology Surgery & Research (OTSR)? This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization", or "randomized" were employed to describe the allocation method. Trials involving animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without outcomes were excluded. In addition, eight articles selected from Orthopaedics Traumatology Surgery & Research (OTSR) between 2010 and 2014 were included in this study for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) checklist, covering sample size calculation, allocation sequence generation, allocation concealment, blinding, and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trial reported an adequate sample size calculation, 74 (33.4%) reported adequate allocation generation, 8 (3.7%) reported adequate allocation concealment, 18 (8.1%) reported adequate blinding, and 16 (7.2%) reported handling of dropouts. 
In OTSR, 1 (12.5%) trial reported an adequate sample size calculation, 4 (50.0%) reported adequate allocation generation, 1 (12.5%) reported adequate allocation concealment, 2 (25.0%) reported adequate blinding, and 5 (62.5%) reported handling of dropouts. There were statistical differences in sample size calculation and handling of dropouts between papers from Mainland China and OTSR (P<0.05). The findings of this study show that the methodological reporting quality of RCTs in the seven core orthopaedic journals from Mainland China is far from satisfactory and needs further improvement to meet the standards of the CONSORT statement. Level III, case-control. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Ma, F; Yang, J; Kang, G; Sun, Q; Lu, P; Zhao, Y; Wang, Z; Luo, J; Wang, Z
2016-09-01
For large-scale immunization of children with hepatitis A (HA) vaccines in China, accurately designed studies comparing the safety and immunogenicity of the live attenuated HA vaccine (HA-L) and inactivated HA vaccine (HA-I) are necessary. A randomized, parallel controlled, phase IV clinical trial was conducted with 6000 healthy children aged 18 months to 16 years. HA-L or HA-I was administered at a ratio of 1:1 to randomly assigned participants. The safety and immunogenicity were evaluated. Both HA-L and HA-I were well tolerated by all participants. The immunogenicity results showed that the seroconversion rates (HA-L versus HA-I: 98.0% versus 100%, respectively, p > 0.05) and the geometric mean concentrations in participants negative for antibodies against HA virus IgG (anti-HAV IgG) before vaccination did not differ significantly between the two types of vaccines (HA-L versus HA-I first dose: 898.9 versus 886.2 mIU/mL, respectively, p > 0.05). After administration of the booster dose of HA-I, the geometric mean concentration of anti-HAV IgG (HA-I booster dose: 2591.2 mIU/mL) was higher than that after the first dose (p < 0.05) and that reported in participants administered HA-L (p < 0.05). Additionally, 12 (25%) of the 48 randomly selected participants who received HA-L tested positive for HA antigen in stool samples. Hence, both HA-L and HA-I can provide acceptable immunogenicity in children. The effects of long-term immunogenicity after natural exposure to wild-type HA virus and the possibility of mutational shifts of the live vaccine virus in the field need to be studied in more detail. Copyright © 2016. Published by Elsevier Ltd.
The performance of sample selection estimators to control for attrition bias.
Grasdal, A
2001-07-01
Sample attrition is a potential source of selection bias in experimental, as well as non-experimental programme evaluation. For labour market outcomes, such as employment status and earnings, missing data problems caused by attrition can be circumvented by the collection of follow-up data from administrative registers. For most non-labour market outcomes, however, investigators must rely on participants' willingness to co-operate in keeping detailed follow-up records and statistical correction procedures to identify and adjust for attrition bias. This paper combines survey and register data from a Norwegian randomized field trial to evaluate the performance of parametric and semi-parametric sample selection estimators commonly used to correct for attrition bias. The considered estimators work well in terms of producing point estimates of treatment effects close to the experimental benchmark estimates. Results are sensitive to exclusion restrictions. The analysis also demonstrates an inherent paradox in the 'common support' approach, which prescribes exclusion from the analysis of observations outside of common support for the selection probability. The more important treatment status is as a determinant of attrition, the larger is the proportion of treated with support for the selection probability outside the range, for which comparison with untreated counterparts is possible. Copyright 2001 John Wiley & Sons, Ltd.
Gorobets, Yu I; Gorobets, O Yu
2015-01-01
A statistical model is proposed in this paper to describe the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field, connected with their metabolism. The model applies when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e., a randomizing motion of non-thermal nature, for example, self-propulsion by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.
Approximate ground states of the random-field Potts model from graph cuts
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Kumar, Ravinder; Weigel, Martin; Banerjee, Varsha; Janke, Wolfhard; Puri, Sanjay
2018-05-01
While the ground-state problem for the random-field Ising model is polynomial, and can be solved using a number of well-known algorithms for maximum flow or graph cut, the analog random-field Potts model corresponds to a multiterminal flow problem that is known to be NP-hard. Hence an efficient exact algorithm is very unlikely to exist. As we show here, it is nevertheless possible to use an embedding of binary degrees of freedom into the Potts spins in combination with graph-cut methods to solve the corresponding ground-state problem approximately in polynomial time. We benchmark this heuristic algorithm using a set of quasiexact ground states found for small systems from long parallel tempering runs. For a not-too-large number q of Potts states, the method based on graph cuts finds the same solutions in a fraction of the time. We employ the new technique to analyze the breakup length of the random-field Potts model in two dimensions.
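For the Ising case (q = 2) the polynomial ground-state computation mentioned above reduces to an s-t min-cut. A compact self-contained sketch (Edmonds-Karp max-flow, ferromagnetic J > 0; the chain instance and field values are illustrative, not from the paper):

```python
from collections import deque
from itertools import product

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow; cap is {u: {v: capacity}} and is mutated
    into the residual network."""
    total = 0.0
    while True:
        parent, q = {s: None}, deque([s])
        while q and t not in parent:               # BFS for a shortest path
            u = q.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 1e-12 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)        # bottleneck capacity
        for u, v in path:
            cap[u][v] -= b
            cap.setdefault(v, {}).setdefault(u, 0.0)
            cap[v][u] += b
        total += b

def rfim_ground_state(edges, fields, J):
    """Ground state of H = -J*sum_<ij> s_i s_j - sum_i h_i s_i (s = +/-1)
    via the standard s-t min-cut mapping (requires ferromagnetic J > 0)."""
    cap = {}
    def add(u, v, c):
        cap.setdefault(u, {})
        cap[u][v] = cap[u].get(v, 0.0) + c
    for i, j in edges:                  # a discordant bond costs 2J
        add(i, j, 2 * J)
        add(j, i, 2 * J)
    for i, h in fields.items():         # opposing a field of strength |h| costs 2|h|
        if h > 0:
            add('s', i, 2 * h)
        elif h < 0:
            add(i, 't', -2 * h)
    max_flow(cap, 's', 't')
    reach, q = {'s'}, deque(['s'])      # source side of the min cut -> spin +1
    while q:
        u = q.popleft()
        for v, c in cap.get(u, {}).items():
            if c > 1e-12 and v not in reach:
                reach.add(v)
                q.append(v)
    return {i: (1 if i in reach else -1) for i in fields}

# small chain instance with illustrative couplings and fields
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
fields = {0: 0.5, 1: -1.2, 2: 0.3, 3: 2.0, 4: -0.4}
spins = rfim_ground_state(edges, fields, J=1.0)
energy = lambda s: (-1.0 * sum(s[i] * s[j] for i, j in edges)
                    - sum(h * s[i] for i, h in fields.items()))
brute = min(energy(dict(enumerate(c))) for c in product([-1, 1], repeat=5))
```

The Potts case embeds binary (cut) moves into the q-state spins, so the same machinery is applied repeatedly rather than once.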
Adaptive Electronic Camouflage Using Texture Synthesis
2012-04-01
The algorithm begins by computing the GLCMs, G_IN and G_OUT, of the input image (e.g., an image of the local environment) and the output image (randomly generated)…respectively. The algorithm randomly selects a pixel from the output image and cycles its gray level through all values. For each value, G_OUT is updated…The value of the selected pixel is permanently changed to the gray-level value that minimizes the error between G_IN and G_OUT. Without selecting a
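The pixel-cycling procedure can be sketched as a toy reimplementation (assumed gray-level count, image sizes, and a single co-occurrence offset; `glcm` here stands in for the report's G_IN/G_OUT computation and is not the original code):

```python
import numpy as np

def glcm(img, levels, offset=(0, 1)):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    g = np.zeros((levels, levels))
    dy, dx = offset
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[img[y, x], img[y + dy, x + dx]] += 1
    return g / g.sum()

def synthesize(target, levels=4, shape=(16, 16), iters=500, seed=0):
    """Pixel-cycling synthesis: start from random noise, repeatedly pick a
    random pixel, cycle its gray level through all values, and keep the
    value minimizing the GLCM error against the target texture."""
    rng = np.random.default_rng(seed)
    g_in = glcm(target, levels)
    out = rng.integers(0, levels, shape)
    for _ in range(iters):
        y, x = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        errs = []
        for v in range(levels):
            out[y, x] = v
            errs.append(np.abs(g_in - glcm(out, levels)).sum())
        out[y, x] = int(np.argmin(errs))     # greedy: error never increases
    return out

target = np.tile(np.array([0, 1]), (12, 6))    # vertical stripes, 2 gray levels
out = synthesize(target, levels=2, shape=(12, 12), iters=300, seed=1)
g_in = glcm(target, 2)
err_out = np.abs(g_in - glcm(out, 2)).sum()
err_init = np.abs(g_in - glcm(np.random.default_rng(1).integers(0, 2, (12, 12)), 2)).sum()
```

Because the current gray level is always among the candidates, each update is non-increasing in the GLCM error, so the synthesized texture's statistics converge toward the input's.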
Fingerprint recognition of alien invasive weeds based on the texture character and machine learning
NASA Astrophysics Data System (ADS)
Yu, Jia-Jia; Li, Xiao-Li; He, Yong; Xu, Zheng-Hao
2008-11-01
A multi-spectral imaging technique based on texture analysis and machine learning was proposed to discriminate alien invasive weeds with similar outlines but different categories. The objectives of this study were to investigate the feasibility of using multi-spectral imaging, especially the near-infrared (NIR) channel (800 nm +/- 10 nm), to find the weeds' fingerprints, and to validate the performance with specific eigenvalues from the co-occurrence matrix. Veronica polita Pries, Veronica persica Poir, longtube ground ivy, and Lamium amplexicaule Linn. were selected in this study; they have different effects in the field and are alien invasive species in China. 307 weed leaf images were randomly selected for the calibration set, while the remaining 207 samples formed the prediction set. All images were pretreated with a Wallis filter to adjust for noise from uneven lighting. The gray-level co-occurrence matrix was applied to extract the texture character, which captures the density, randomness, correlation, contrast, and homogeneity of texture with different algorithms. Three channels (green at 550 nm +/- 10 nm, red at 650 nm +/- 10 nm, and NIR at 800 nm +/- 10 nm) were calculated separately to obtain the eigenvalues. Least-squares support vector machines (LS-SVM) were applied to discriminate the categories of weeds from the co-occurrence-matrix eigenvalues. Finally, a recognition ratio of 83.35% was obtained with the NIR channel, better than the results for the green channel (76.67%) and the red channel (69.46%). The prediction results of 81.35% indicated that the selected eigenvalues reflected the main characteristics of the weeds' fingerprints based on multi-spectral imaging (especially the NIR channel) and the LS-SVM model.
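Co-occurrence eigenvalues of the kind used here (energy, contrast, homogeneity, entropy) can be computed from a normalized GLCM as follows (a generic sketch of Haralick-style descriptors, not the paper's exact feature set):

```python
import numpy as np

def glcm_features(p):
    """Texture descriptors of a normalized co-occurrence matrix p:
    energy (uniformity), contrast, homogeneity, and entropy (randomness)."""
    i, j = np.indices(p.shape)
    nz = p[p > 0]
    return {
        'energy': float((p ** 2).sum()),
        'contrast': float((p * (i - j) ** 2).sum()),
        'homogeneity': float((p / (1.0 + (i - j) ** 2)).sum()),
        'entropy': float(-(nz * np.log2(nz)).sum()),
    }

flat = glcm_features(np.full((4, 4), 1 / 16))   # maximally random texture
diag = glcm_features(np.eye(4) / 4)             # perfectly uniform texture
```

Such a feature vector, computed per spectral channel, would then feed the LS-SVM classifier.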
Acute nonlymphocytic leukemia and residential exposure to power frequency magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Severson, R.K.
1986-01-01
A population-based case-control study of adult acute nonlymphocytic leukemia (ANLL) and residential exposure to power frequency magnetic fields was conducted in King, Pierce and Snohomish Counties in Washington state. Of 164 cases who were diagnosed from January 1, 1981 through December 31, 1984, 114 were interviewed. Controls were selected from the study area on the basis of random digit dialing and frequency matched to the cases by age and sex. Analyses were undertaken to evaluate whether exposure to high levels of power frequency magnetic fields in the residence was associated with an increased risk of ANLL. Neither the directly measured magnetic fields nor the surrogate values based on the wiring configurations were associated with ANLL. Additional analyses suggested that persons with prior allergies were at decreased risk of acute myelocytic leukemia (AML). Also, persons with prior autoimmune diseases were at increased risk of AML. The increase in AML risk in rheumatoid arthritics was of borderline statistical significance. Finally, cigarette smoking was associated with an increased risk of AML. The risk of AML increased significantly with the number of years of cigarette smoking.
Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex
Lindsay, Grace W.
2017-01-01
Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. 
How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (“mixed selectivity”)—is a topic of interest. Even though models with random feedforward connectivity are capable of creating computationally relevant mixed selectivity, such a model does not match the levels of mixed selectivity seen in the data analyzed in this study. Adding simple Hebbian learning to the model increases mixed selectivity to the correct level and makes the model match the data on several other relevant measures. This study thus offers predictions on how mixed selectivity and other properties evolve with training. PMID:28986463
Corrected Mean-Field Model for Random Sequential Adsorption on Random Geometric Graphs
NASA Astrophysics Data System (ADS)
Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur
2018-03-01
A notorious problem in mathematics and physics is to create a solvable model for random sequential adsorption of non-overlapping congruent spheres in the d-dimensional Euclidean space with d≥ 2 . Spheres arrive sequentially at uniformly chosen locations in space and are accepted only when there is no overlap with previously deposited spheres. Due to spatial correlations, characterizing the fraction of accepted spheres remains largely intractable. We study this fraction by taking a novel approach that compares random sequential adsorption in Euclidean space to the nearest-neighbor blocking on a sequence of clustered random graphs. This random network model can be thought of as a corrected mean-field model for the interaction graph between the attempted spheres. Using functional limit theorems, we characterize the fraction of accepted spheres and its fluctuations.
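The adsorption process itself is simple to simulate; it is the analysis of the accepted fraction that is intractable. A minimal two-dimensional sketch (the disk radius and attempt count are arbitrary choices):

```python
import random

def rsa_disks(radius, attempts, seed=7):
    """Random sequential adsorption of equal disks in the unit square:
    propose uniform centers and accept only those at least 2*radius
    from every previously accepted center (no overlap)."""
    rng = random.Random(seed)
    accepted, d2 = [], (2 * radius) ** 2
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - a) ** 2 + (y - b) ** 2 >= d2 for a, b in accepted):
            accepted.append((x, y))
    return accepted

centers = rsa_disks(0.05, 2000)
fraction = len(centers) / 2000          # accepted fraction of attempts
```

Each acceptance depends on the entire deposition history, which is exactly the spatial correlation that the clustered random-graph comparison in the paper is designed to capture.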
Multi-field inflation with a random potential
NASA Astrophysics Data System (ADS)
Tye, S.-H. Henry; Xu, Jiajun; Zhang, Yang
2009-04-01
Motivated by the possibility of inflation in the cosmic landscape, which may be approximated by a complicated potential, we study the density perturbations in multi-field inflation with a random potential. The random potential causes the inflaton to undergo a Brownian-like motion with a drift in the D-dimensional field space, allowing entropic perturbation modes to continuously and randomly feed into the adiabatic mode. To quantify such an effect, we employ a stochastic approach to evaluate the two-point and three-point functions of primordial perturbations. We find that in the weakly random scenario where the stochastic scatterings are frequent but mild, the resulting power spectrum resembles that of the single field slow-roll case, with up to 2% more red tilt. The strongly random scenario, in which the coarse-grained motion of the inflaton is significantly slowed down by the scatterings, leads to rich phenomenologies. The power spectrum exhibits primordial fluctuations on all angular scales. Such features may already be hiding in the error bars of observed CMB TT (as well as TE and EE) power spectrum and have been smoothed out by binning of data points. With more data coming in the future, we expect these features can be detected or falsified. On the other hand the tensor power spectrum itself is free of fluctuations and the tensor to scalar ratio is enhanced by the large ratio of the Brownian-like motion speed over the drift speed. In addition a large negative running of the power spectral index is possible. Non-Gaussianity is generically suppressed by the growth of adiabatic perturbations on super-horizon scales, and is negligible in the weakly random scenario. However, non-Gaussianity can possibly be enhanced by resonant effects in the strongly random scenario or arise from the entropic perturbations during the onset of (p)reheating if the background inflaton trajectory exhibits particular properties. 
The formalism developed in this paper can be applied to a wide class of multi-field inflation models including, e.g. the N-flation scenario.
Stochastic isotropic hyperelastic materials: constitutive calibration and model selection
NASA Astrophysics Data System (ADS)
Mihai, L. Angela; Woolley, Thomas E.; Goriely, Alain
2018-03-01
Biological and synthetic materials often exhibit intrinsic variability in their elastic responses under large strains, owing to microstructural inhomogeneity or when elastic data are extracted from viscoelastic mechanical tests. For these materials, although hyperelastic models calibrated to mean data are useful, stochastic representations accounting also for data dispersion carry extra information about the variability of material properties found in practical applications. We combine finite elasticity and information theories to construct homogeneous isotropic hyperelastic models with random field parameters calibrated to discrete mean values and standard deviations of either the stress-strain function or the nonlinear shear modulus, which is a function of the deformation, estimated from experimental tests. These quantities can take on different values, corresponding to possible outcomes of the experiments. As multiple models can be derived that adequately represent the observed phenomena, we apply Occam's razor by providing an explicit criterion for model selection based on Bayesian statistics. We then employ this criterion to select a model among competing models calibrated to experimental data for rubber and brain tissue under single or multiaxial loads.
NASA Astrophysics Data System (ADS)
Ding, Jian; Li, Li
2018-05-01
We initiate the study on chemical distances of percolation clusters for level sets of two-dimensional discrete Gaussian free fields as well as loop clusters generated by two-dimensional random walk loop soups. One of our results states that the chemical distance between two macroscopic annuli away from the boundary for the random walk loop soup at the critical intensity is of dimension 1 with positive probability. Our proof method is based on an interesting combination of a theorem of Makarov, isomorphism theory, and an entropic repulsion estimate for Gaussian free fields in the presence of a hard wall.
Random-anisotropy model: Monotonic dependence of the coercive field on D/J
NASA Astrophysics Data System (ADS)
Saslow, W. M.; Koon, N. C.
1994-02-01
We present the results of a numerical study of the zero-temperature remanence and coercivity for the random anisotropy model (RAM), showing that, contrary to early calculations for this model, the coercive field increases monotonically with the strength D of the random anisotropy relative to the strength J of the exchange interaction. Local-field adjustments with and without spin flips are considered. Convergence is difficult to obtain for small values of the anisotropy, suggesting that this is the likely source of the nonmonotonic behavior found in earlier studies. For both large and small anisotropy, each spin undergoes about one flip per hysteresis cycle, and about half of the spin flips occur in the vicinity of the coercive field. When only non-spin-flip adjustments are considered, at large anisotropy the coercivity is proportional to the anisotropy. At small anisotropy, the rate of convergence is comparable to that when spin flips are included.
Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge
2015-01-01
The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC. To reproduce the characteristics of the TGC detector's output signal, we designed a simulation signal source. The core of the design is a field programmable gate array that outputs 256 channels of simulated signals at random. The signals are driven by a true random number generator whose source of randomness is the timing jitter of ring oscillators. Experimental results show that the random numbers are uniformly distributed and that the whole system is highly reliable.
In Defense of the Randomized Controlled Trial for Health Promotion Research
Rosen, Laura; Manor, Orly; Engelhard, Dan; Zucker, David
2006-01-01
The overwhelming evidence about the role lifestyle plays in mortality, morbidity, and quality of life has pushed the young field of modern health promotion to center stage. The field is beset with intense debate about appropriate evaluation methodologies, and randomized designs are increasingly considered inappropriate for health promotion research. We review criticisms of randomized trials that raise philosophical and practical issues, and we show how most of these criticisms can be overcome with minor design modifications. By rebutting arguments against randomized trials, our work contributes to building a sound methodological base for health promotion research. PMID:16735622
Note: The design of thin gap chamber simulation signal source based on field programmable gate array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Kun; Wang, Xu; Li, Feng
Mendes, M P; Ramalho, M A P; Abreu, A F B
2012-04-10
The objective of this study was to compare the BLUP selection method with different selection strategies in F(2:4) and to assess the efficiency of this method for the early choice of the best common bean (Phaseolus vulgaris) lines. Fifty-one F(2:4) progenies were produced from a cross between the CVIII8511 x RP-26 lines. A randomized block design was used with 20 replications and one-plant field plots. Character data on plant architecture and grain yield were obtained, and the sum of the standardized variables was estimated for simultaneous selection of both traits. Analysis was carried out by mixed models (BLUP) and by the least squares method to compare different selection strategies, such as mass selection, stratified mass selection, and between- and within-progeny selection. The progenies selected by BLUP were assessed in advanced generations, always selecting the greatest and smallest sums of the standardized variables. Analyses by the least squares method and the BLUP procedure ranked the progenies in the same way. The coincidence of the individuals identified by BLUP and by between- and within-progeny selection was high, and of the greatest magnitude when BLUP was compared with mass selection. Although BLUP is the best estimator of genotypic value, its efficiency in the response to long-term selection does not differ from that of the other methods, because it is also unable to predict the future effect of the progeny x environment interaction. It was inferred that selection success will always depend on the most accurate possible progeny assessment and on using alternatives to reduce the progeny x environment interaction effect.
Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee
2015-08-01
Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
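RARtool itself is MATLAB software and its interface is not reproduced here. As a generic illustration of the response-adaptive idea, the sketch below implements the classic randomized play-the-winner urn for a two-arm trial (function name, arm labels, and parameters are ours):

```python
import random

def rpw_allocation(p_success, n_patients=200, seed=0):
    """Randomized play-the-winner urn, a classic response-adaptive
    randomization rule for two arms.  p_success maps arm name to the
    true success probability (used here only to simulate outcomes)."""
    rng = random.Random(seed)
    arms = list(p_success)
    urn = {arm: 1 for arm in arms}       # start with one ball per arm
    assigned = {arm: 0 for arm in arms}
    for _ in range(n_patients):
        # draw an arm with probability proportional to its ball count
        arm = rng.choices(arms, weights=[urn[a] for a in arms])[0]
        assigned[arm] += 1
        # simulate this patient's binary response
        success = rng.random() < p_success[arm]
        # a success adds a ball for this arm; a failure rewards the other arm
        other = arms[1 - arms.index(arm)]
        urn[arm if success else other] += 1
    return assigned
```

Over the course of the trial, the urn skews allocation toward the arm that responds better, which is the behavior response-adaptive procedures are designed to exploit.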
NASA Technical Reports Server (NTRS)
Earl, James A.
1992-01-01
When charged particles spiral along a large constant magnetic field, their trajectories are scattered by any random field components that are superposed on the guiding field. If the random field configuration embodies helicity, the scattering is asymmetrical with respect to a plane perpendicular to the guiding field, for particles moving into the forward hemisphere are scattered at different rates from those moving into the backward hemisphere. This asymmetry gives rise to new terms in the transport equations that describe propagation of charged particles. Helicity has virtually no impact on qualitative features of the diffusive mode of propagation. However, characteristic velocities of the coherent modes that appear after a highly anisotropic injection exhibit an asymmetry related to helicity. Explicit formulas, which embody the effects of helicity, are given for the anisotropies, the diffusion coefficient, and the coherent velocities. Predictions derived from these expressions are in good agreement with Monte Carlo simulations of particle transport, but the simulations reveal certain phenomena whose explanation calls for further analytical work.
Time Correlations of Lightning Flash Sequences in Thunderstorms Revealed by Fractal Analysis
NASA Astrophysics Data System (ADS)
Gou, Xueqiang; Chen, Mingli; Zhang, Guangshu
2018-01-01
By using the data of lightning detection and ranging system at the Kennedy Space Center, the temporal fractal and correlation of interevent time series of lightning flash sequences in thunderstorms have been investigated with Allan factor (AF), Fano factor (FF), and detrended fluctuation analysis (DFA) methods. AF, FF, and DFA methods are powerful tools to detect the time-scaling structures and correlations in point processes. Totally 40 thunderstorms with distinguishing features of a single-cell storm and apparent increase and decrease in the total flash rate were selected for the analysis. It is found that the time-scaling exponents for AF (
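The Allan factor used in this analysis can be sketched in a few lines. The function below is our own minimal version: it partitions the event times into contiguous windows of length T and returns the variance of successive count differences over twice the mean count (AF is near 1 for a Poisson process, 0 for perfectly periodic events, and above 1 for clustered ones):

```python
def allan_factor(event_times, window):
    """Allan factor AF(T) of a point process at timescale T = window.
    Assumes the data span at least two full windows."""
    t_max = max(event_times)
    n_win = int(t_max // window)
    counts = [0] * n_win
    for t in event_times:
        k = int(t // window)
        if k < n_win:          # drop events in the final partial window
            counts[k] += 1
    diffs = [(counts[k + 1] - counts[k]) ** 2 for k in range(n_win - 1)]
    mean_count = sum(counts) / n_win
    return (sum(diffs) / len(diffs)) / (2 * mean_count)
```

Plotting AF against the window length T on log-log axes and fitting the slope gives the time-scaling exponent the study estimates.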
The spectral expansion of the elasticity random field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malyarenko, Anatoliy; Ostoja-Starzewski, Martin
2014-12-10
We consider a deformable body that occupies a region D in the plane. In our model, the body's elasticity tensor H(x) is the restriction to D of a second-order mean-square continuous random field. Under translation, the expected value and the correlation tensor of the field H(x) do not change. Under the action of an arbitrary element k of the orthogonal group O(2), they transform according to the reducible orthogonal representation k ⟼ S²(S²(k)) of the above group. We find the spectral expansion of the correlation tensor R(x) of the elasticity field as well as the expansion of the field itself in terms of stochastic integrals with respect to a family of orthogonal scattered random measures.
The Effect of CAI on Reading Achievement.
ERIC Educational Resources Information Center
Hardman, Regina
A study determined whether computer assisted instruction (CAI) had an effect on students' reading achievement. Subjects were 21 randomly selected fourth-grade students at D. S. Wentworth Elementary School on the south side of Chicago in a low-income neighborhood who received a year's exposure to a CAI program, and 21 randomly selected students at…
78 FR 57033 - United States Standards for Condition of Food Containers
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... containers during production. Stationary lot sampling is the process of randomly selecting sample units from.... * * * * * Stationary lot sampling. The process of randomly selecting sample units from a lot whose production has been... less than 1/16-inch Stringy seal (excessive plastic threads showing at edge of seal 222 area...
Access to Higher Education by the Luck of the Draw
ERIC Educational Resources Information Center
Stone, Peter
2013-01-01
Random selection is a fair way to break ties between applicants of equal merit seeking admission to institutions of higher education (with "merit" defined here in terms of the intrinsic contribution higher education would make to the applicant's life). Opponents of random selection commonly argue that differences in strength between…
ERIC Educational Resources Information Center
Beretvas, S. Natasha; Murphy, Daniel L.
2013-01-01
The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannan and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…
1977 Survey of the American Professoriate. Technical Report.
ERIC Educational Resources Information Center
Ladd, Everett Carll, Jr.; And Others
The development and data validation of the 1977 Ladd-Lipset national survey of the American professoriate are described. The respondents were selected from a random sample of colleges and universities and from a random sample of individual faculty members from the universities. The 158 institutions in the 1977 survey were selected from 2,406…
Site Selection in Experiments: A Follow-Up Evaluation of Site Recruitment in Two Scale-Up Studies
ERIC Educational Resources Information Center
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castillo, Veronica
2015-01-01
Randomized experiments are commonly used to evaluate if particular interventions improve student achievement. While these experiments can establish that a treatment actually "causes" changes, typically the participants are not randomly selected from a well-defined population and therefore the results do not readily generalize. Three…
Random field assessment of nanoscopic inhomogeneity of bone.
Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu
2010-12-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
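The construction described above, a Gaussian random field whose exponential covariance is governed by a correlation length, can be sketched in one dimension (a minimal illustration of the technique, not the authors' code; names and parameters are ours):

```python
import math
import random

def sample_exponential_field(points, sigma=1.0, corr_len=1.0, seed=0):
    """Draw one realization of a Gaussian random field with exponential
    covariance C(d) = sigma^2 * exp(-d / corr_len) at the given 1-D
    points, via Cholesky factorization of the covariance matrix."""
    n = len(points)
    # covariance matrix from the exponential covariance function
    C = [[sigma ** 2 * math.exp(-abs(points[i] - points[j]) / corr_len)
          for j in range(n)] for i in range(n)]
    # plain Cholesky factorization (C is symmetric positive definite)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (math.sqrt(C[i][i] - s) if i == j
                       else (C[i][j] - s) / L[j][j])
    rng = random.Random(seed)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    # field = L z has covariance L L^T = C by construction
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
```

A shorter correlation length makes neighboring values less correlated, i.e., a more rapidly fluctuating modulus map, which is exactly the inhomogeneity measure the study extracts from its AFM/nanoindentation data.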
Selection dynamic of Escherichia coli host in M13 combinatorial peptide phage display libraries.
Zanconato, Stefano; Minervini, Giovanni; Poli, Irene; De Lucrezia, Davide
2011-01-01
Phage display relies on an iterative cycle of selection and amplification of random combinatorial libraries to enrich the initial population of those peptides that satisfy a priori chosen criteria. The effectiveness of any phage display protocol depends directly on library amino acid sequence diversity and the strength of the selection procedure. In this study we monitored the dynamics of the selective pressure exerted by the host organism on a random peptide library in the absence of any additional selection pressure. The results indicate that sequence censorship exerted by Escherichia coli dramatically reduces library diversity and can significantly impair phage display effectiveness.
NASA Astrophysics Data System (ADS)
Tsukanov, A. A.; Gorbatnikov, A. V.
2018-01-01
Study of the statistical parameters of the Earth's random microseismic field makes it possible to obtain estimates of the properties and structure of the Earth's crust and upper mantle. Different approaches are used to observe and process the microseismic records, which are divided into several groups of passive seismology methods. Among them are the well-known methods of surface-wave tomography, the spectral H/V ratio of the components in the surface wave, and microseismic sounding, currently under development, which uses the spectral ratio V/V0 of the vertical components between pairs of spatially separated stations. In the course of previous experiments, it became clear that these ratios are stable statistical parameters of the random field that do not depend on the properties of microseism sources. This paper proposes to expand the mentioned approach and study the possibilities for using the ratio of the horizontal components H1/H2 of the microseismic field. Numerical simulation was used to study the influence of an embedded velocity inhomogeneity on the spectral ratio of the horizontal components of the random field of fundamental Rayleigh modes, based on the concept that the Earth's microseismic field is represented by these waves in a significant part of the frequency spectrum.
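As a minimal illustration of the H1/H2 spectral-ratio idea (our own sketch, using a naive DFT for clarity rather than a production FFT; function names are ours):

```python
import cmath
import math

def amplitude_spectrum(x):
    """Naive DFT amplitude spectrum; fine for short illustrative signals."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def spectral_ratio(h1, h2, eps=1e-12):
    """Frequency-wise H1/H2 amplitude ratio of two horizontal components;
    eps guards against division by a near-zero spectral amplitude."""
    a1, a2 = amplitude_spectrum(h1), amplitude_spectrum(h2)
    return [s1 / (s2 + eps) for s1, s2 in zip(a1, a2)]
```

In practice such ratios are averaged over long records so that the source-dependent factors cancel, leaving the stable statistical parameter the abstract describes.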
Silvis, Alexander; Ford, W. Mark; Britzke, Eric R.
2015-01-01
Bat day-roost selection often is described through comparisons of day-roosts with randomly selected, and assumed unused, trees. Relatively few studies, however, look at patterns of multi-year selection or compare day-roosts used across years. We explored day-roost selection using 2 years of roost selection data for female northern long-eared bats (Myotis septentrionalis) on the Fort Knox Military Reservation, Kentucky, USA. We compared characteristics of randomly selected non-roost trees and day-roosts using a multinomial logistic model and day-roost species selection using chi-squared tests. We found that factors differentiating day-roosts from non-roosts and day-roosts between years varied. Day-roosts differed from non-roosts in the first year of data in all measured factors, but only in size and decay stage in the second year. Between years, day-roosts differed in size and canopy position, but not decay stage. Day-roost species selection was non-random and did not differ between years. Although bats used multiple trees, our results suggest that there were additional unused trees that were suitable as roosts at any time. Day-roost selection pattern descriptions will be inadequate if based only on a single year of data, and inferences of roost selection based only on comparisons of roost to non-roosts should be limited.
Valenzuela, Carlos Y
2013-01-01
The Neutral Theory of Evolution (NTE) proposes mutation and random genetic drift as the most important evolutionary factors. The most conspicuous feature of evolution is the genomic stability during paleontological eras and lack of variation among taxa; 98% or more of nucleotide sites are monomorphic within a species. NTE explains this homology by random fixation of neutral bases and negative selection (purifying selection) that does not contribute either to evolution or polymorphisms. Purifying selection is insufficient to account for this evolutionary feature and the Nearly-Neutral Theory of Evolution (N-NTE) included negative selection with coefficients as low as mutation rate. These NTE and N-NTE propositions are thermodynamically (tendency to random distributions, second law), biotically (recurrent mutation), logically and mathematically (resilient equilibria instead of fixation by drift) untenable. Recurrent forward and backward mutation and random fluctuations of base frequencies alone in a site make life organization and fixations impossible. Drift is not a directional evolutionary factor, but a directional tendency of matter-energy processes (second law) which threatens the biotic organization. Drift cannot drive evolution. In a site, the mutation rates among bases and selection coefficients determine the resilient equilibrium frequency of bases that genetic drift cannot change. The expected neutral random interaction among nucleotides is zero; however, huge interactions and periodicities were found between bases of dinucleotides separated by 1, 2... and more than 1,000 sites. Every base is co-adapted with the whole genome. Neutralists found that neutral evolution is independent of population size (N); thus neutral evolution should be independent of drift, because drift effect is dependent upon N. Also, chromosome size and shape as well as protein size are far from random.
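The resilient equilibrium the author invokes, set by recurrent forward and backward mutation at a site, can be demonstrated numerically. The toy two-state sketch below (rates and names are illustrative) iterates the standard recursion p' = p(1 − μ) + (1 − p)ν and converges to ν/(μ + ν) from any starting frequency, which is why drift-driven fixation cannot displace it:

```python
def mutation_equilibrium(mu, nu, p0=0.5, generations=100_000):
    """Iterate recurrent two-way mutation at one site: allele frequency p
    loses mu*p per generation and gains nu*(1-p); the deviation from the
    fixed point nu/(mu+nu) shrinks by (1 - mu - nu) each generation."""
    p = p0
    for _ in range(generations):
        p = p * (1 - mu) + (1 - p) * nu
    return p

p_star = mutation_equilibrium(mu=1e-4, nu=3e-4)
print(round(p_star, 4))  # prints 0.75, i.e. nu/(mu+nu)
```

The same equilibrium is reached from p0 = 0.01 or p0 = 0.99, illustrating the resilience the text contrasts with fixation by drift.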
Scott, J.C.
1990-01-01
Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
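The stratified selection procedure described above can be sketched as follows (a minimal Python illustration of the idea, not the report's GIS software; the function name and data layout are ours):

```python
import random

def stratified_site_selection(sites, n_per_category, seed=0):
    """Randomly select sites within each stratum (category).
    sites: list of (site_id, category) pairs defining the population;
    n_per_category: mapping from category to number of sites wanted."""
    rng = random.Random(seed)
    by_cat = {}
    for site_id, cat in sites:
        by_cat.setdefault(cat, []).append(site_id)
    selection = {}
    for cat, n in n_per_category.items():
        pool = by_cat.get(cat, [])
        # sample without replacement, capped at the stratum size
        selection[cat] = rng.sample(pool, min(n, len(pool)))
    return selection
```

Each potential site carries the spatial characteristic (e.g., land use) of the areal subset it falls in, and sampling is done one category at a time, exactly as in the stratified network the report describes; the report's optional cell-based variant would simply apply the same draw within grid cells of each category.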
Vision Sensor-Based Road Detection for Field Robot Navigation
Lu, Keyu; Li, Jian; An, Xiangjing; He, Hangen
2015-01-01
Road detection is an essential component of field robot navigation systems. Vision sensors play an important role in road detection for their great potential in environmental perception. In this paper, we propose a hierarchical vision sensor-based method for robust road detection in challenging road scenes. More specifically, for a given road image captured by an on-board vision sensor, we introduce a multiple population genetic algorithm (MPGA)-based approach for efficient road vanishing point detection. Superpixel-level seeds are then selected in an unsupervised way using a clustering strategy. Then, according to the GrowCut framework, the seeds proliferate and iteratively try to occupy their neighbors. After convergence, the initial road segment is obtained. Finally, in order to achieve a globally-consistent road segment, the initial road segment is refined using the conditional random field (CRF) framework, which integrates high-level information into road detection. We perform several experiments to evaluate the common performance, scale sensitivity and noise sensitivity of the proposed method. The experimental results demonstrate that the proposed method exhibits high robustness compared to the state of the art. PMID:26610514
Park, Changhoon; Seo, Hwi Won; Kang, Ikjae; Jeong, Jiwoon; Choi, Kyuhyung; Chae, Chanhee
2014-09-01
The change in growth performance resulting from a new modified live porcine reproductive and respiratory syndrome (PRRS) vaccine was evaluated under field conditions for registration with the government as guided by the Republic of Korea's Animal and Plant Quarantine Agency. Three farms were selected based on their history of PRRS-associated respiratory diseases. On each farm, a total of 45 3-week-old pigs were randomly allocated to one of two treatment groups: (i) vaccinated (n = 25) or (ii) control (n = 20) animals. The new modified live PRRS vaccine increased market weight by 1.26 kg/pig (104.71 kg versus 103.45 kg; P < 0.05) and decreased mortality by 17 percentage points (1.33% versus 18.33%; P < 0.05). Pathological examination indicated that vaccination effectively reduced microscopic lung lesions compared with control animals on the 3 farms. Thus, the new modified live PRRS vaccine improved growth performance and decreased mortality and lung lesions when evaluated under field conditions. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Röösli, Martin; Frei, Patrizia; Bolte, John; Neubauer, Georg; Cardis, Elisabeth; Feychting, Maria; Gajsek, Peter; Heinrich, Sabine; Joseph, Wout; Mann, Simon; Martens, Luc; Mohler, Evelyn; Parslow, Roger C; Poulsen, Aslak Harbo; Radon, Katja; Schüz, Joachim; Thuroczy, György; Viel, Jean-François; Vrijheid, Martine
2010-05-20
The development of new wireless communication technologies that emit radio frequency electromagnetic fields (RF-EMF) is ongoing, but little is known about the RF-EMF exposure distribution in the general population. Previous attempts to measure personal exposure to RF-EMF have used different measurement protocols and analysis methods making comparisons between exposure situations across different study populations very difficult. As a result, observed differences in exposure levels between study populations may not reflect real exposure differences but may be in part, or wholly due to methodological differences. The aim of this paper is to develop a study protocol for future personal RF-EMF exposure studies based on experience drawn from previous research. Using the current knowledge base, we propose procedures for the measurement of personal exposure to RF-EMF, data collection, data management and analysis, and methods for the selection and instruction of study participants. We have identified two basic types of personal RF-EMF measurement studies: population surveys and microenvironmental measurements. In the case of a population survey, the unit of observation is the individual and a randomly selected representative sample of the population is needed to obtain reliable results. For microenvironmental measurements, study participants are selected in order to represent typical behaviours in different microenvironments. These two study types require different methods and procedures. Applying our proposed common core procedures in future personal measurement studies will allow direct comparisons of personal RF-EMF exposures in different populations and study areas.
Degerman, Alexander; Rinne, Teemu; Särkkä, Anna-Kaisa; Salmi, Juha; Alho, Kimmo
2008-06-01
Event-related brain potentials (ERPs) and magnetic fields (ERFs) were used to compare brain activity associated with selective attention to sound location or pitch in humans. Sixteen healthy adults participated in the ERP experiment, and 11 adults in the ERF experiment. In different conditions, the participants focused their attention on a designated sound location or pitch, or pictures presented on a screen, in order to detect target sounds or pictures among the attended stimuli. In the Attend Location condition, the location of sounds varied randomly (left or right), while their pitch (high or low) was kept constant. In the Attend Pitch condition, sounds of varying pitch (high or low) were presented at a constant location (left or right). Consistent with previous ERP results, selective attention to either sound feature produced a negative difference (Nd) between ERPs to attended and unattended sounds. In addition, ERPs showed a more posterior scalp distribution for the location-related Nd than for the pitch-related Nd, suggesting partially different generators for these Nds. The ERF source analyses found no source distribution differences between the pitch-related Ndm (the magnetic counterpart of the Nd) and location-related Ndm in the superior temporal cortex (STC), where the main sources of the Ndm effects are thought to be located. Thus, the ERP scalp distribution differences between the location-related and pitch-related Nd effects may have been caused by activity of areas outside the STC, perhaps in the inferior parietal regions.
NASA Astrophysics Data System (ADS)
Tudino, T.; Bortoluzzi, G.; Aliani, S.
2014-03-01
Marine water dynamics in the near field of a massive gas eruption near Panarea (Aeolian Islands volcanic arc, SE Tyrrhenian Sea) is described. ADCP current-meters were deployed during the paroxysmal phase in 2002 and 2003 a few meters from the degassing vent, recording day-long time series. Datasets were sorted to remove errors and select good-quality ensembles over the entire water column. The standard deviation of error velocity was considered a proxy for inhomogeneous velocity fields over beams. Time series intervals were selected when the basic ADCP assumptions were fulfilled and random errors were minimized. Backscatter data were also processed to identify bubbles in the water column, with the aim of locating bubble-free ensembles. Reliable time series were then selected by combining these data. Two possible scenarios were identified: first, a highly dynamic situation with visible surface diverging rings of waves, entrainment in the lower part of the gas column, detrainment in the upper part, and a stagnation line (SL) at mid depth where currents were close to zero and most of the gas bubbles spread laterally; second, a less dynamic situation with water entraining into the gas plume at all depths and no surface rings of diverging waves. These different dynamics may be ascribed to changes in gas fluxes (one order of magnitude higher in 2002). Describing the SL is important for quantifying its position in the water column and the timing of entrainment-detrainment, and it can be measured by ADCP and calculated from models.
Cooperation and charity in spatial public goods game under different strategy update rules
NASA Astrophysics Data System (ADS)
Li, Yixiao; Jin, Xiaogang; Su, Xianchuang; Kong, Fansheng; Peng, Chengbin
2010-03-01
Human cooperation can be influenced by other human behaviors, and recent years have witnessed a flourishing of studies on the coevolution of cooperation and punishment, yet the common behavior of charity is seldom considered in game-theoretical models. In this article, we investigate the coevolution of altruistic cooperation and egalitarian charity in the spatial public goods game, by considering charity as the behavior of reducing inter-individual payoff differences. In our model, in each generation of the evolution, individuals play games first and accumulate payoff benefits, and then each egalitarian makes a charity donation by payoff transfer in its neighborhood. To study the individual-level evolutionary dynamics, we adopt different strategy update rules and investigate their effects on charity and cooperation. These rules can be classified into two global rules: the random selection rule, in which individuals randomly update strategies, and the threshold selection rule, where only those with payoffs below a threshold update strategies. Simulation results show that random selection enhances the cooperation level, while threshold selection lowers the threshold of the multiplication factor needed to maintain cooperation. When charity is considered, it is incapable of promoting cooperation under random selection, whereas it promotes cooperation under threshold selection. Interestingly, the evolution of charity strongly depends on the dispersion of payoff acquisitions of the population, which agrees with previous results. Our work may shed light on understanding human egalitarianism.
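The egalitarian charity rule described above, donation by payoff transfer within a neighborhood, can be illustrated with a minimal sketch. The function name, the `share` parameter, and the dictionary-based lattice representation are illustrative assumptions rather than the paper's exact specification:

```python
def charity_step(payoffs, neighbors, egalitarians, share=0.25):
    """One charity round: every egalitarian donates part of its payoff
    advantage to each poorer neighbour, shrinking local payoff differences.
    Gaps are computed from the pre-charity payoffs, so donations are
    simultaneous; total payoff in the population is conserved."""
    new = dict(payoffs)
    for donor in egalitarians:
        for nb in neighbors[donor]:
            gap = payoffs[donor] - payoffs[nb]
            if gap > 0:
                gift = share * gap       # transfer a fraction of the gap
                new[donor] -= gift
                new[nb] += gift
    return new

# Two neighbouring players; player 0 is an egalitarian with a payoff lead.
print(charity_step({0: 4.0, 1: 0.0}, {0: [1], 1: [0]}, egalitarians=[0]))
```

In a full simulation this step would run between the game-playing phase and the strategy-update phase of each generation.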
Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A
2017-09-15
Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy-preserving classification and that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of evaporative cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, the reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder.
Code available at http://insilico.utulsa.edu/software/privateEC. Contact: brett-mckinney@utulsa.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
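The reusable holdout mentioned in the abstract above refers to the thresholdout mechanism of Dwork et al.; its core query step can be sketched in a few lines. The threshold and noise scale below are illustrative values, not those used in the paper:

```python
import random

def thresholdout(train_val, holdout_val, threshold=0.1, sigma=0.01, rng=random):
    """One query to a reusable holdout (simplified sketch).

    If the training and holdout estimates of a statistic agree to within
    a noisy threshold, echo the training estimate (the holdout leaks no
    information); otherwise release a noise-perturbed holdout value."""
    if abs(train_val - holdout_val) < threshold + rng.gauss(0, sigma):
        return train_val
    return holdout_val + rng.gauss(0, sigma)

# Train/holdout accuracies agree: the training estimate is returned as-is.
print(thresholdout(0.81, 0.80, rng=random.Random(0)))
# Large train/holdout gap (overfitting signal): a noised holdout value.
print(thresholdout(0.95, 0.70, rng=random.Random(1)))
```

Because each answer is either the training value or a noised holdout value, an analyst can reuse the holdout across many adaptive queries without overfitting to it.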
Martinez, J C; Caprio, M A
2016-03-27
Recent detection of western corn rootworm resistance to Bt (Bacillus thuringiensis) corn prompted recommendations for the use of integrated pest management (IPM) with planting refuges to prolong the durability of Bt technologies. We conducted a simulation experiment exploring the effectiveness of various IPM tools at extending the durability of pyramided Bt traits. Results indicate that some IPM practices have greater merits than others. Crop rotation was the most effective strategy, followed by increasing the non-Bt refuge size from 5 to 20%. Soil-applied insecticide use for Bt corn did not increase the durability compared with planting Bt with refuges alone, and both projected lower durabilities. When IPM participation with randomly selected management tools was increased at the time of Bt commercialization, durability of pyramided traits increased as well. When non-corn rootworm expressing corn was incorporated as an IPM option, the durability further increased. For corn rootworm, a local resistance phenomenon appeared immediately surrounding the resistant field (hotspot) and spread throughout the local neighborhood in six generations in the absence of mitigation. Hotspot mitigation with random selection of strategies was ineffective at slowing resistance unless crop rotation occurred immediately; regional mitigation was superior to random mitigation in the hotspot and reduced observed resistance allele frequencies in the neighborhood. As resistance alleles of mobile pests can escape hotspots, the scope of mitigation should extend beyond resistant sites. In the case of widespread resistance, regional mitigation was less effective at prolonging the life of the pyramid than IPM with Bt deployment at the time of commercialization. Published by Oxford University Press on behalf of Entomological Society of America 2016. This work is written by US Government employees and is in the public domain in the United States.
Nest-site selection and success of mottled ducks on agricultural lands in southwest Louisiana
Durham, R.S.; Afton, A.D.
2003-01-01
Listing of the mottled duck (Anas fulvigula maculosa) as a priority species in the Gulf Coast Joint Venture of the North American Waterfowl Management Plan, coupled with recent declines of rice (Oryza sativa) acreage, led us to investigate the nesting ecology of this species on agricultural lands in southwest Louisiana. We examined nest-site selection at macro- and microhabitat levels, nest success, causes of nest failures, and habitat features influencing nest success. We found that female mottled ducks preferred to nest in permanent pastures with knolls (53% of nests) and idle fields (22% of nests). Vegetation height was greater at nests than at random points within the same macrohabitat patch. Successful nests were associated with greater numbers of plant species, located farther from water, and associated with higher vegetation density values than were unsuccessful nests. We determined that mammalian predators caused most nest failures (77% of 52 unsuccessful nests). Our results suggest that nest success of mottled ducks on agricultural lands in southwest Louisiana could be improved by 1) locating large permanent pastures and idle fields near rice fields and other available wetlands, 2) managing plant communities in these upland areas to favor dense stands of perennial bunch grasses, tall composites, dewberry (Rubus trivialis), and other native grasses and forbs, and 3) managing cattle-stocking rates and the duration and timing of grazing to promote tall, dense stands of these plant taxa during the nesting season (March-June).
Li, Xiao-Zhou; Li, Song-Sui; Zhuang, Jun-Ping; Chan, Sze-Chun
2015-09-01
A semiconductor laser with distributed feedback from a fiber Bragg grating (FBG) is investigated for random bit generation (RBG). The feedback perturbs the laser to emit chaotically with the intensity being sampled periodically. The samples are then converted into random bits by a simple postprocessing of self-differencing and selecting bits. Unlike a conventional mirror that provides localized feedback, the FBG provides distributed feedback which effectively suppresses the information of the round-trip feedback delay time. Randomness is ensured even when the sampling period is commensurate with the feedback delay between the laser and the grating. Consequently, in RBG, the FBG feedback enables continuous tuning of the output bit rate, reduces the minimum sampling period, and increases the number of bits selected per sample. RBG is experimentally investigated at a sampling period continuously tunable from over 16 ns down to 50 ps, while the feedback delay is fixed at 7.7 ns. By selecting 5 least-significant bits per sample, output bit rates from 0.3 to 100 Gbps are achieved with randomness examined by the National Institute of Standards and Technology test suite.
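The postprocessing chain described above, periodic sampling, self-differencing, and least-significant-bit selection, can be sketched as follows for 8-bit samples; the function name and bit ordering are illustrative assumptions:

```python
def samples_to_bits(samples, n_lsb=5, bit_depth=8):
    """Turn periodically sampled chaotic-intensity values into random bits.

    Self-differencing: subtract each sample from its successor modulo the
    ADC range, which whitens slow drifts; then keep only the n_lsb least
    significant bits of each difference, where chaotic fluctuations and
    digitization noise dominate."""
    mask = (1 << n_lsb) - 1
    modulus = 1 << bit_depth
    bits = []
    for prev, cur in zip(samples, samples[1:]):
        diff = (cur - prev) % modulus      # self-differencing step
        lsbs = diff & mask                 # select n_lsb least significant bits
        bits.extend((lsbs >> i) & 1 for i in reversed(range(n_lsb)))
    return bits

# Four 8-bit samples -> three differences -> 15 bits at 5 bits per sample.
print(samples_to_bits([200, 77, 140, 3]))
```

At a 50 ps sampling period, 5 bits per sample gives the 100 Gbps output rate quoted in the abstract.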
Computational Study of the Blood Flow in Three Types of 3D Hollow Fiber Membrane Bundles
Zhang, Jiafeng; Chen, Xiaobing; Ding, Jun; Fraser, Katharine H.; Ertan Taskin, M.; Griffith, Bartley P.; Wu, Zhongjun J.
2013-01-01
The goal of this study is to develop a computational fluid dynamics (CFD) modeling approach to better estimate the blood flow dynamics in the bundles of hollow fiber membrane based medical devices (i.e., blood oxygenators, artificial lungs, and hemodialyzers). Three representative types of arrays, square, diagonal, and random, with a porosity value of 0.55, were studied. In addition, a 3D array with the same porosity was studied. The flow fields between the individual fibers in these arrays at selected Reynolds numbers (Re) were simulated with CFD modeling. Hemolysis is not significant in the fiber bundles, but platelet activation may be essential. For each type of array, the average wall shear stress is linearly proportional to the Re. For the same Re but different arrays, the average wall shear stress also exhibits a linear dependency on the pressure difference across arrays, while Darcy's law prescribes a power-law relationship, therefore underestimating the shear stress level. For the same Re, the average wall shear stress of the diagonal array is approximately 3.1, 1.8, and 2.0 times larger than that of the square, random, and 3D arrays, respectively. A coefficient C is suggested to correlate the CFD-predicted data with the analytical solution, and C is 1.16, 1.51, and 2.05 for the square, random, and diagonal arrays in this paper, respectively. It is worth noting that C is strongly dependent on the array geometrical properties, whereas it is weakly dependent on the flow field. Additionally, the 3D fiber bundle simulation results show that the three-dimensional effect is not negligible. Specifically, velocity and shear stress distribution can vary significantly along the fiber axial direction. PMID:24141394
Development of machine learning models for diagnosis of glaucoma.
Kim, Seong Jae; Cho, Kyong Jin; Oh, Sejong
2017-01-01
The study aimed to develop machine learning models with strong prediction power and interpretability for the diagnosis of glaucoma based on retinal nerve fiber layer (RNFL) thickness and visual field (VF). We collected various candidate features from examinations of RNFL thickness and VF, and also developed synthesized features from the original features. We then selected the best features for classification (diagnosis) through feature evaluation. We used 100 cases of data as a test dataset and 399 cases of data as a training and validation dataset. To develop the glaucoma prediction model, we considered four machine learning algorithms: C5.0, random forest (RF), support vector machine (SVM), and k-nearest neighbor (KNN). We repeatedly composed a learning model using the training dataset and evaluated it using the validation dataset, and finally obtained the learning model that produced the highest validation accuracy. We analyzed the quality of the models using several measures. The random forest model showed the best performance, while the C5.0, SVM, and KNN models showed similar accuracy. In the random forest model, the classification accuracy was 0.98, sensitivity was 0.983, specificity was 0.975, and AUC was 0.979. The developed prediction models show high accuracy, sensitivity, specificity, and AUC in classifying between glaucomatous and healthy eyes, and can be used to predict glaucoma from unseen examination records. Clinicians may reference the prediction results to make better decisions, and multiple learning models may be combined to increase prediction accuracy. The C5.0 model includes decision rules for prediction and can be used to explain the reasons for specific predictions.
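Of the four algorithms compared, k-nearest neighbor is the simplest to sketch. A minimal version over feature vectors (such as RNFL thickness and VF measures) might look like the following; the function name, the toy data, and the choice of Euclidean distance are illustrative assumptions:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Minimal k-nearest-neighbour classifier. `train` is a list of
    (feature_vector, label) pairs; the k closest training points by
    Euclidean distance vote on the label of `query`."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-feature examples standing in for (RNFL thickness, VF index) pairs.
train = [((0.0, 0.0), "healthy"), ((0.0, 1.0), "healthy"),
         ((5.0, 5.0), "glaucoma"), ((5.0, 6.0), "glaucoma"),
         ((6.0, 5.0), "glaucoma")]
print(knn_predict(train, (5.2, 5.1)))  # majority of 3 nearest: "glaucoma"
```

Unlike C5.0's decision rules, KNN offers no explicit explanation of its predictions, which is the interpretability trade-off the abstract alludes to.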
Taradaj, Jakub; Ozon, Marcin; Dymarek, Robert; Bolach, Bartosz; Walewicz, Karolina; Rosińczuk, Joanna
2018-03-23
Interdisciplinary physical therapy together with pharmacological treatment constitute conservative treatment strategies for low back pain (LBP). There is still a lack of high-quality studies aimed at an objective evaluation of physiotherapeutic procedures according to their effectiveness in LBP. The aim of this study is to carry out a prospective, randomized, single-blinded, and placebo-controlled clinical trial to evaluate the effectiveness of magnetic fields in discopathy-related LBP. A group of 177 patients was assessed for eligibility based on inclusion and exclusion criteria. In the end, 106 patients were randomly assigned into 5 comparative groups: A (n = 23; magnetic therapy: 10 mT, 50 Hz); B (n = 23; magnetic therapy: 5 mT, 50 Hz); C (n = 20; placebo magnetic therapy); D (n = 20; magnetic stimulation: 49.2 μT, 195 Hz); and E (n = 20; placebo magnetic stimulation). All patients were assessed using tests for pain intensity, degree of disability, and range of motion. Postural stability was also assessed using a stabilographic platform. In this study, positive changes in all clinical outcomes were demonstrated in group A (p < 0.05). The clearest clinical effect was observed for pain reduction (p < 0.05), improvement of the range of motion (p < 0.05), and functional ability of the spine (p < 0.05). It is also worth noting that the effects in the majority of the measured indicators were mostly short-term (p > 0.05). It was determined that the application of magnetic therapy (10 mT, 50 Hz, 20 min) significantly reduces pain symptoms and leads to an improvement of functional ability in patients with LBP.
Faggion, Clovis Mariano; Wu, Yun-Chun; Scheidgen, Moritz; Tu, Yu-Kang
2015-01-01
Background Risk of bias (ROB) may threaten the internal validity of a clinical trial by distorting the magnitude of treatment effect estimates, although some conflicting information on this assumption exists. Objective The objective of this study was to evaluate the effect of ROB on the magnitude of treatment effect estimates in randomized controlled trials (RCTs) in periodontology and implant dentistry. Methods A search for Cochrane systematic reviews (SRs), including meta-analyses of RCTs published in the periodontology and implant dentistry fields, was performed in the Cochrane Library in September 2014. Random-effect meta-analyses were performed by grouping RCTs with different levels of ROB in three domains (sequence generation, allocation concealment, and blinding of outcome assessment). To increase power and precision, only SRs with meta-analyses including at least 10 RCTs were included. Meta-regression was performed to investigate the association between ROB characteristics and the magnitudes of intervention effects in the meta-analyses. Results Of the 24 initially screened SRs, 21 were excluded because they did not include at least 10 RCTs in the meta-analyses. Three SRs (two from the periodontology field) generated information for conducting 27 meta-analyses. Meta-regression did not reveal significant differences in the relationship of the ROB level with the size of treatment effect estimates, although a trend for inflated estimates was observed in domains with unclear ROBs. Conclusion In this sample of RCTs, high and (mainly) unclear risks of selection and detection biases did not seem to influence the size of treatment effect estimates, although several confounders might have influenced the strength of the association. PMID:26422698
Vegada, Bhavisha; Shukla, Apexa; Khilnani, Ajeetkumar; Charan, Jaykaran; Desai, Chetna
2016-01-01
Most academic teachers use four or five options per item in multiple choice question (MCQ) tests for formative and summative assessment. The optimal number of options per MCQ item is a matter of considerable debate among academic teachers in various educational fields, and there is a scarcity of published literature regarding the optimal number of options in medical education. The aim was to compare three-option, four-option, and five-option MCQ tests on the quality parameters of reliability, validity, item analysis, distracter analysis, and time analysis. Participants were 3rd-semester M.B.B.S. students. Students were divided randomly into three groups, and each group was randomly given one test out of the three-option, four-option, and five-option sets. Following the marking of the multiple choice tests, the participants' option selections were analyzed, and the mean marks, mean time, validity, reliability, facility value, discrimination index, point biserial value, and distracter analysis of the three option formats were compared. Students scored higher (P = 0.000) and took less time (P = 0.009) to complete the three-option test compared with the four-option and five-option groups. Facility value was higher (P = 0.004) in the three-option group than in the four-option and five-option groups. There was no significant difference between the three groups in validity, reliability, or item discrimination. Nonfunctioning distracters were more common in the four-option and five-option groups than in the three-option group. Assessment based on three-option MCQs can therefore be preferred over four-option and five-option MCQs.
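The facility value and discrimination index compared above are standard item-analysis statistics. A minimal sketch follows, using the conventional upper/lower 27% split, which is an assumption here rather than a detail taken from the study:

```python
def item_stats(scores):
    """Facility value and discrimination index for one MCQ item.

    `scores` pairs each examinee's total test score with 1 or 0 for
    whether this item was answered correctly. Facility is the overall
    proportion correct; the discrimination index contrasts item
    performance in the top and bottom 27% of examinees ranked by total
    score."""
    ranked = sorted(scores, key=lambda s: s[0], reverse=True)
    n = max(1, round(0.27 * len(ranked)))
    upper = sum(correct for _, correct in ranked[:n])
    lower = sum(correct for _, correct in ranked[-n:])
    facility = sum(correct for _, correct in scores) / len(scores)
    discrimination = (upper - lower) / n
    return facility, discrimination

# Six examinees: the three strongest all answered the item correctly.
print(item_stats([(90, 1), (85, 1), (80, 1), (40, 0), (35, 0), (30, 0)]))  # → (0.5, 1.0)
```

A "nonfunctioning distracter" in this framework is simply an option chosen by fewer than some small fraction (commonly 5%) of examinees.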
NASA Astrophysics Data System (ADS)
Fedrigo, Melissa; Newnham, Glenn J.; Coops, Nicholas C.; Culvenor, Darius S.; Bolton, Douglas K.; Nitschke, Craig R.
2018-02-01
Light detection and ranging (lidar) data have been increasingly used for forest classification due to their ability to penetrate the forest canopy and provide detail about the structure of the lower strata. In this study we demonstrate forest classification approaches using airborne lidar data as inputs to random forest and linear unmixing classification algorithms. Our results demonstrated that both random forest and linear unmixing models identified a distribution of rainforest and eucalypt stands that was comparable to existing ecological vegetation class (EVC) maps based primarily on manual interpretation of high resolution aerial imagery. Rainforest stands were also identified in the region that had not previously been identified in the EVC maps. The transition between stand types was better characterised by the random forest modelling approach. In contrast, the linear unmixing model placed greater emphasis on field plots selected as endmembers, which may not have captured the variability in stand structure within a single stand type. The random forest model had the highest overall accuracy (84%) and Cohen's kappa coefficient (0.62), although its classification accuracy was only marginally better than that of linear unmixing. The random forest model was applied to a region in the Central Highlands of south-eastern Australia to produce maps of stand type probability, including areas of transition (the 'ecotone') between rainforest and eucalypt forest. The resulting map provided a detailed delineation of forest classes, which specifically recognised the coalescing of stand types at the landscape scale. This represents a key step towards mapping the structural and spatial complexity of these ecosystems, which is important for both their management and conservation.
Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment
NASA Astrophysics Data System (ADS)
Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit
2010-10-01
The purpose of the paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness with predefined maximum risk tolerance and minimum expected return. Here the security returns in the objectives and constraints are assumed to be fuzzy random variables, and the vagueness of the fuzzy random variables in the objectives and constraints is transformed into fuzzy variables similar to trapezoidal numbers. The newly formed fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.
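A crisp (non-fuzzy) miniature of the mean-semivariance-skewness idea can clarify the objective and constraints: maximize skewness subject to a cap on downside semivariance and a floor on expected return. This grid search over portfolio weights is an illustrative sketch under crisp scenario returns, not the paper's fuzzy random formulation:

```python
from itertools import product

def mean(xs):
    return sum(xs) / len(xs)

def semivariance(xs):
    """Downside risk: average squared shortfall below the mean."""
    m = mean(xs)
    return sum((m - x) ** 2 for x in xs if x < m) / len(xs)

def skewness(xs):
    """Third standardized central moment (0 for a constant series)."""
    m = mean(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    if var == 0:
        return 0.0
    return sum((x - m) ** 3 for x in xs) / (len(xs) * var ** 1.5)

def best_portfolio(assets, max_semivar, min_return, step=0.1):
    """Grid-search weights maximizing skewness under both constraints.
    `assets` lists each asset's return in a set of scenarios."""
    best, best_skew = None, float("-inf")
    grid = [i * step for i in range(int(1 / step) + 1)]
    for w in product(grid, repeat=len(assets)):
        if abs(sum(w) - 1.0) > 1e-9:          # weights must sum to 1
            continue
        port = [sum(wi * r for wi, r in zip(w, scenario))
                for scenario in zip(*assets)]  # portfolio return per scenario
        if semivariance(port) <= max_semivar and mean(port) >= min_return:
            s = skewness(port)
            if s > best_skew:
                best, best_skew = w, s
    return best, best_skew

# Two assets over three scenarios: a flat 10% return and a right-skewed one.
w, s = best_portfolio([[0.10, 0.10, 0.10], [-0.10, 0.00, 0.40]],
                      max_semivar=0.02, min_return=0.09)
print(w, round(s, 3))
```

In the paper the scenario returns would instead be fuzzy random variables, and the moments would be computed from the derived trapezoidal fuzzy variables before the model is made deterministic.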
Vrijheid, Martine; Deltour, Isabelle; Krewski, Daniel; Sanchez, Marie; Cardis, Elisabeth
2006-07-01
This paper examines the effects of systematic and random errors in recall and of selection bias in case-control studies of mobile phone use and cancer. These sensitivity analyses are based on Monte-Carlo computer simulations and were carried out within the INTERPHONE Study, an international collaborative case-control study in 13 countries. Recall error scenarios simulated plausible values of random and systematic, non-differential and differential recall errors in amount of mobile phone use reported by study subjects. Plausible values for the recall error were obtained from validation studies. Selection bias scenarios assumed varying selection probabilities for cases and controls, mobile phone users, and non-users. Where possible these selection probabilities were based on existing information from non-respondents in INTERPHONE. Simulations used exposure distributions based on existing INTERPHONE data and assumed varying levels of the true risk of brain cancer related to mobile phone use. Results suggest that random recall errors of plausible levels can lead to a large underestimation in the risk of brain cancer associated with mobile phone use. Random errors were found to have larger impact than plausible systematic errors. Differential errors in recall had very little additional impact in the presence of large random errors. Selection bias resulting from underselection of unexposed controls led to J-shaped exposure-response patterns, with risk apparently decreasing at low to moderate exposure levels. The present results, in conjunction with those of the validation studies conducted within the INTERPHONE study, will play an important role in the interpretation of existing and future case-control studies of mobile phone use and cancer risk, including the INTERPHONE study.
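The attenuating effect of non-differential random recall error can be illustrated with a small Monte-Carlo sketch. The prevalences, baseline odds, and misreport probability below are arbitrary illustrative values, not INTERPHONE parameters:

```python
import random

def observed_odds_ratio(n=200000, true_or=2.0, misreport=0.0, seed=7):
    """Monte-Carlo sketch of non-differential recall error.

    True exposure multiplies disease odds by `true_or`; each subject then
    misreports exposure status with probability `misreport`, independently
    of disease status (non-differential error). Returns the odds ratio
    computed from the *reported* exposure."""
    rng = random.Random(seed)
    cells = {(True, True): 0, (True, False): 0,
             (False, True): 0, (False, False): 0}
    for _ in range(n):
        exposed = rng.random() < 0.5              # true exposure, 50% prevalence
        odds = 0.10 * (true_or if exposed else 1.0)
        case = rng.random() < odds / (1.0 + odds)
        reported = (not exposed) if rng.random() < misreport else exposed
        cells[(case, reported)] += 1
    a, b = cells[(True, True)], cells[(True, False)]
    c, d = cells[(False, True)], cells[(False, False)]
    return (a * d) / (b * c)

print(observed_odds_ratio(misreport=0.0))   # close to the true OR of 2
print(observed_odds_ratio(misreport=0.3))   # attenuated towards 1
```

The same machinery extends to the selection-bias scenarios by sampling cases and controls into the study with different probabilities depending on reported exposure.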
Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method
NASA Astrophysics Data System (ADS)
Shamsoddini, A.; Aboodi, M. R.; Karami, J.
2017-09-01
Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of the Random Forest feature selection method in combination with multiple-linear regression and Multilayer Perceptron Artificial Neural Networks, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide, and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
Good, Andrew C; Hermsmeier, Mark A
2007-01-01
Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts often come at the expense of the data set selection and analysis used to validate those algorithms. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test set creation and using test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition, the issues inherent in applying unfiltered data and random test set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.
Relativistic diffusive motion in random electromagnetic fields
NASA Astrophysics Data System (ADS)
Haba, Z.
2011-08-01
We show that the relativistic dynamics in a Gaussian random electromagnetic field can be approximated by the relativistic diffusion of Schay and Dudley. Lorentz-invariant dynamics in the proper time leads to diffusion in the proper time. The dynamics in the laboratory time gives the diffusive transport equation corresponding to the Jüttner equilibrium at the inverse temperature β⁻¹ = mc². The diffusion constant is expressed by the field-strength correlation function (Kubo's formula).
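The closing statement refers to a Kubo-type relation: the diffusion constant is set by the time-integrated autocorrelation of the field strength along the trajectory. Schematically, with generic symbols and normalization that are assumptions here rather than the paper's notation:

```latex
D \;\propto\; \int_{0}^{\infty}
    \big\langle F_{\mu\nu}\big(x(\tau)\big)\, F^{\mu\nu}\big(x(0)\big) \big\rangle \, d\tau
```

Here \(F_{\mu\nu}\) is the electromagnetic field-strength tensor, \(x(\tau)\) the particle trajectory parametrized by proper time, and the angle brackets denote the average over the Gaussian random field.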
Ventura, Dora F; Costa, Marcelo T V; Costa, Marcelo F; Berezovsky, Adriana; Salomão, Solange R; Simões, Ana Luíza; Lago, Marcos; Pereira, Luiz H M Canto; Faria, Marcília A M; De Souza, John M; Silveira, Luiz Carlos L
2004-01-01
We evaluated the color vision of mercury-contaminated patients and investigated possible retinal origins of losses using electroretinography. Participants were retired workers from a fluorescent lamp industry diagnosed with mercury contamination (n = 43) and age-matched controls (n = 21). Color discrimination was assessed with the Cambridge Colour Test (CCT). Retinal function was evaluated by using the ISCEV protocol for full-field electroretinography (full-field ERG), as well as by means of multifocal electroretinography (mfERG). Color-vision losses assessed by the CCT consisted of higher color-discrimination thresholds along the protan, deutan, and tritan axes and significantly larger discrimination ellipses in mercury-exposed patients compared to controls. Full-field ERG amplitudes from patients were smaller than those of the controls for the scotopic response b-wave, maximum response, sum of oscillatory potentials (OPs), 30-Hz flicker response, and light-adapted cone response. OP amplitudes measured in patients were smaller than those of controls for O2 and O3. Multifocal ERGs recorded from ten randomly selected patients showed smaller N1-P1 amplitudes and longer latencies throughout the 25-deg central field. Full-field ERGs showed that scotopic, photopic, peripheral, and midperipheral retinal functions were affected, and the mfERGs indicated that central retinal function was also significantly depressed. To our knowledge, this is the first demonstration of retinal involvement in visual losses caused by mercury toxicity.
Patterning of polymer nanofiber meshes by electrospinning for biomedical applications
Neves, Nuno M; Campos, Rui; Pedro, Adriano; Cunha, José; Macedo, Francisco; Reis, Rui L
2007-01-01
The end-product of the electrospinning process is typically a randomly aligned fiber mesh or membrane. This is a result of the electric field generated between the drop of polymer solution at the needle and the collector: the developed electric field causes the stretching of the fibers and their random deposition. By judicious selection of the collector architecture, it is thus possible to develop other morphologies in the nanofiber meshes. The aim of this work is to prepare fiber meshes using various patterned collectors with specific dimensions and designs and to evaluate how those patterns can affect the properties of the meshes relevant to biomedical applications. This study aims to verify whether it is possible to control the architecture of the fiber meshes by tailoring the geometry of the collector. Three different metallic collector topographies were used to test this hypothesis. Electrospun nonwoven patterned meshes of polyethylene oxide (PEO) and poly(ε-caprolactone) (PCL) were successfully prepared. These fiber meshes were analyzed by scanning electron microscopy (SEM). Both mechanical testing of the meshes and cell-contact experiments were performed to assess the effect of the produced patterns on the properties of the meshes relevant for biomedical applications. The present study evaluates cell adhesion sensitivity to the generated patterns and the effect of those patterns on the tensile properties of the fiber meshes. PMID:18019842
The Effects of Social Capital Levels in Elementary Schools on Organizational Information Sharing
ERIC Educational Resources Information Center
Ekinci, Abdurrahman
2012-01-01
This study aims to assess the effects of social capital levels at elementary schools on organizational information sharing as reported by teachers. Participants were 267 teachers selected randomly from 16 elementary schools; the schools themselves were selected randomly from among the 42 elementary schools located in the city center of Batman. The data were analyzed by…
ERIC Educational Resources Information Center
Rafferty, Karen; Watson, Patrice; Lappe, Joan M.
2011-01-01
Objective: To assess the impact of calcium-fortified food and dairy food on selected nutrient intakes in the diets of adolescent girls. Design: Randomized controlled trial, secondary analysis. Setting and Participants: Adolescent girls (n = 149) from a midwestern metropolitan area participated in randomized controlled trials of bone physiology…
ERIC Educational Resources Information Center
Thomas, Henry B.; Kaplan, E. Joseph
A national survey was conducted of randomly selected chief student personnel officers as listed in the 1979 "Education Directory of Colleges and Universities." The survey addressed specific institutional demographics, policy-making authority, reporting structure, and areas of responsibility of the administrators. Over 93 percent of the respondents…
Nonmanufacturing Businesses. U.S. Metric Study Interim Report.
ERIC Educational Resources Information Center
Cornog, June R.; Bunten, Elaine D.
This fifth interim report stems from the U.S. Metric Study on the feasibility of a United States changeover to a metric system. A primary stratified sample of 2,828 nonmanufacturing firms was randomly selected from 28,184 businesses taken from Social Security files, and a secondary sample of 2,258 firms was randomly selected for replacement…
ERIC Educational Resources Information Center
Juhasz, Stephen; And Others
Table of contents (TOC) practices of some 120 primary journals were analyzed. The journals were randomly selected, and the method of randomization is described. The samples were selected from a university library with holdings of approximately 12,000 titles published worldwide. A questionnaire was designed; its purpose was to find uniformity and…
Molecular selection in a unified evolutionary sequence
NASA Technical Reports Server (NTRS)
Fox, S. W.
1986-01-01
With guidance from experiments and observations that indicate internally limited phenomena, an outline of a unified evolutionary sequence is inferred. Such unification is not visible in a context of random matrix and random mutation. The sequence proceeds from the Big Bang through prebiotic matter and protocells, through the evolving cell via molecular and natural selection, to mind, behavior, and society.
Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2008-01-01
Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…
ERIC Educational Resources Information Center
Bibiso, Abyot; Olango, Menna; Bibiso, Mesfin
2017-01-01
The purpose of this study was to investigate the relationship between teachers' commitment and female students' academic achievement in selected secondary schools of Wolaita zone, Southern Ethiopia. The research method employed was a survey study, and the sampling techniques were purposive, simple random, and stratified random sampling. Questionnaire…
ERIC Educational Resources Information Center
Martinez, John; Fraker, Thomas; Manno, Michelle; Baird, Peter; Mamun, Arif; O'Day, Bonnie; Rangarajan, Anu; Wittenburg, David
2010-01-01
This report focuses on the seven original Youth Transition Demonstration (YTD) projects selected for funding in 2003. Three of the original seven projects were selected for a national random assignment evaluation in 2005; however, this report only focuses on program operations prior to joining the random assignment evaluation for the three…
Dai, Huanping; Micheyl, Christophe
2010-01-01
A major concern when designing a psychophysical experiment is that participants may use another stimulus feature (“cue”) than that intended by the experimenter. One way to avoid this involves applying random variations to the corresponding feature across stimulus presentations, to make the “unwanted” cue unreliable. An important question facing experimenters who use this randomization (“roving”) technique is: How large should the randomization range be to ensure that participants cannot achieve a certain proportion correct (PC) by using the unwanted cue, while at the same time avoiding unnecessary interference of the randomization with task performance? Previous publications have provided formulas for the selection of adequate randomization ranges in yes-no and multiple-alternative, forced-choice tasks. In this article, we provide figures and tables, which can be used to select randomization ranges that are better suited to experiments involving a same-different, dual-pair, or oddity task. PMID:20139466
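The question the published tables answer, namely how large a randomization range is needed before the unwanted cue becomes useless, can also be explored numerically. Below is a toy Monte Carlo sketch (not the authors' closed-form method) of a same-different task in which a hypothetical observer relies only on the roved cue:

```python
import random

def percent_correct(rove_range, delta, n_trials=20000, criterion=None, seed=1):
    """Monte Carlo estimate of proportion correct for an observer who
    responds 'different' when the unwanted cue differs by more than a
    criterion.  The cue is roved uniformly over [-rove_range/2, +rove_range/2]
    independently on each stimulus presentation."""
    rng = random.Random(seed)
    if criterion is None:
        criterion = delta / 2          # simple fixed criterion, for illustration
    correct = 0
    for t in range(n_trials):
        different = t % 2 == 1         # half 'same', half 'different' trials
        x1 = rng.uniform(-rove_range / 2, rove_range / 2)
        x2 = rng.uniform(-rove_range / 2, rove_range / 2)
        if different:
            x2 += delta                # cue co-varies with the stimulus change
        response_different = abs(x1 - x2) > criterion
        if response_different == different:
            correct += 1
    return correct / n_trials
```

As the rove range grows relative to the stimulus difference, the achievable proportion correct falls toward chance (0.5), which is the trade-off the paper's figures and tables quantify analytically for same-different, dual-pair, and oddity tasks.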
Continuous-variable quantum authentication of physical unclonable keys
NASA Astrophysics Data System (ADS)
Nikolopoulos, Georgios M.; Diamanti, Eleni
2017-04-01
We propose a scheme for authentication of physical keys that are materialized by optical multiple-scattering media. The authentication relies on the optical response of the key when probed by randomly selected coherent states of light, and the use of standard wavefront-shaping techniques that direct the scattered photons coherently to a specific target mode at the output. The quadratures of the electromagnetic field of the scattered light at the target mode are analysed using a homodyne detection scheme, and the acceptance or rejection of the key is decided upon the outcomes of the measurements. The proposed scheme can be implemented with current technology and offers collision resistance and robustness against key cloning.
Al-Fatlawi, Ali H; Fatlawi, Hayder K; Sai Ho Ling
2017-07-01
Daily physical activity monitoring benefits the health care field in several ways, in particular with the development of wearable sensors. This paper adopts effective ways to calculate the optimal number of necessary sensors and to build a reliable, high-accuracy monitoring system. Three data mining algorithms, namely Decision Tree, Random Forest, and the PART algorithm, were applied for the sensor selection process. Furthermore, a deep belief network (DBN) was investigated to recognise 33 physical activities effectively. The results indicated that the proposed method is reliable, with an overall accuracy of 96.52%, and that the number of sensors is minimised from nine to six.
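The sensor-selection idea, ranking each sensor by how well it separates the activity classes in a tree-style split, can be sketched with a single-split (decision-stump) score. This is a simplified stand-in for illustration, not the authors' actual Decision Tree/Random Forest/PART pipeline:

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def stump_score(values, labels):
    """Best impurity decrease achievable by a single threshold on one sensor."""
    base = gini(labels)
    order = sorted(range(len(values)), key=lambda i: values[i])
    best = 0.0
    for k in range(1, len(order)):
        if values[order[k - 1]] == values[order[k]]:
            continue  # no threshold fits between equal values
        left = [labels[i] for i in order[:k]]
        right = [labels[i] for i in order[k:]]
        split = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        best = max(best, base - split)
    return best

def rank_sensors(samples, labels):
    """samples: list of per-observation sensor readings (one tuple per
    observation).  Returns sensor indices, most informative first."""
    n_sensors = len(samples[0])
    scores = [stump_score([s[j] for s in samples], labels)
              for j in range(n_sensors)]
    return sorted(range(n_sensors), key=lambda j: -scores[j])
```

`rank_sensors` returns sensor indices ordered from most to least informative; keeping only the top-ranked sensors mirrors the paper's reduction from nine sensors to six.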
Propagation of terahertz pulses in random media.
Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M
2004-02-15
We describe measurements of single-cycle terahertz pulse propagation in a random medium. The unique capabilities of terahertz time-domain spectroscopy permit the characterization of a multiply scattered field with unprecedented spatial and temporal resolution. With these results, we can develop a framework for understanding the statistics of broadband laser speckle. Also, the ability to extract information on the phase of the field opens up new possibilities for characterizing multiply scattered waves. We illustrate this with a simple example, which involves computing a time-windowed temporal correlation between fields measured at different spatial locations. This enables the identification of individual scattering events, and could lead to a new method for imaging in random media.
Random electric field instabilities of relaxor ferroelectrics
NASA Astrophysics Data System (ADS)
Arce-Gamboa, José R.; Guzmán-Verri, Gian G.
2017-06-01
Relaxor ferroelectrics are complex oxide materials that offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition. We show, within a microscopic model and a statistical mechanical solution, that even weak compositional disorder can prohibit the development of long-range order, and that a random field state with anisotropic, power-law correlations of polarization emerges from the combined effect of the characteristic dipole forces and the inherent charge disorder. We compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.
Random crystal field effects on the integer and half-integer mixed-spin system
NASA Astrophysics Data System (ADS)
Yigit, Ali; Albayrak, Erhan
2018-05-01
In this work, we have focused on the random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system, obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(D_i) = p δ[D_i - D(1 + α)] + (1 - p) δ[D_i - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point, and three compensation temperatures for suitable values of the system parameters.
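The bimodal distribution used to randomize the crystal field is straightforward to sample. A minimal sketch (lattice structure and recursion relations omitted):

```python
import random

def sample_crystal_fields(n, D, alpha, p, seed=0):
    """Draw n site crystal fields D_i from the two-point distribution
    P(D_i) = p*delta[D_i - D(1+alpha)] + (1-p)*delta[D_i - D(1-alpha)]."""
    rng = random.Random(seed)
    strong, weak = D * (1 + alpha), D * (1 - alpha)
    return [strong if rng.random() < p else weak for _ in range(n)]
```

The sample mean converges to p·D(1+α) + (1−p)·D(1−α) = D[1 + α(2p−1)], so tuning p interpolates between the two crystal-field strengths, which is how the model's phase diagrams depend on p.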
Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard
2007-01-01
Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. 
Conclusion This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100
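The reasoned (non-random) first-stage choice of clusters can be caricatured as a small search: pick the combination of clusters whose pooled distribution of the chosen determinants best matches the city-wide distribution. A toy version with two determinant shares per cluster and equal cluster sizes assumed (invented data, not the survey's):

```python
from itertools import combinations

def best_cluster_set(clusters, population, k):
    """clusters: {name: (share_determinant_1, share_determinant_2)},
    e.g. shares of a given nationality and of literacy per cluster.
    population: city-wide shares of the same determinants.
    Returns the k-cluster combination whose pooled shares are closest
    (sum of absolute gaps) to the population shares."""
    def gap(combo):
        pooled = [sum(clusters[c][i] for c in combo) / k for i in (0, 1)]
        return sum(abs(pooled[i] - population[i]) for i in (0, 1))
    return min(combinations(sorted(clusters), k), key=gap)
```

With randomly selected clusters the pooled shares can drift far from the population's, especially when k is small; an explicit search like this guarantees the match the authors describe, at the cost of the first stage no longer being probabilistic.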
[Evaluation of using statistical methods in selected national medical journals].
Sych, Z
1996-01-01
The paper evaluates how frequently statistical methods were applied in works published in six selected national medical journals in the years 1988-1992: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, and Zdrowie Publiczne. From the respective volumes of Pol. Tyg. Lek., a number of works matching the average of the remaining journals was randomly selected. Works in which no statistical analysis was implemented were excluded, for both national and international publications, as were review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics. The number of works was determined for each volume. The mode of selecting the study sample in each work was then classified as either random or purposive selection, and the presence of a control sample was noted. Sample characterization was rated as complete, partial, or lacking. The results are presented in tables and figures (Tab. 1, 3). The rate of use of statistical methods was determined for the relevant volumes of the six journals for 1988-1992, along with the number of works in which no statistical methods were used, and the frequency of the individual statistical methods was analyzed.
Prominence was given to the fundamental methods of descriptive statistics (measures of position, measures of dispersion) and to the most important methods of mathematical statistics: parametric tests of significance, analysis of variance (one-way and two-way), non-parametric tests of significance, and correlation and regression. Works using multiple correlation, multiple regression, or more complex methods for studying relationships among two or more variables were counted under correlation and regression. Other methods included statistical methods used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods, and others. The study established that statistical methods were employed in 61.1-66.0% of the analyzed works in the six journals over 1988-1992 (Tab. 3), a frequency generally similar to that reported for English-language medical journals. No significant differences across the years 1988-1992 were found in the frequency of the statistical methods applied (Tab. 4) or in the frequency of random samples (Tab. 3). The most frequently used methods were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%), and parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and academic staff.
NASA Astrophysics Data System (ADS)
Zaim, N.; Zaim, A.; Kerouad, M.
2017-02-01
In this work, the magnetic behavior of a cylindrical nanowire, consisting of a ferromagnetic core of spin-1 atoms surrounded by a ferromagnetic shell of spin-1 atoms, is studied in the presence of a random crystal field interaction. Based on the Metropolis algorithm, Monte Carlo simulation has been used to investigate the effects of the concentration of the random crystal field p, the crystal field D, and the shell exchange interaction Js on the phase diagrams and the hysteresis behavior of the system. Some characteristic behaviors have been found, such as first- and second-order phase transitions joined by a tricritical point for appropriate values of the system parameters; triple and isolated critical points can also be found. Depending on the Hamiltonian parameters, single, double, and para hysteresis regions are explicitly determined.
3D vector distribution of the electro-magnetic fields on a random gold film
NASA Astrophysics Data System (ADS)
Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier
2018-05-01
The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Yurkin, Maxim A.
2017-01-01
Although the model of randomly oriented nonspherical particles has been used in a great variety of applications of far-field electromagnetic scattering, it has never been defined in strict mathematical terms. In this Letter we use the formalism of Euler rigid-body rotations to clarify the concept of statistically random particle orientations and derive its immediate corollaries in the form of most general mathematical properties of the orientation-averaged extinction and scattering matrices. Our results serve to provide a rigorous mathematical foundation for numerous publications in which the notion of randomly oriented particles and its light-scattering implications have been considered intuitively obvious.
Eni, Angela O; Efekemo, Oghenevwairhe P; Soluade, Mojisola G; Popoola, Segun I; Atayero, Aderemi A
2018-08-01
Cassava mosaic disease (CMD) is one of the most economically important viral diseases of cassava, a staple food for over 800 million people in the tropics. Although several Cassava mosaic virus species associated with CMD have been isolated and characterized over the years, several new super virulent strains of these viruses have evolved due to genetic recombination between diverse species. In this data article, field survey data collected from 184 cassava farms in 12 South Western and North Central States of Nigeria in 2015 are presented and extensively explored. In each State, one cassava farm was randomly selected as the first farm and subsequent farms were selected at 10 km intervals, except in locations where cassava farms were sporadically located. In each selected farm, 30 cassava plants were sampled along two diagonals, and each selected plant was scored for the presence or absence of CMD symptoms. Cassava mosaic disease incidence and associated whitefly vectors in South West and North Central Nigeria are explored using relevant descriptive statistics, box plots, bar charts, line graphs, and pie charts. In addition, correlation analysis, Analysis of Variance (ANOVA), and multiple comparison post-hoc tests are performed to understand the relationship between the numbers of whiteflies counted, uninfected farms, infected farms, and the mean symptom severity in and across the States under investigation. The data exploration provided in this data article is considered adequate for objective assessment of the incidence and symptom severity of cassava mosaic disease and associated whitefly vectors in farmers' fields in these parts of Nigeria where cassava is heavily cultivated.
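The farm-selection rule (a random first farm, then subsequent farms at 10 km intervals) is a form of systematic spatial sampling. A schematic sketch, assuming farms are listed with their position in km along the survey route (the survey's handling of sporadically located farms is not modeled):

```python
import random

def select_farms(route_km, interval=10.0, seed=0):
    """route_km: sorted positions (km) of candidate farms along the route.
    Picks a random first farm, then repeatedly takes the next farm lying
    at least `interval` km beyond the last selected one."""
    rng = random.Random(seed)
    start = rng.randrange(len(route_km))
    selected = [start]
    for i in range(start + 1, len(route_km)):
        if route_km[i] - route_km[selected[-1]] >= interval:
            selected.append(i)
    return selected
```

The random start removes the systematic bias of always beginning at the first farm encountered, while the fixed interval spreads the sample along the route.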
SDSS-IV MaNGA: Galaxy Pair Fraction and Correlated Active Galactic Nuclei
NASA Astrophysics Data System (ADS)
Fu, Hai; Steffen, Joshua L.; Gross, Arran C.; Dai, Y. Sophia; Isbell, Jacob W.; Lin, Lihwai; Wake, David; Xue, Rui; Bizyaev, Dmitry; Pan, Kaike
2018-04-01
We have identified 105 galaxy pairs at z ∼ 0.04 with the MaNGA integral-field spectroscopic data. The pairs have projected separations between 1 and 30 kpc, and are selected to have radial velocity offsets less than 600 km s‑1 and stellar mass ratio between 0.1 and 1. The pair fraction increases with both the physical size of the integral-field unit and the stellar mass, consistent with theoretical expectations. We provide the best-fit analytical function of the pair fraction and find that ∼3% of M* galaxies are in close pairs. For both isolated galaxies and paired galaxies, active galactic nuclei (AGNs) are selected using emission-line ratios and Hα equivalent widths measured inside apertures at a fixed physical size. We find AGNs in ∼24% of the paired galaxies and binary AGNs in ∼13% of the pairs. To account for the selection biases in both the pair sample and the MaNGA sample, we compare the AGN comoving volume densities with those expected from the mass- and redshift-dependent AGN fractions. We find a strong (∼5×) excess of binary AGNs over random pairing and a mild (∼20%) deficit of single AGNs. The binary AGN excess increases from ∼2× to ∼6× as the projected separation decreases from 10–30 to 1–10 kpc. Our results indicate that the pairing of galaxies preserves the AGN duty cycle in individual galaxies but increases the population of binary AGNs through correlated activities. We suggest tidally induced galactic-scale shocks and AGN cross-ionization as two plausible channels to produce low-luminosity narrow-line-selected binary AGNs.
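The "excess over random pairing" statistic can be illustrated with a toy calculation: if AGN activity were independent between pair members with a single per-galaxy AGN fraction f, a pair would host a binary AGN with probability f² and a single AGN with probability 2f(1−f). A minimal sketch (the paper itself uses mass- and redshift-dependent AGN fractions and volume densities, not one fixed f):

```python
def agn_pair_excess(f_agn, observed_single, observed_binary):
    """Compare observed single/binary AGN fractions among galaxy pairs with
    the expectation from independent (random) pairing, given a per-galaxy
    AGN fraction f_agn.  Returns (single_ratio, binary_ratio); a ratio of
    1.0 means no excess or deficit over random pairing."""
    expected_binary = f_agn ** 2
    expected_single = 2 * f_agn * (1 - f_agn)
    return observed_single / expected_single, observed_binary / expected_binary
```

A binary ratio well above 1 together with a single ratio near or below 1, as reported in the paper, indicates correlated AGN activity between pair members rather than a boosted duty cycle in each galaxy separately.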
NASA Astrophysics Data System (ADS)
Albeverio, Sergio; Tamura, Hiroshi
2018-04-01
We consider a model describing the coupling of a vector-valued and a scalar homogeneous Markovian random field over R4, interpreted as expressing the interaction between a charged scalar quantum field coupled with a nonlinear quantized electromagnetic field. Expectations of functionals of the random fields are expressed by Brownian bridges. Using this, together with Feynman-Kac-Itô type formulae and estimates on the small time and large time behaviour of Brownian functionals, we prove asymptotic upper and lower bounds on the kernel of the transition semigroup for our model. The upper bound gives faster than exponential decay for large distances of the corresponding resolvent (propagator).
Digital servo control of random sound fields
NASA Technical Reports Server (NTRS)
Nakich, R. B.
1973-01-01
It is necessary to place a number of sensors at different positions in the sound field to determine the actual sound intensities to which the test object is subjected, and thus whether the specification is being met adequately or exceeded. Since the excitation is random in nature, the signals are essentially coherent, and it is impossible to obtain a true average.
Random potentials and cosmological attractors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linde, Andrei, E-mail: alinde@stanford.edu
I show that the problem of realizing inflation in theories with random potentials of a limited number of fields can be solved, and agreement with the observational data can be naturally achieved if at least one of these fields has a non-minimal kinetic term of the type used in the theory of cosmological α-attractors.
Random phase approximation and cluster mean field studies of hard core Bose Hubbard model
NASA Astrophysics Data System (ADS)
Alavani, Bhargav K.; Gaude, Pallavi P.; Pai, Ramesh V.
2018-04-01
We investigate zero temperature and finite temperature properties of the Bose Hubbard Model in the hard core limit using Random Phase Approximation (RPA) and Cluster Mean Field Theory (CMFT). We show that our RPA calculations are able to capture quantum and thermal fluctuations significantly better than CMFT.
The influence of an uncertain force environment on reshaping trial-to-trial motor variability.
Izawa, Jun; Yoshioka, Toshinori; Osu, Rieko
2014-09-10
Motor memory is updated to generate ideal movements in a novel environment. When the environment changes every trial randomly, how does the brain incorporate this uncertainty into motor memory? To investigate how the brain adapts to an uncertain environment, we considered a reach adaptation protocol where individuals practiced moving in a force field where a noise was injected. After they had adapted, we measured the trial-to-trial variability in the temporal profiles of the produced hand force. We found that the motor variability was significantly magnified by the adaptation to the random force field. Temporal profiles of the motor variance were significantly dissociable between two different types of random force fields experienced. A model-based analysis suggests that the variability is generated by noise in the gains of the internal model. It further suggests that the trial-to-trial motor variability magnified by the adaptation in a random force field is generated by the uncertainty of the internal model formed in the brain as a result of the adaptation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shimizu, Y; Yoon, Y; Iwase, K
Purpose: We are trying to develop an image-searching technique to identify misfiled images in a picture archiving and communication system (PACS) server by using five biological fingerprints: the whole lung field, cardiac shadow, superior mediastinum, lung apex, and right lower lung. Each biological fingerprint in a chest radiograph includes distinctive anatomical structures that identify misfiled images. The whole lung field was less effective for evaluating the similarity between two images than the other biological fingerprints, mainly due to variation in the positioning for chest radiographs. The purpose of this study is to develop new biological fingerprints that could reduce the influence of positioning differences in chest radiography. Methods: Two hundred patients were selected randomly from our database (36,212 patients). These patients had two images each (current and previous); the current images were used as the misfiled images in this study. A circumscribed rectangular area of the lung and the upper half of that rectangle were selected automatically as new biological fingerprints. These biological fingerprints were matched to all previous images in the database, and the degrees of similarity between the two images were calculated for the same and different patients. The usefulness of the new biological fingerprints for automated patient recognition was examined in terms of receiver operating characteristic (ROC) analysis. Results: The areas under the ROC curves (AUCs) for the circumscribed rectangle of the lung, the upper half of the rectangle, and the whole lung field were 0.980, 0.994, and 0.950, respectively. The new biological fingerprints showed better performance in identifying the patients correctly than the whole lung field. Conclusion: We have developed new biological fingerprints, the circumscribed rectangle of the lung and the upper half of that rectangle. These new biological fingerprints would be useful for an automated patient identification system because they are less affected by positioning differences during imaging.
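The abstract does not specify how the degree of similarity between fingerprint regions is computed; a common choice for this kind of template comparison is normalized cross-correlation, sketched here as an illustrative stand-in:

```python
def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equal-size image patches
    (lists of pixel rows); 1.0 means identical up to brightness/contrast."""
    a = [x for row in patch_a for x in row]
    b = [x for row in patch_b for x in row]
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    den_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return num / (den_a * den_b)
```

Scoring a query fingerprint against every stored image and thresholding the best score is one simple way to flag a probable misfile; the mean subtraction makes the score insensitive to overall exposure differences between radiographs.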
Learning From Past Failures of Oral Insulin Trials.
Michels, Aaron W; Gottlieb, Peter A
2018-07-01
Very recently one of the largest type 1 diabetes prevention trials using daily administration of oral insulin or placebo was completed. After 9 years of study enrollment and follow-up, the randomized controlled trial failed to delay the onset of clinical type 1 diabetes, which was the primary end point. The unfortunate outcome follows the previous large-scale trial, the Diabetes Prevention Trial-Type 1 (DPT-1), which again failed to delay diabetes onset with oral insulin or low-dose subcutaneous insulin injections in a randomized controlled trial with relatives at risk for type 1 diabetes. These sobering results raise the important question, "Where does the type 1 diabetes prevention field move next?" In this Perspective, we advocate for a paradigm shift in which smaller mechanistic trials are conducted to define immune mechanisms and potentially identify treatment responders. The stage is set for these interventions in individuals at risk for type 1 diabetes as Type 1 Diabetes TrialNet has identified thousands of relatives with islet autoantibodies and general population screening for type 1 diabetes risk is under way. Mechanistic trials will allow for better trial design and patient selection based upon molecular markers prior to large randomized controlled trials, moving toward a personalized medicine approach for the prevention of type 1 diabetes. © 2018 by the American Diabetes Association.
NASA Astrophysics Data System (ADS)
Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih
2017-04-01
Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at the 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.
DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J.
2015-01-01
Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school-based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends five years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum—one level of the Family Check-up model—on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n=500) was randomly assigned to the intervention and the other half (n=498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within School 1, but not within Schools 2 or 3. The effects on friend selection in School 1 translated into reductions in observed deviancy training five years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study, the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance five years later. PMID:26377235
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
NASA Astrophysics Data System (ADS)
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α -mixing (for local statistics) and exponential α -mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, component counts of random cubical complexes while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
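Schematically, these results all take the standard normalized form below (a generic statement under the paper's variance lower bound assumption, not its precise formulation), where \(S_n\) denotes the (exponentially quasi-)local statistic computed over a ball \(B_n\) of radius \(n\) in the Cayley graph:

```latex
\frac{S_n - \mathbb{E}[S_n]}{\sqrt{\operatorname{Var}(S_n)}}
\xrightarrow{\;d\;} N(0,1) \quad \text{as } n \to \infty,
\qquad \text{provided} \quad
\liminf_{n \to \infty} \frac{\operatorname{Var}(S_n)}{|B_n|} > 0 .
```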
Durga, Padmaja; Raavula, Parvathi; Gurajala, Indira; Gunnam, Poojita; Veerabathula, Prardhana; Reddy, Mukund; Upputuri, Omkar; Ramachandran, Gopinath
2015-09-01
To assess the effect of tranexamic acid on the quality of the surgical field. Prospective, randomized, double-blind study. Institutional, tertiary referral hospital. American Society of Anesthesiologists physical status class I patients, aged 8 to 60 months with Group II or III (Balakrishnan's classification) clefts scheduled for cleft palate repair. Children were randomized into two groups. The control group received saline, and the tranexamic acid group received tranexamic acid 10 mg/kg as a bolus, 15 minutes before incision. Grade of surgical field on a 10-point scale, surgeon satisfaction, and primary hemorrhage. Significant improvements were noted in surgeon satisfaction and median grade of assessment of the surgical field (4 [interquartile range, 4 to 6] in the control group vs. 3 [interquartile range, 2 to 4] in the test group; P = .003) in the tranexamic acid group compared to the control group. Preincision administration of 10 mg/kg of tranexamic acid significantly improved the surgical field during cleft palate repair.
NASA Astrophysics Data System (ADS)
Schießl, Stefan P.; Rother, Marcel; Lüttgens, Jan; Zaumseil, Jana
2017-11-01
The field-effect mobility is an important figure of merit for semiconductors such as random networks of single-walled carbon nanotubes (SWNTs). However, owing to their network properties and quantum capacitance, the standard models for field-effect transistors cannot be applied without modifications. Several different methods are used to determine the mobility with often very different results. We fabricated and characterized field-effect transistors with different polymer-sorted, semiconducting SWNT network densities ranging from low (≈6 μm-1) to densely packed quasi-monolayers (≈26 μm-1) with a maximum on-conductance of 0.24 μS μm-1 and compared four different techniques to evaluate the field-effect mobility. We demonstrate the limits and requirements for each method with regard to device layout and carrier accumulation. We find that techniques that take into account the measured capacitance on the active device give the most reliable mobility values. Finally, we compare our experimental results to a random-resistor-network model.
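As a concrete illustration of one common evaluation technique, the linear-regime mobility can be extracted from a transfer curve via the transconductance. This is a generic sketch with invented device parameters, not the paper's four methods or its measured capacitances:

```python
import numpy as np

def linear_mobility(vg, id_, c_area, length, width, vds):
    """Linear-regime field-effect mobility:
    mu = (L / (W * C_area * V_DS)) * dI_D/dV_G."""
    gm = np.gradient(id_, vg)                 # transconductance dI_D/dV_G
    return (length / (width * c_area * vds)) * gm

# Synthetic transfer curve: I_D = k * (V_G - V_T) above threshold (illustrative)
vg = np.linspace(0.0, 5.0, 51)
k, vt = 2e-6, 1.0                             # A/V and V, invented values
id_ = k * np.clip(vg - vt, 0.0, None)
mu = linear_mobility(vg, id_, c_area=1e-4, length=20e-6, width=100e-6, vds=0.1)
# Above threshold, mu approaches L*k/(W*C*V_DS) = 0.04 m^2/(V*s)
```

The key point the abstract makes is that `c_area` should come from a capacitance measured on the active device, not from a parallel-plate estimate.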
Seven lessons from manyfield inflation in random potentials
NASA Astrophysics Data System (ADS)
Dias, Mafalda; Frazer, Jonathan; Marsh, M. C. David
2018-01-01
We study inflation in models with many interacting fields subject to randomly generated scalar potentials. We use methods from non-equilibrium random matrix theory to construct the potentials and an adaption of the `transport method' to evolve the two-point correlators during inflation. This construction allows, for the first time, for an explicit study of models with up to 100 interacting fields supporting a period of `approximately saddle-point' inflation. We determine the statistical predictions for observables by generating over 30,000 models with 2–100 fields supporting at least 60 efolds of inflation. These studies lead us to seven lessons: i) Manyfield inflation is not single-field inflation, ii) The larger the number of fields, the simpler and sharper the predictions, iii) Planck compatibility is not rare, but future experiments may rule out this class of models, iv) The smoother the potentials, the sharper the predictions, v) Hyperparameters can transition from stiff to sloppy, vi) Despite tachyons, isocurvature can decay, vii) Eigenvalue repulsion drives the predictions. We conclude that many of the `generic predictions' of single-field inflation can be emergent features of complex inflation models.
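The random-matrix ingredient can be illustrated with a minimal sketch: sampling a GOE-like symmetric Hessian and inspecting its eigenvalue spacings, whose mutual repulsion the authors identify as driving the predictions. The dimension and normalization here are illustrative assumptions, not the paper's non-equilibrium construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hessian(n):
    """GOE-like symmetric matrix, normalized so eigenvalues stay O(1)."""
    a = rng.normal(size=(n, n))
    return (a + a.T) / np.sqrt(2.0 * n)

H = random_hessian(100)
eigs = np.sort(np.linalg.eigvalsh(H))
spacings = np.diff(eigs)   # eigenvalue repulsion: near-degenerate pairs are rare
```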
NASA Astrophysics Data System (ADS)
Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia
2017-11-01
The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in ``active'' suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize an AC field: upon each reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on the frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.
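A minimal numerical sketch of the resulting dynamics, assuming (as an idealization) that each field reversal fully randomizes the heading, reproduces the diffusive scaling of the MSD; step length and counts are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def quincke_walk(n_steps, speed=1.0, dt=1.0):
    # Idealization: after each field reversal the heading is redrawn
    # uniformly, giving an uncorrelated 2-D random walk.
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = speed * dt * np.column_stack([np.cos(angles), np.sin(angles)])
    return np.vstack([np.zeros(2), np.cumsum(steps, axis=0)])

def msd_from_origin(traj):
    disp = traj - traj[0]
    return np.sum(disp**2, axis=1)

# Ensemble average: for an uncorrelated walk, <r^2(n)> = n * (speed*dt)^2
n_steps, n_walks = 1000, 400
final_msd = np.mean(
    [msd_from_origin(quincke_walk(n_steps))[-1] for _ in range(n_walks)]
)
```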
A management-oriented classification of pinyon-juniper woodlands of the Great Basin
Neil E. West; Robin J. Tausch; Paul T. Tueller
1998-01-01
A hierarchical framework for the classification of Great Basin pinyon-juniper woodlands was based on a systematic sample of 426 stands from a random selection of 66 of the 110 mountain ranges in the region. That is, mountain ranges were randomly selected, but stands were systematically located on mountain ranges. The National Hierarchical Framework of Ecological Units...
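The two-stage design can be sketched as follows, with the counts taken from the abstract and the within-range stand positions invented for illustration:

```python
import random

random.seed(42)

# Stage 1: simple random sample of 66 of the region's 110 mountain ranges
ranges = [f"range_{i}" for i in range(110)]
selected_ranges = random.sample(ranges, 66)

# Stage 2: systematic sample within a range: random start, fixed interval
def systematic_sample(items, step):
    start = random.randrange(step)
    return items[start::step]

candidate_stands = list(range(100))       # invented stand positions
sampled_stands = systematic_sample(candidate_stands, 10)
```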
School Happiness and School Success: An Investigation across Multiple Grade Levels.
ERIC Educational Resources Information Center
Parish, Joycelyn Gay; Parish, Thomas S.; Batt, Steve
A total of 572 randomly selected sixth-grade students and 908 randomly selected ninth-grade students from a large metropolitan school district in the Midwest were asked to complete a series of survey questions designed to measure the extent to which they were happy while at school, as well as questions concerning the extent to which they treated…
40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2013 CFR
2013-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
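In the spirit of this procedure, random selection of cells on a two-dimensional grid can be sketched as below; the grid dimensions and sample size are illustrative, not the regulation's:

```python
import random

random.seed(7)

def random_grid_cells(n_rows, n_cols, k):
    """Select k distinct cells uniformly at random from an n_rows x n_cols grid."""
    cells = [(r, c) for r in range(n_rows) for c in range(n_cols)]
    return random.sample(cells, k)

sample_cells = random_grid_cells(10, 10, 3)
```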
Attitude and Motivation as Predictors of Academic Achievement of Students in Clothing and Textiles
ERIC Educational Resources Information Center
Uwameiye, B. E.; Osho, L. E.
2011-01-01
This study investigated attitude and motivation as predictors of academic achievement of students in clothing and textiles. Three colleges of education in Edo and Delta States were randomly selected for use in this study. From each school, 40 students were selected from Year III using a simple random sampling technique, yielding a total of 240 students. The…
A morphologic analysis of 'naked' islets of Langerhans in lobular atrophy of the pancreas.
Suda, K; Tsukahara, M; Miyake, T; Hirai, S
1994-08-01
The 'naked' islets of Langerhans (NIL) in randomly selected autopsy cases and in cases of chronic alcoholic pancreatitis, cystic fibrosis, and pancreatic carcinoma were studied histopathologically. The NIL were found in 55 of 164 randomly selected cases, with age-related frequency, in 21 of 30 cases of chronic alcoholic pancreatitis, in 2 of 2 cases of cystic fibrosis, and in 25 of 32 cases of pancreatic carcinoma. The NIL were frequently accompanied by ductal alterations: epithelial metaplasia and hyperplasia in randomly selected cases, protein plugs in chronic alcoholic pancreatitis, mucus plugs in cystic fibrosis, and obliterated ducts in pancreatic carcinoma. The NIL in randomly selected cases may have been formed by ductal alterations that caused stenosis of the lumen, those in chronic alcoholic pancreatitis and cystic fibrosis were the result of protein or mucus plugging, and those in pancreatic carcinoma were a result of neoplastic involvement of the distal pancreatic duct. Therefore, the common factor in the development of NIL is thought to be obstruction of the pancreatic duct system, and in cases of NIL that have a multilobular distribution and interinsular fibrosis, a diagnosis of chronic pancreatitis can usually be made.
Random numbers from vacuum fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543
2016-07-25
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read into a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
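The extractor's building block can be illustrated with a toy Fibonacci linear feedback shift register; the register width and tap positions below are common textbook choices, not the scheme actually used in the experiment:

```python
def lfsr_stream(seed, taps=(16, 15, 13, 4), nbits=16, n=32):
    """Fibonacci LFSR: output the LSB each cycle and feed the XOR of the
    tap bits back into the MSB. Tap positions are 1-indexed from the LSB."""
    state = seed & ((1 << nbits) - 1)
    out = []
    for _ in range(n):
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        out.append(state & 1)
        state = (state >> 1) | (fb << (nbits - 1))
    return out

bits = lfsr_stream(0xACE1)   # nonzero seed; invented value
```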
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-01-01
Introduction It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. Methods and analysis We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. Ethics and dissemination The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. 
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued, such as presentations at key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal. PMID:24568962
Evaluating the capabilities of watershed-scale models in estimating sediment yield at field-scale.
Sommerlot, Andrew R; Nejadhashemi, A Pouyan; Woznicki, Sean A; Giri, Subhasis; Prohaska, Michael D
2013-09-30
Many watershed model interfaces have been developed in recent years for predicting field-scale sediment loads. They share the goal of providing data for decisions aimed at improving watershed health and the effectiveness of water quality conservation efforts. The objectives of this study were to: 1) compare three watershed-scale models (Soil and Water Assessment Tool (SWAT), Field_SWAT, and the High Impact Targeting (HIT) model) against a calibrated field-scale model (RUSLE2) in estimating sediment yield from 41 randomly selected agricultural fields within the River Raisin watershed; 2) evaluate the statistical significance of differences among models; 3) assess the watershed models' capabilities in identifying areas of concern at the field level; 4) evaluate the reliability of the watershed-scale models for field-scale analysis. The SWAT model produced the estimates most similar to RUSLE2, providing the closest median and the lowest absolute error in sediment yield predictions, while the HIT model estimates were the worst. Concerning statistically significant differences between models, SWAT was the only model found to be not significantly different from the calibrated RUSLE2 at α = 0.05. Meanwhile, none of the models identified priority areas similar to those of the RUSLE2 model. Overall, SWAT provided the largest share of estimates (51%) within the uncertainty bounds of RUSLE2 and is the most reliable among the studied models, while HIT is the least reliable. The results of this study suggest caution should be exercised when using watershed-scale models for field-level decision-making; field-specific data are of paramount importance. Copyright © 2013 Elsevier Ltd. All rights reserved.
Selection of stable scFv antibodies by phage display.
Brockmann, Eeva-Christine
2012-01-01
ScFv fragments are popular recombinant antibody formats but often suffer from limited stability. Phage display is a powerful tool in antibody engineering and applicable also for stability selection. ScFv variants with improved stability can be selected from large randomly mutated phage displayed libraries with a specific antigen after the unstable variants have been inactivated by heat or GdmCl. Irreversible scFv denaturation, which is a prerequisite for efficient selection, is achieved by combining denaturation with reduction of the intradomain disulfide bonds. Repeated selection cycles of increasing stringency result in enrichment of stabilized scFv fragments. Procedures for constructing a randomly mutated scFv library by error-prone PCR and phage display selection for enrichment of stable scFv antibodies from the library are described here.
Alternative Modal Basis Selection Procedures for Nonlinear Random Response Simulation
NASA Technical Reports Server (NTRS)
Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.
2010-01-01
Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of the three reduced-order analyses are compared with the results of the computationally taxing simulation in the physical degrees of freedom. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.
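The POD step common to all three procedures can be sketched via the SVD of mean-centered response snapshots; the data here are synthetic, and the smooth orthogonal decomposition variants are not shown:

```python
import numpy as np

rng = np.random.default_rng(3)

def pod_basis(snapshots, n_modes):
    """snapshots: (n_dof, n_samples). Returns the leading POD modes and
    the fraction of response energy each captures."""
    x = snapshots - snapshots.mean(axis=1, keepdims=True)
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    energy = s**2 / np.sum(s**2)
    return u[:, :n_modes], energy[:n_modes]

# Synthetic response data dominated by two spatial patterns plus small noise
t = rng.normal(size=(2, 200))
patterns = rng.normal(size=(50, 2))
snaps = patterns @ t + 0.01 * rng.normal(size=(50, 200))
modes, energy = pod_basis(snaps, 2)   # two modes capture nearly all energy
```

A reduced-order model then projects the nonlinear equations of motion onto `modes`; the paper's contribution is how to choose which modes to keep.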
An investigation into the probabilistic combination of quasi-static and random accelerations
NASA Technical Reports Server (NTRS)
Schock, R. W.; Tuell, L. P.
1984-01-01
The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, and either arithmetically add or root-sum-square them to get combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternative approach is to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data for selecting combined accelerations at the most popular percentile levels.
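Under this model, the combined acceleration is a Gaussian centered at the quasi-static value, so a design percentile can be read off directly and compared with the arithmetic-add and root-sum-square conventions. The numbers below are illustrative, not from the paper:

```python
import math
from statistics import NormalDist

def combined_percentile(a_qs, sigma, p=0.99865):
    """Percentile of the combined acceleration, modeled as N(a_qs, sigma):
    a deterministic quasi-static offset plus a zero-mean Gaussian random part."""
    return NormalDist(mu=a_qs, sigma=sigma).inv_cdf(p)

a_qs, sigma = 5.0, 2.0                          # g, invented values
arith_sum = a_qs + 3.0 * sigma                  # arithmetic add of 3-sigma part
rss = math.sqrt(a_qs**2 + (3.0 * sigma)**2)     # root sum square
prob = combined_percentile(a_qs, sigma)         # one-sided ~3-sigma percentile
```

Note that the probabilistic percentile here coincides with the arithmetic sum (both ~11 g), while the root-sum-square value is lower; the paper's point is to choose the percentile level deliberately rather than rely on either convention.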
Xu, Jia; Li, Chao; Li, Yiran; Lim, Chee Wah; Zhu, Zhiwen
2018-05-04
In this paper, a nonlinear model of single-walled carbon nanotubes is developed, and the strongly nonlinear dynamic characteristics of such carbon nanotubes subjected to a random magnetic field are studied. The nonlocal effect of the microstructure is considered based on Eringen’s differential constitutive model. The natural frequency of the strongly nonlinear dynamic system is obtained by the energy function method, and the drift coefficient and the diffusion coefficient are verified. The stationary probability density function of the system's dynamic response is given, and the fractal boundary of the safe basin is provided. Theoretical analysis and numerical simulation show that stochastic resonance occurs as the random magnetic field intensity is varied. The boundary of the safe basin has fractal characteristics, and the area of the safe basin decreases as the magnetic field intensity increases.
Controlling dispersion forces between small particles with artificially created random light fields
Brügger, Georges; Froufe-Pérez, Luis S.; Scheffold, Frank; José Sáenz, Juan
2015-01-01
Appropriate combinations of laser beams can be used to trap and manipulate small particles with optical tweezers as well as to induce significant optical binding forces between particles. These interaction forces are usually strongly anisotropic depending on the interference landscape of the external fields. This is in contrast with the familiar isotropic, translationally invariant, van der Waals and, in general, Casimir–Lifshitz interactions between neutral bodies arising from random electromagnetic waves generated by equilibrium quantum and thermal fluctuations. Here we show, both theoretically and experimentally, that dispersion forces between small colloidal particles can also be induced and controlled using artificially created fluctuating light fields. Using optical tweezers as a gauge, we present experimental evidence for the predicted isotropic attractive interactions between dielectric microspheres induced by laser-generated, random light fields. These light-induced interactions open a path towards the control of translationally invariant interactions with tuneable strength and range in colloidal systems. PMID:26096622
Driving a Superconductor to Insulator Transition with Random Gauge Fields.
Nguyen, H Q; Hollen, S M; Shainline, J; Xu, J M; Valles, J M
2016-11-30
Typically the disorder that alters the interference of particle waves to produce Anderson localization is potential scattering from randomly placed impurities. Here we show that disorder in the form of random gauge fields that act directly on particle phases can also drive localization. We present evidence of a superfluid Bose glass to insulator transition at a critical level of this gauge field disorder in a nano-patterned array of amorphous Bi islands. This transition shows signs of metallic transport near the critical point, characterized by a resistance indicative of a quantum phase transition. The critical disorder depends on interisland coupling, in agreement with recent quantum Monte Carlo simulations. We discuss how this disorder-tuned SIT differs from the common frustration-tuned SIT that also occurs in magnetic fields. Its discovery enables new high-fidelity comparisons between theoretical and experimental studies of disorder effects on quantum critical systems.
Color- and motion-specific units in the tectum opticum of goldfish.
Gruber, Morna; Behrend, Konstantin; Neumeyer, Christa
2016-01-05
Extracellular recordings were performed from 69 units at different depths between 50 and [Formula: see text]m below the surface of the tectum opticum in goldfish. Using large-field stimuli (86° visual angle) of 21 colored HKS papers, we were able to record from 54 color-sensitive units. The colored papers were presented for 5 s each, arranged in the sequence of the human color circle and separated by gray of medium brightness. We found 22 units with best responses between orange, red and pink. About 12 of these red-sensitive units were of the opponent "red-ON/blue-green-OFF" type, as found in retinal bipolar and ganglion cells as well. Most of them were also activated or inhibited by black and/or white. Some units responded specifically to red, either with activation or inhibition. Eighteen units were sensitive to blue and/or green, 10 of them to both colors and most of them to black as well. They were inhibited by red and belonged to the opponent "blue-green-ON/red-OFF" type. Other units responded more selectively, either to blue, to green or to purple. Two units were selectively sensitive to yellow. A total of 15 units were sensitive to motion, stimulated by an excentrically rotating black-and-white random dot pattern. Activity of these units was also large when a red-green random dot pattern of high L-cone contrast was used. Activity dropped to zero when the red-green pattern did not modulate the L-cones. None of these motion-selective units responded to any color. The results directly demonstrate the color-blindness of motion vision and confirm the hypothesis of separate and parallel processing of "color" and "motion".
Pantyley, Viktoriya
2017-09-21
The primary goals of the study were a critical analysis of the concepts associated with health from the perspective of sustainable development, and an empirical analysis of health and health-related issues among the rural and urban residents of Eastern Poland in the context of the sustainable development of the region. The study was based on the following research methods: a systemic approach; selection and analysis of the literature and statistical data; development of a special questionnaire concerning socio-economic and health inequalities among the population in the studied area; and field research with an interview questionnaire conducted on randomly selected respondents (N=1,103) in randomly selected areas of the Lubelskie, Podkarpackie, Podlaskie and eastern Mazowieckie Provinces (divided into provincial capital cities, county capital cities, other cities, and rural areas). Statistical analysis with the chi-square test and contingency coefficients indicated a correlation between the state of health and the following independent variables: age, quality of life, social position and financial situation (Pearson's C coefficient over 0.300); a statistically significant yet weak correlation was recorded for gender, household size, place of residence and amount of free time. The analysis demonstrated a considerable gap between the state of health of the urban and rural populations. In order to eliminate unfavourable differences in the state of health among the residents of Eastern Poland, and to provide for equal sustainable development in the urban and rural parts of the examined region, special preventive programmes aimed at the residents of peripheral, marginalized rural areas should be implemented. In these programmes, attention should be paid to preventive measures, early diagnosis of basic civilization and social diseases, and better accessibility of medical services for the residents.
Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul
2015-01-01
Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.
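The SSF side of the comparison reduces, for each observed step, to a conditional-logit likelihood over the used step and its sampled available alternatives. This is a one-covariate sketch with invented values, not the authors' fitted model:

```python
import math

def ssf_log_likelihood(beta, chosen_x, available_x):
    """Conditional-logit log-likelihood of one observed step:
    beta*x_chosen - log(sum_j exp(beta*x_j)) over the choice set,
    which must include the chosen step itself."""
    scores = [beta * x for x in available_x]
    log_denom = math.log(sum(math.exp(s) for s in scores))
    return beta * chosen_x - log_denom

# One stratum: the used step plus four random available steps (invented covariates)
choice_set = [1.2, 0.3, -0.5, 0.9, 0.1]
ll = ssf_log_likelihood(0.8, chosen_x=1.2, available_x=choice_set)
```

The paper's result is that maximizing the sum of such terms over strata approximates the BCRW maximum-likelihood estimates.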
Prevalence and Characterization of Motile Salmonella in Commercial Layer Poultry Farms in Bangladesh
Barua, Himel; Biswas, Paritosh K.; Olsen, Katharina E. P.; Christensen, Jens P.
2012-01-01
Salmonella is a globally widespread food-borne pathogen having major impact on public health. All motile serovars of Salmonella enterica of poultry origin are zoonotic, and contaminated meat and raw eggs are an important source to human infections. Information on the prevalence of Salmonella at farm/holding level, and the zoonotic serovars circulating in layer poultry in the South and South-East Asian countries including Bangladesh, where small-scale commercial farms are predominant, is limited. To investigate the prevalence of Salmonella at layer farm level, and to identify the prevalent serovars we conducted a cross-sectional survey by randomly selecting 500 commercial layer poultry farms in Bangladesh. Faecal samples from the selected farms were collected following standard procedure, and examined for the presence of Salmonella using conventional bacteriological procedures. Thirty isolates were randomly selected, from the ninety obtained from the survey, for serotyping and characterized further by plasmid profiling and pulsed-field gel electrophoresis (PFGE). Results of the survey showed that the prevalence of motile Salmonella at layer farm level was 18% (95% confidence interval 15–21%), and Salmonella Kentucky was identified to be the only serovar circulating in the study population. Plasmid analysis of the S. Kentucky and non-serotyped isolates revealed two distinct profiles with a variation of two different sizes (2.7 and 4.8 kb). PFGE of the 30 S. Kentucky and 30 non-serotyped isolates showed that all of them were clonally related because only one genotype and three subtypes were determined based on the variation in two or three bands. This is also the first report on the presence of any specific serovar of Salmonella enterica in poultry in Bangladesh. PMID:22558269
Active learning: a step towards automating medical concept extraction.
Kholghi, Mahnoosh; Sitbon, Laurianne; Zuccon, Guido; Nguyen, Anthony
2016-03-01
This paper presents an automatic, active learning-based system for the extraction of medical concepts from clinical free-text reports. Specifically, (1) the contribution of active learning in reducing the annotation effort and (2) the robustness of an incremental active learning framework across different selection criteria and data sets are determined. The comparative performance of an active learning framework and a fully supervised approach was investigated to study how active learning reduces the annotation effort while achieving the same effectiveness as a supervised approach. Conditional random fields were used as the supervised method, with least confidence and information density as the 2 selection criteria for the active learning framework. The effect of incremental learning vs standard learning on the robustness of the models within the active learning framework with different selection criteria was also investigated. The following 2 clinical data sets were used for evaluation: the Informatics for Integrating Biology and the Bedside/Veteran Affairs (i2b2/VA) 2010 natural language processing challenge and the Shared Annotated Resources/Conference and Labs of the Evaluation Forum (ShARe/CLEF) 2013 eHealth Evaluation Lab. The annotation effort saved by active learning to achieve the same effectiveness as supervised learning is up to 77%, 57%, and 46% of the total number of sequences, tokens, and concepts, respectively. Compared with the random sampling baseline, the saving is at least doubled. Incremental active learning is a promising approach for building effective and robust medical concept extraction models while significantly reducing the burden of manual annotation.
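The least-confidence criterion used above is simple to state in code. The following sketch is illustrative only; `predict_proba` and the batch interface are assumed names, not the paper's system. It ranks unlabeled items by the probability of the model's top prediction and returns the least certain ones for annotation:

```python
def least_confidence_batch(pool, predict_proba, batch_size):
    """Pool-based active learning with the least-confidence criterion.

    `predict_proba` maps an item to a list of class probabilities;
    items whose top-label probability is lowest are selected first."""
    scored = []
    for item in pool:
        probs = predict_proba(item)
        confidence = max(probs)  # probability of the single best label
        scored.append((confidence, item))
    scored.sort(key=lambda pair: pair[0])  # least confident first
    return [item for _, item in scored[:batch_size]]
```

In a full loop, the selected batch is sent for human annotation, the model is retrained (incrementally or from scratch), and selection repeats.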
ERIC Educational Resources Information Center
Roeser, Robert W.; Schonert-Reichl, Kimberly A.; Jha, Amishi; Cullen, Margaret; Wallace, Linda; Wilensky, Rona; Oberle, Eva; Thomson, Kimberly; Taylor, Cynthia; Harrison, Jessica
2013-01-01
The effects of randomization to mindfulness training (MT) or to a waitlist-control condition on psychological and physiological indicators of teachers' occupational stress and burnout were examined in 2 field trials. The sample included 113 elementary and secondary school teachers (89% female) from Canada and the United States. Measures were…
ERIC Educational Resources Information Center
Al Otaiba, Stephanie; Lake, Vickie E.; Greulich, Luana; Folsom, Jessica S.; Guidry, Lisa
2012-01-01
This randomized-control trial examined the learning of preservice teachers taking an initial Early Literacy course in an early childhood education program and of the kindergarten or first grade students they tutored in their field experience. Preservice teachers were randomly assigned to one of two tutoring programs: Book Buddies and Tutor…
A Multisite Cluster Randomized Field Trial of Open Court Reading
ERIC Educational Resources Information Center
Borman, Geoffrey D.; Dowling, N. Maritza; Schneck, Carrie
2008-01-01
In this article, the authors report achievement outcomes of a multisite cluster randomized field trial of Open Court Reading 2005 (OCR), a K-6 literacy curriculum published by SRA/McGraw-Hill. The participants are 49 first-grade through fifth-grade classrooms from predominantly minority and poor contexts across the nation. Blocking by grade level…
Jian, Zhongping; Pearce, Jeremy; Mittleman, Daniel M
2003-07-18
We describe observations of the amplitude and phase of an electric field diffusing through a three-dimensional random medium, using terahertz time-domain spectroscopy. These measurements are spatially resolved with a resolution smaller than the speckle spot size and temporally resolved with a resolution better than one optical cycle. By computing correlation functions between fields measured at different positions and with different temporal delays, it is possible to obtain information about individual scattering events experienced by the diffusing field. This represents a new method for characterizing a multiply scattered wave.
Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N.; Meng, Fande
2015-01-01
Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales remains an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., the Phosphorus (P) Index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed as the measure of effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watershed, to region. Further, model estimates showed that targeting at multiple scales is necessary to optimize program efficiency.
The integrated modeling approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds. PMID:26313561
Hollister, Brittany M; Restrepo, Nicole A; Farber-Eger, Eric; Crawford, Dana C; Aldrich, Melinda C; Non, Amy
2017-01-01
Socioeconomic status (SES) is a fundamental contributor to health, and a key factor underlying racial disparities in disease. However, SES data are rarely included in genetic studies, due in part to the difficulty of collecting these data when studies were not originally designed for that purpose. The emergence of large clinic-based biobanks linked to electronic health records (EHRs) provides research access to large patient populations with longitudinal phenotype data captured in structured fields as billing codes, procedure codes, and prescriptions. SES data, however, are often not explicitly recorded in structured fields, but rather in the free text of clinical notes and communications, and the content and completeness of these data vary widely by practitioner. To enable gene-environment studies that consider SES as an exposure, we sought to extract SES variables for racial/ethnic minority adult patients (n=9,977) in BioVU, the Vanderbilt University Medical Center biorepository linked to de-identified EHRs. We developed several measures of SES using information available within the de-identified EHR, including broad categories of occupation, education, insurance status, and homelessness. Two hundred patients were randomly selected for manual review to develop a set of seven algorithms for extracting SES information from de-identified EHRs. The algorithms consist of 15 categories of information, with 830 unique search terms. SES data extracted from manual review of 50 randomly selected records were compared to data produced by the algorithms, resulting in positive predictive values of 80.0% (education), 85.4% (occupation), 87.5% (unemployment), 63.6% (retirement), 23.1% (uninsured), 81.8% (Medicaid), and 33.3% (homelessness), suggesting that some categories of SES data are easier to extract in this EHR than others. The SES data extraction approach developed here will enable future EHR-based genetic studies to integrate SES information into statistical analyses. 
Ultimately, incorporation of measures of SES into genetic studies will help elucidate the impact of the social environment on disease risk and outcomes.
Abis, Gabor S A; Stockmann, Hein B A C; van Egmond, Marjolein; Bonjer, Hendrik J; Vandenbroucke-Grauls, Christina M J E; Oosterling, Steven J
2013-12-01
Gastrointestinal surgery is associated with a high incidence of infectious complications. Selective decontamination of the digestive tract is an antimicrobial prophylaxis regimen that aims to eradicate gastrointestinal carriage of potentially pathogenic microorganisms and represents an adjunct to regular prophylaxis in surgery. Relevant studies were identified using bibliographic searches of MEDLINE, EMBASE, and the Cochrane database (period from 1970 to November 1, 2012). Only studies investigating selective decontamination of the digestive tract in gastrointestinal surgery were included. Two randomized clinical trials and one retrospective case-control trial showed significant benefit in terms of infectious complications and anastomotic leakage in colorectal surgery. Two randomized controlled trials in esophageal surgery and two randomized clinical trials in gastric surgery reported lower levels of infectious complications. Selective decontamination of the digestive tract reduces infections following esophageal, gastric, and colorectal surgeries and also appears to have beneficial effects on anastomotic leakage in colorectal surgery. We believe these results provide the basis for a large multicenter prospective study to investigate the role of selective decontamination of the digestive tract in colorectal surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bromberger, Seth A.; Klymko, Christine F.; Henderson, Keith A.
Betweenness centrality is a graph statistic used to find vertices that participate in a large number of shortest paths in a graph. This centrality measure is commonly used in path and network interdiction problems, and its complete form requires the calculation of all-pairs shortest paths for each vertex. This leads to a time complexity of O(|V||E|), which is impractical for large graphs. Estimation of betweenness centrality has focused on performing shortest-path calculations on a subset of randomly-selected vertices. This reduces the complexity of the centrality estimation to O(|S||E|), |S| < |V|, which can be scaled appropriately based on the computing resources available. An estimation strategy that uses random selection of vertices for seed selection is fast and simple to implement, but may not provide optimal estimation of betweenness centrality when the number of samples is constrained. Our experimentation has identified a number of alternate seed-selection strategies that provide lower error than random selection in common scale-free graphs. These strategies are discussed and experimental results are presented.
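The sampled-seed estimation strategy described above can be sketched as Brandes-style single-source dependency accumulations restricted to a random seed set. This is a minimal illustration under assumed names, not the authors' implementation; it rescales the accumulated dependencies by |V|/|S| (each undirected pair contributes in both directions here):

```python
import random
from collections import deque

def approx_betweenness(adj, num_seeds, rng=random.Random(0)):
    """Estimate betweenness centrality of an unweighted, undirected
    graph (adjacency dict) from a random sample of seed vertices."""
    nodes = list(adj)
    seeds = rng.sample(nodes, num_seeds)
    bc = dict.fromkeys(nodes, 0.0)
    for s in seeds:
        # BFS from s, counting shortest paths (sigma) and distances
        sigma = dict.fromkeys(nodes, 0)
        dist = dict.fromkeys(nodes, -1)
        sigma[s], dist[s] = 1, 0
        order, queue = [], deque([s])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
        # back-propagate pair dependencies (Brandes' accumulation)
        delta = dict.fromkeys(nodes, 0.0)
        for w in reversed(order):
            for v in adj[w]:
                if dist[v] == dist[w] - 1:  # v is a predecessor of w
                    delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # rescale so the sampled estimate is comparable to the full sum
    scale = len(nodes) / num_seeds
    return {v: c * scale for v, c in bc.items()}
```

With `num_seeds == len(adj)` this reduces to the exact O(|V||E|) computation; smaller seed sets trade accuracy for the O(|S||E|) cost noted above.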
Self-excitation of a nonlinear scalar field in a random medium
Zeldovich, Ya. B.; Molchanov, S. A.; Ruzmaikin, A. A.; Sokoloff, D. D.
1987-01-01
We discuss the evolution in time of a scalar field under the influence of a random potential and diffusion. The cases of a short-correlation in time and of stationary potentials are considered. In a linear approximation and for sufficiently weak diffusion, the statistical moments of the field grow exponentially in time at growth rates that progressively increase with the order of the moment; this indicates the intermittent nature of the field. Nonlinearity halts this growth and in some cases can destroy the intermittency. However, in many nonlinear situations the intermittency is preserved: high, persistent peaks of the field exist against the background of a smooth field distribution. These widely spaced peaks may make a major contribution to the average characteristics of the field. PMID:16593872
Jo, Ick Hyun; Kim, Young Chang; Kim, Dong Hwi; Kim, Kee Hong; Hyun, Tae Kyung; Ryu, Hojin; Bang, Kyong Hwan
2017-10-01
The development of molecular markers is one of the most useful approaches for molecular breeding and marker-assisted selection. Even when little information on a reference genome is available, molecular markers are indispensable tools for determining genetic variation and identifying species with high levels of accuracy and reproducibility. The demand for molecular approaches for marker-based breeding and genetic discrimination in Panax species has greatly increased in recent times, and such approaches have been successfully applied for various purposes. However, owing to the existence of diverse molecular techniques and differences in their principles and applications, careful consideration is needed when selecting appropriate marker types. In this review, we outline the current status of molecular marker applications in ginseng research and industry. In addition, we discuss the basic principles, requirements, and advantages and disadvantages of the most widely used molecular markers, including restriction fragment length polymorphism, random amplified polymorphic DNA, sequence tag sites, simple sequence repeats, and single nucleotide polymorphisms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, David B.; Gibbons, Steven J.; Rodgers, Arthur J.
2012-05-01
In this approach, small scale-length medium perturbations not modeled in the tomographic inversion might be described as random fields, characterized by particular distribution functions (e.g., normal with specified spatial covariance). Conceivably, random field parameters (scatterer density or scale length) might themselves be the targets of tomographic inversions of the scattered wave field. As a result, such augmented models may provide processing gain through the use of probabilistic signal subspaces rather than deterministic waveforms.
Statistical auditing and randomness test of lotto k/N-type games
NASA Astrophysics Data System (ADS)
Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.
2008-11-01
One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
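The theoretical moments referred to above follow from standard finite-population sampling results: for k numbers drawn without replacement from {1, ..., N}, each drawn number has mean (N + 1)/2 and variance (N^2 - 1)/12, and any two distinct draws have covariance -(N + 1)/12. A minimal Monte-Carlo check of the mean (illustrative only, not the paper's auditing code):

```python
import random
import statistics

def lotto_moment_check(N, k, trials, rng=random.Random(42)):
    """Compare the empirical mean of a drawn lotto number against the
    hypergeometric-model value (N + 1) / 2 for a k/N draw without
    replacement.  Returns (empirical_mean, theoretical_mean)."""
    first_numbers = []
    for _ in range(trials):
        draw = rng.sample(range(1, N + 1), k)  # one simulated drawing
        first_numbers.append(draw[0])
    empirical_mean = statistics.fmean(first_numbers)
    theoretical_mean = (N + 1) / 2
    return empirical_mean, theoretical_mean
```

An audit of historical results would replace the simulated draws with the recorded winning numbers and test the observed moments against these theoretical values.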
Resampling method for applying density-dependent habitat selection theory to wildlife surveys.
Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel
2015-01-01
Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists in randomly placing blocks over the survey area and dividing those blocks in two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is done 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. 
We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large geographic extents.
Das, Banibrata
2014-06-01
The brick field industry is one of the oldest industries in India, and it employs a large number of workers of poor socioeconomic status. The main aims of the present investigation were i) to determine the prevalence of musculoskeletal disorders among brick field workers, and ii) to determine the prevalence of respiratory disorders and physiological stress among brick field workers compared with control workers. For this study, a total of 220 brick field workers and 130 control subjects were selected randomly. The control subjects were mainly involved in hand-intensive jobs. The Modified Nordic Questionnaire was applied to assess the discomfort felt by both groups of workers. Thermal stress was also assessed by measuring the WBGT index. Pulmonary function was assessed using spirometry. Physiological assessment of the workload was carried out by recording the heart rate and blood pressure of the workers prior to work and just after work in the field. Brick field workers suffered from pain especially in the lower back (98%), hands (93%), knees (86%), wrists (85%), shoulders (76%) and neck (65%). Among the brick-making activities, brick field workers felt discomfort during spading for mud collection (98%), carrying bricks (95%) and molding (87%). The results showed significantly lower values (p < 0.001) of FVC, FEV1, the FEV1/FVC ratio and PEFR in brick field workers compared with the control group. The post-activity heart rate of the brick field workers was 148.6 beats/min, and their systolic and diastolic blood pressures were 152.8 and 78.5 mm Hg, respectively. This study concludes that the health of the brick field workers was strongly affected by working in unhealthy conditions for a long period of time.
Potential of Using Mobile Phone Data to Assist in Mission Analysis and Area of Operations Planning
2015-08-01
tremendously beneficial especially since a sizeable portion of the population are nomads, changing location based on season. A proper AO...provided: a. User_id: Selected user's random ID b. Timestamp: 24 h format YYYY-MM-DD-HH:M0:00 (the second digits of the minutes and all the seconds...yearly were selected.
Discriminative Projection Selection Based Face Image Hashing
NASA Astrophysics Data System (ADS)
Karabat, Cagatay; Erdogan, Hakan
Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
Topology-selective jamming of fully-connected, code-division random-access networks
NASA Technical Reports Server (NTRS)
Polydoros, Andreas; Cheng, Unjeng
1990-01-01
The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.
NASA Astrophysics Data System (ADS)
Hasuike, Takashi; Katagiri, Hideki
2010-10-01
This paper proposes a portfolio selection problem that takes an investor's subjectivity into account, together with a sensitivity analysis for changes in that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well-defined. Therefore, by introducing the Sharpe ratio, one of the most important performance measures of portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using sensitivity analysis for the fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.
A multi-source precipitation approach to fill gaps over a radar precipitation field
NASA Astrophysics Data System (ADS)
Tesfagiorgis, K. B.; Mahani, S. E.; Khanbilvardi, R.
2012-12-01
Satellite Precipitation Estimates (SPEs) may be the only available source of information for operational hydrologic and flash flood prediction, due to the spatial limitations of radar and gauge products. The present work develops an approach to seamlessly blend satellite, radar, climatological and gauge precipitation products to fill gaps in ground-based radar precipitation fields. To merge different precipitation products, the biases of the products relative to each other must first be removed. For bias correction, the study used an ensemble-based method which aims to estimate spatially varying multiplicative biases in SPEs using a radar rainfall product. Bias factors were calculated for a randomly selected sample of rainy pixels in the study area. Spatial fields of estimated bias were generated taking into account spatial variation and random errors in the sampled values. A weighted Successive Correction Method (SCM) is proposed to merge the error-corrected satellite and radar rainfall estimates. In addition to SCM, we use a Bayesian spatial method for merging the gap-free radar with rain gauge, climatological and SPE rainfall sources. We demonstrate the method using the SPE Hydro-Estimator (HE), the radar-based Stage-II product, the climatological product PRISM, and rain gauge datasets for several rain events from 2006 to 2008 over three different geographical locations in the United States. Results show that the SCM method, in combination with the Bayesian spatial model, produced a precipitation product in good agreement with independent measurements. The study implies that, using the available radar pixels surrounding a gap area together with rain gauge, PRISM and satellite products, a radar-like product is achievable over radar gap areas that benefits the scientific community.
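The multiplicative bias-factor idea can be sketched in a few lines. The following is a minimal illustration under assumed data structures (dicts mapping pixel ids to rain rates), not the study's ensemble method, which additionally models spatial variation and random error in the sampled factors:

```python
import random

def multiplicative_bias_factors(satellite, radar, sample_size,
                                rng=random.Random(0)):
    """For a random sample of pixels rainy in both products, compute
    the multiplicative bias factor radar / satellite at each sampled
    pixel.  `satellite` and `radar` map pixel ids to rain rates."""
    rainy = [p for p in satellite
             if satellite[p] > 0 and radar.get(p, 0) > 0]
    sampled = rng.sample(rainy, min(sample_size, len(rainy)))
    return {p: radar[p] / satellite[p] for p in sampled}

def apply_bias(satellite, factors):
    """Correct satellite estimates with the mean sampled bias factor
    (the real method interpolates a spatial field of factors)."""
    mean_factor = sum(factors.values()) / len(factors)
    return {p: rate * mean_factor for p, rate in satellite.items()}
```

The corrected satellite field would then be blended with radar via the weighted successive correction step described in the abstract.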
Drever, Mark C; Gyug, Les W; Nielsen, Jennifer; Stuart-Smith, A Kari; Ohanjanian, I Penny; Martin, Kathy
2015-01-01
Williamson's sapsucker (Sphyrapicus thyroideus) is a migratory woodpecker that breeds in mixed coniferous forests in western North America. In Canada, the range of this woodpecker is restricted to three small populations in southern British Columbia, precipitating a national listing as 'Endangered' in 2005, and the need to characterize critical habitat for its survival and recovery. We compared habitat attributes between Williamson's sapsucker nest territories and random points without nests or detections of this sapsucker as part of a resource selection analysis to identify the habitat features that best explain the probability of nest occurrence in two separate geographic regions in British Columbia. We compared the relative explanatory power of generalized linear models based on field-derived and Geographic Information System (GIS) data within both a 225 m and 800 m radius of a nest or random point. The model based on field-derived variables explained the most variation in nest occurrence in the Okanagan-East Kootenay Region, whereas nest occurrence was best explained by GIS information at the 800 m scale in the Western Region. Probability of nest occurrence was strongly tied to densities of potential nest trees, which included open forests with very large (diameter at breast height, DBH, ≥57.5 cm) western larch (Larix occidentalis) trees in the Okanagan-East Kootenay Region, and very large ponderosa pine (Pinus ponderosa) and large (DBH 17.5-57.5 cm) trembling aspen (Populus tremuloides) trees in the Western Region. Our results have the potential to guide identification and protection of critical habitat as required by the Species at Risk Act in Canada, and to better manage Williamson's sapsucker habitat overall in North America. In particular, management should focus on the maintenance and recruitment of very large western larch and ponderosa pine trees.
Finding SDSS Galaxy Clusters in 4-dimensional Color Space Using the False Discovery Rate
NASA Astrophysics Data System (ADS)
Nichol, R. C.; Miller, C. J.; Reichart, D.; Wasserman, L.; Genovese, C.; SDSS Collaboration
2000-12-01
We describe a recently developed statistical technique that provides a meaningful cut-off in probability-based decision making. We are concerned with multiple testing, where each test produces a well-defined probability (or p-value). By well-defined, we mean that the null hypothesis used to determine the p-value is fully understood and appropriate. The method is called the False Discovery Rate (FDR), and its largest advantage over other measures is that it allows one to specify a maximal amount of acceptable error. As an example of this tool, we apply FDR to a four-dimensional clustering algorithm using SDSS data. For each galaxy (or test galaxy), we count the number of neighbors that fit within one standard deviation of a four-dimensional Gaussian centered on that test galaxy. The mean and standard deviation of that Gaussian are determined from the colors and errors of the test galaxy. We then take that same Gaussian and place it on a random selection of n galaxies and make a similar count. In the limit of large n, we expect the median count around these random galaxies to represent a typical field galaxy. For every test galaxy we determine the probability (or p-value) that it is a field galaxy based on these counts. A low p-value implies that the test galaxy is in a cluster environment. Once we have a p-value for every galaxy, we use FDR to determine at what level we should make our probability cut-off. Once this cut-off is made, we have a final sample of galaxies that are cluster-like galaxies. Using FDR, we also know the maximum amount of field contamination in our cluster galaxy sample. We present our preliminary galaxy clustering results using these methods.
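The FDR cut-off described above is usually computed with the Benjamini-Hochberg step-up procedure. A minimal sketch (a generic illustration of the procedure, not the SDSS pipeline; the example p-values are invented):

```python
def fdr_cutoff(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up: return the largest sorted p-value p_(k)
    with p_(k) <= alpha * k / n; flagging every p-value at or below it keeps
    the expected fraction of false discoveries under alpha."""
    p = sorted(p_values)
    n = len(p)
    cutoff = 0.0
    for k, pk in enumerate(p, start=1):   # k is the 1-indexed rank
        if pk <= alpha * k / n:
            cutoff = pk
    return cutoff

# Galaxies whose p-value falls at or below the cut-off are cluster-like.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5]
cut = fdr_cutoff(pvals, alpha=0.10)
cluster_like = [p for p in pvals if p <= cut]
```

Note that the step-up rule also flags p-values (here 0.039) that individually miss their own rank threshold but sit below a later passing rank, which is what distinguishes FDR control from a simple per-test cut.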
Fénelon, M; Catros, S; Fricain, J C
2018-06-01
Since its first use for the reconstruction of tissue defects in the oral cavity in 1985, human amniotic membrane (hAM) has been widely studied in the field of oral surgery. Despite the growing number of publications in this field, there is no systematic review or meta-analysis concerning its clinical applications, outcome assessments, and relevance in oral surgery. The aim of this review is to provide a thorough understanding of the potential use of hAM for soft and hard tissue reconstruction in the oral cavity. A systematic electronic and a manual literature search of the MEDLINE-PubMed and Scopus databases were completed. The Patient, Intervention, Comparison and Outcomes (PICO) technique was used to select the relevant articles to meet the objective. Studies using hAM for oral reconstruction, and conducted on human subjects, were included in this survey. A total of 17 articles were analyzed. Five areas of interest were identified as potential clinical applications: periodontal surgery, cleft palate and tumor reconstruction, prosthodontics, and peri-implant surgery. Overall, periodontal surgery was the only discipline to assess the efficacy of hAM with randomized clinical trials. Wide variability in the preservation methods of hAM and a lack of objective measurements were observed in this study. hAM is already used in the field of oral surgery. Despite this, there is only weak clinical evidence convincingly demonstrating the benefit of hAM in this area compared to standard surgery. Several studies now suggest the interest of hAM for periodontal tissue repair. Due to its biological and mechanical properties, hAM seems to be a promising treatment for wound healing in various areas of oral reconstruction. However, further randomized clinical trials are needed to confirm these preliminary results.
Kohler, Stefan
2013-01-01
Zimbabwean villagers of distinct background have resettled in government-organized land reforms for more than three decades. Against this backdrop, I assess the level of social cohesion in some of the newly established communities by estimating the average preferences for fairness in a structural model of bounded rationality. The estimations are based on behavioral data from an ultimatum game field experiment played by 234 randomly selected households in 6 traditional and 14 resettled villages almost two decades after resettlement. Equal or higher degrees of fairness are estimated in all resettlement schemes. In one, or arguably two, out of three distinct resettlement schemes studied, the resettled villagers exhibit significantly higher degrees of fairness (p ≤ 0.11) and rationality (p ≤ 0.04) than those who live in traditional villages. Overall, villagers appear similarly rational, but the attitude toward fairness is significantly stronger in resettled communities (p ≤ 0.01). These findings are consistent with the idea of an increased need for cooperation required in recommencement. PMID:23724095
Region-Based Collision Avoidance Beaconless Geographic Routing Protocol in Wireless Sensor Networks.
Lee, JeongCheol; Park, HoSung; Kang, SeokYoon; Kim, Ki-Il
2015-06-05
Due to the lack of dependency on beacon messages for location exchange, the beaconless geographic routing protocol has attracted considerable attention from the research community. However, existing beaconless geographic routing protocols are likely to generate duplicated data packets when multiple winners in the greedy area are selected. Furthermore, these protocols are designed for a uniform sensor field, so they cannot be directly applied to practical irregular sensor fields with partial voids. To prevent the failure of finding a forwarding node and to remove unnecessary duplication, in this paper we propose a region-based collision avoidance beaconless geographic routing protocol that increases forwarding opportunities in randomly deployed sensor networks. By assigning different contention priorities to the mutually communicable nodes and to the rest of the nodes in the greedy area, every neighbor node in the greedy area can be used for data forwarding without any packet duplication. Moreover, simulation results demonstrate a higher packet delivery ratio and shorter end-to-end delay than well-known comparative protocols. PMID:26057037
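Beaconless geographic protocols of this kind typically resolve the next hop through distance-based contention timers: each candidate forwarder in the greedy area sets a timer, and the node making the most progress toward the destination replies first and wins. A generic sketch of such a timer (not the authors' exact region-based priority scheme; coordinates and parameters are invented):

```python
import math

def contention_delay(node, sender, dest, radio_range, max_delay=0.1):
    """Return the contention timer (seconds) for a candidate forwarder;
    nodes making more progress toward dest get shorter timers and so
    suppress the others when their reply is overheard."""
    progress = math.dist(sender, dest) - math.dist(node, dest)
    if progress <= 0:
        return None               # no forward progress: not in the greedy area
    return max_delay * (1 - progress / radio_range)

# Destination at (100, 0): node b is closer to it, so its timer fires first.
sender, dest, radio_range = (0.0, 0.0), (100.0, 0.0), 30.0
a = contention_delay((10.0, 5.0), sender, dest, radio_range)
b = contention_delay((25.0, 2.0), sender, dest, radio_range)
```

The duplication problem the abstract targets arises when two winners cannot overhear each other's replies; the proposed protocol addresses this by giving mutually communicable nodes and the remaining greedy-area nodes different priority classes.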
NASA Astrophysics Data System (ADS)
Beck, L.; Wood, B.; Whitney, S.; Rossi, R.; Spanner, M.; Rodriguez, M.; Rodriguez-Ramirez, A.; Salute, J.; Legters, L.; Roberts, D.; Rejmankova, E.; Washino, R.
1993-08-01
This paper describes a procedure whereby remote sensing and geographic information system (GIS) technologies are used in a sample design to study the habitat of Anopheles albimanus, one of the principal vectors of malaria in Central America. This procedure incorporates Landsat-derived land cover maps with digital elevation and road network data to identify a random selection of larval habitats accessible for field sampling. At the conclusion of the sampling season, the larval counts will be used to determine habitat productivity, and then integrated with information on human settlement to assess where people are at high risk of malaria. This approach would be appropriate in areas where land cover information is lacking and problems of access constrain field sampling. The use of a GIS also permits other data (such as insecticide spraying data) to be incorporated in the sample design as they arise. This approach would also be pertinent for other tropical vector-borne diseases, particularly where human activities impact disease vector habitat.
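The sample design described above reduces to filtering candidate habitats by land-cover class and road accessibility, then drawing a simple random sample. A minimal sketch with invented fields and thresholds (not the authors' actual GIS workflow):

```python
import random

def select_sites(sites, target_cover, max_road_dist_km, n, seed=42):
    """Keep candidate larval habitats with the target land-cover class that
    lie within reach of the road network, then draw a simple random sample."""
    accessible = [s for s in sites
                  if s["cover"] == target_cover
                  and s["road_km"] <= max_road_dist_km]
    rng = random.Random(seed)   # fixed seed so the field list is reproducible
    return rng.sample(accessible, min(n, len(accessible)))

# Invented candidate sites derived from a land-cover map and road layer.
sites = [
    {"id": 1, "cover": "marsh",  "road_km": 0.5},
    {"id": 2, "cover": "forest", "road_km": 0.2},
    {"id": 3, "cover": "marsh",  "road_km": 3.0},
    {"id": 4, "cover": "marsh",  "road_km": 1.1},
]
chosen = select_sites(sites, "marsh", max_road_dist_km=2.0, n=2)
```

In a real workflow the candidate list would come from the Landsat classification joined against elevation and road layers, but the filter-then-sample logic is the same.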
Health safety of main water pipe materials supplied in China market.
Lu, Kai; Ding, Liang; Wang, Hong-Wei; Jing, Hai-Ning; Zhao, Xiao-Ning; Lin, Shao-Bin; Li, Ya-Dong; Jin, Yin-Long; Liu, Feng-Mao; Jiang, Shu-Ren
2006-04-01
To assess the health safety of copper, steel, and plastic water pipes by field water quality investigations, four consumers were randomly selected for each type of water pipe: two who had used the pipes for more than 1 year and two who had used them for less than 3 months. The terminal volume of tap water sampled from copper and steel water pipes was not less than 0.1 liter, whereas that from plastic water pipes was not less than 1 liter. The mean values of the experimental results in the second field water quality investigation of the copper and steel water pipes met the Sanitary Standards for Drinking Water Quality, and the water samples from the plastic pipes also met these requirements. Copper, steel, and plastic pipes can be used as drinking water pipes.
Flowe, Heather D; Stewart, Jade; Sleath, Emma R; Palmer, Francesca T
2011-01-01
Previous research has found that drinking establishments are often antecedent to sexual aggression outcomes. In this study, male participants were randomly selected from public houses (i.e., "pubs") and asked to imagine themselves in a hypothetical intimate encounter in which the female in the scenario stops consenting to sexual contact. Participants were given the option to continue making sexual advances up to and including sexual intercourse against the woman's will. It was hypothesized based on Alcohol Myopia Theory that participant blood alcohol concentration (BAC) levels would be associated with hypothetical sexual aggression when stereotypical cues of a woman's sexual availability (revealing clothing and alcohol use) were present in the scenario. Men's engagement in hypothetical sexual aggression was associated with BAC levels, but only when the woman was wearing revealing clothing. The sobriety of the female actor was not associated with sexual aggression. Results indicate that Alcohol Myopia Theory generalizes to a field setting. © 2011 Wiley Periodicals, Inc.