Sample records for randomly selected points

  1. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    PubMed Central

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, of which 52 (80.0%) were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  2. A new mosaic method for three-dimensional surface

    NASA Astrophysics Data System (ADS)

    Yuan, Yun; Zhu, Zhaokun; Ding, Yongjun

    2011-08-01

    Three-dimensional (3-D) data mosaicking is an indispensable link in surface measurement and digital terrain map generation. To address the problem of mosaicking locally unorganized point clouds with only rough registration and many mismatched points, a new RANSAC-based mosaic method for 3-D surfaces is proposed. Each iteration of the method proceeds through random sampling with an additional shape constraint, data normalization of the point clouds, absolute orientation, data denormalization, inlier counting, and so on. After N random sampling trials the largest consensus set is selected, and the model is finally re-estimated using all the points in the selected subset. The minimal subset consists of three non-collinear points forming a triangle; the shape of the triangle is taken into account during random sampling to keep the sample selection reasonable. A new coordinate-system transformation algorithm presented in this paper is used to avoid singularity: the whole rotation between the two coordinate systems is solved as two successive rotations expressed by Euler angle vectors, each with an explicit physical meaning. Both simulated and real data are used to verify the correctness and validity of the method. The method has good noise immunity owing to its robust-estimation property, and high accuracy because the shape constraint is added to the random sampling and data normalization is added to the absolute orientation. It is applicable to high-precision measurement of 3-D surfaces and to 3-D terrain mosaicking.
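
    A minimal sketch of the iteration just described, assuming src and dst are N x 3 NumPy arrays of roughly registered corresponding points; the triangle shape constraint is reduced to a single minimum-angle test, the normalization steps are omitted, and the absolute-orientation step is solved with the standard SVD (Kabsch) method rather than the paper's two-rotation Euler-angle construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rigid_fit(src, dst):
        """Least-squares rigid transform (R, t) mapping src -> dst (Kabsch/SVD)."""
        cs, cd = src.mean(0), dst.mean(0)
        U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cd - R @ cs

    def well_shaped(tri, min_angle_deg=20.0):
        """Shape constraint: reject a sample triangle that is nearly collinear."""
        v1, v2 = tri[1] - tri[0], tri[2] - tri[0]
        c = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0))) > min_angle_deg

    def ransac_mosaic(src, dst, n_trials=500, tol=0.05):
        """RANSAC over shape-constrained 3-point samples; the model is refit
        on the largest consensus set after all trials."""
        best = np.zeros(len(src), dtype=bool)
        for _ in range(n_trials):
            idx = rng.choice(len(src), 3, replace=False)
            if not well_shaped(src[idx]):
                continue
            R, t = rigid_fit(src[idx], dst[idx])
            inliers = np.linalg.norm(src @ R.T + t - dst, axis=1) < tol
            if inliers.sum() > best.sum():
                best = inliers
        if best.sum() < 3:
            raise ValueError("no consensus set found")
        return rigid_fit(src[best], dst[best])
    ```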

  3. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data, creating a histogram, evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
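
    A hedged sketch of the split step described above, with Gini gain standing in for "some criterion" and y assumed to hold non-negative integer class labels; only the histogram-plus-randomized-split idea is taken from the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def gini(labels):
        """Gini impurity of an integer label vector."""
        p = np.bincount(labels) / len(labels)
        return 1.0 - np.sum(p * p)

    def histogram_random_split(x, y, n_bins=32):
        """Evaluate candidate splits only at histogram bin edges, then draw the
        final split point uniformly at random in the interval around the best
        edge (the randomization that decorrelates trees in the ensemble)."""
        edges = np.histogram_bin_edges(x, bins=n_bins)
        best, best_gain = 1, -np.inf
        for i in range(1, n_bins):                    # interior edges only
            left, right = y[x <= edges[i]], y[x > edges[i]]
            if len(left) == 0 or len(right) == 0:
                continue
            gain = gini(y) - (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if gain > best_gain:
                best, best_gain = i, gain
        return rng.uniform(edges[best - 1], edges[best + 1])
    ```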

  4. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, conditional inference forests use linear rank statistics by default to select the optimal splitting variable, and these statistics cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories, and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
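
    An illustrative sketch of a maximally selected rank statistic, substituting a Wilcoxon rank-sum z-score for the standardized log-rank statistic and a crude Bonferroni correction for the paper's sharper p-value approximations; comparing variables on this adjusted p-value scale is what removes the bias toward variables with many split points.

    ```python
    import numpy as np
    from scipy.stats import norm, ranksums

    def max_selected_p(x, response):
        """Maximally selected rank statistic (sketch): standardize a two-sample
        rank statistic at every candidate split point of x, take the maximum,
        and adjust the p-value for the number of splits tried."""
        splits = np.unique(x)[:-1]                     # every candidate cut point
        z = [abs(ranksums(response[x <= c], response[x > c]).statistic)
             for c in splits]
        p_raw = 2.0 * norm.sf(max(z))                  # two-sided tail probability
        return min(1.0, p_raw * len(splits))           # crude multiplicity adjustment
    ```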

  5. Evaluating the Variations in the Flood Susceptibility Maps Accuracies due to the Alterations in the Type and Extent of the Flood Inventory

    NASA Astrophysics Data System (ADS)

    Tehrany, M. Sh.; Jones, S.

    2017-10-01

    This paper explores the influence of the extent and density of inventory data on the final outcomes of flood susceptibility mapping; specifically, it examines the impact of different formats and extents of the flood inventory data on the final susceptibility map. The extreme 2011 Brisbane flood event was used as the case study. Logistic Regression (LR) was selected for the modelling because it is a well-known algorithm in natural hazard modelling, being easy to interpret, fast to process, and accurate. The LR model was applied using both polygon and point formats of the inventory data. Random point sets of 1000, 700, 500, 300, 100, and 50 were selected, and susceptibility mapping was undertaken using each group of random points. The resultant maps were assessed visually and statistically using the Area Under the Curve (AUC) method. The prediction rates measured for the susceptibility maps produced by the polygon format and by 1000, 700, 500, 300, 100, and 50 random points were 63%, 76%, 88%, 80%, 74%, 71%, and 65%, respectively. Evidently, using the polygon format of the inventory data did not lead to reasonable outcomes. In the case of random points, raising the number of points increased the prediction rate, except for 1000 points. Hence, minimum and maximum thresholds for the extent of the inventory must be set prior to the analysis. It is concluded that the extent and format of the inventory data are two of the influential components in the precision of the modelling.
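
    A sketch of one arm of this experiment using scikit-learn; the array names (X_flood, X_area, X_test, y_test) are illustrative placeholders, with rows as map locations and columns as flood conditioning factors.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    def prediction_rate(X_flood, X_area, n_points, X_test, y_test):
        """Sample n_points inventory points, pair them with an equal number of
        non-flood points, fit LR on the conditioning factors, and report AUC
        (the 'prediction rate') on held-out data."""
        pos = X_flood[rng.choice(len(X_flood), n_points, replace=False)]
        neg = X_area[rng.choice(len(X_area), n_points, replace=False)]
        X = np.vstack([pos, neg])
        y = np.r_[np.ones(n_points), np.zeros(n_points)]
        model = LogisticRegression(max_iter=1000).fit(X, y)
        return roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    # e.g. repeat for n in (50, 100, 300, 500, 700, 1000) to trace the curve
    ```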

  6. Yet another method for triangulation and contouring for automated cartography

    NASA Technical Reports Server (NTRS)

    De Floriani, L.; Falcidieno, B.; Nagy, G.; Pienovi, C.

    1982-01-01

    An algorithm is presented for hierarchical subdivision of a set of three-dimensional surface observations. The data structure used for obtaining the desired triangulation is also singularly appropriate for extracting contours. Some examples are presented, and the results obtained are compared with those given by Delaunay triangulation. The data points selected by the algorithm provide a better approximation to the desired surface than do randomly selected points.

  7. GIS-based support vector machine modeling of earthquake-triggered landslide susceptibility in the Jianjiang River watershed, China

    NASA Astrophysics Data System (ADS)

    Xu, Chong; Dai, Fuchu; Xu, Xiwei; Lee, Yuan Hsi

    2012-04-01

    Support vector machine (SVM) modeling is based on statistical learning theory. It involves a training phase with associated input and target output values. In recent years, the method has become increasingly popular. The main purpose of this study is to evaluate the mapping power of SVM modeling in earthquake-triggered landslide-susceptibility mapping for a section of the Jianjiang River watershed using Geographic Information System (GIS) software. The river was affected by the Wenchuan earthquake of May 12, 2008. Visual interpretation of colored aerial photographs of 1-m resolution and extensive field surveys provided a detailed landslide inventory map containing 3147 landslides related to the 2008 Wenchuan earthquake. Elevation, slope angle, slope aspect, distance from seismogenic faults, distance from drainages, and lithology were used as the controlling parameters. For modeling, three groups of positive and negative training samples were used in concert with four different kernel functions. Positive training samples include the centroids of 500 large landslides, those of all 3147 landslides, and 5000 randomly selected points in landslide polygons. Negative training samples include 500, 3147, and 5000 randomly selected points on slopes that remained stable during the Wenchuan earthquake. The four kernel functions are linear, polynomial, radial basis, and sigmoid. In total, 12 cases of landslide susceptibility were mapped. Comparative analyses of landslide-susceptibility probability and area relation curves show that both the polynomial and radial basis functions suitably classified the input data as either landslide positive or negative, though the radial basis function was more successful. The 12 generated landslide-susceptibility maps were compared with known landslide centroid locations and landslide polygons to verify the success rate and predictive accuracy of each model. The 12 results were further validated using area-under-curve analysis. Group 3, with 5000 randomly selected points in the landslide polygons and 5000 randomly selected points along stable slopes, gave the best results, with a success rate of 79.20% and predictive accuracy of 79.13% under the radial basis function. Of all the results, the sigmoid kernel function was the least skillful when used in concert with the centroid data of all 3147 landslides as positive training samples and 3147 randomly selected points in regions of stable slope as negative training samples (success rate = 54.95%; predictive accuracy = 61.85%). This paper also provides suggestions and reference data for selecting appropriate training samples and kernel function types for earthquake-triggered landslide-susceptibility mapping using SVM modeling. Predictive landslide-susceptibility maps could be useful in hazard mitigation by helping planners understand the probability of landslides in different regions.
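
    A compact sketch of the 12-case design (three training-sample groups times four kernels) with scikit-learn; the groups dictionary and validation arrays are illustrative placeholders for the six controlling parameters sampled at positive and negative locations.

    ```python
    from sklearn.metrics import roc_auc_score
    from sklearn.svm import SVC

    def fit_cases(groups, X_valid, y_valid):
        """groups maps a sample-group name to (X_train, y_train); returns an
        AUC for each of the 3 groups x 4 kernels = 12 cases."""
        results = {}
        for name, (X_train, y_train) in groups.items():
            for kernel in ("linear", "poly", "rbf", "sigmoid"):
                clf = SVC(kernel=kernel, probability=True).fit(X_train, y_train)
                proba = clf.predict_proba(X_valid)[:, 1]
                results[(name, kernel)] = roc_auc_score(y_valid, proba)
        return results
    ```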

  8. Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng

    2016-05-01

    Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.

  9. Preliminary classification of forest vegetation of the Kenai Peninsula, Alaska.

    Treesearch

    K.M. Reynolds

    1990-01-01

    A total of 5,597 photo points were systematically located on 1:60,000-scale high-altitude photographs of the Kenai Peninsula, Alaska; photo interpretation was used to classify the vegetation at each grid position. Of the total grid points, 12.3 percent were classified as timberland; 129 photo points within the timberland class were randomly selected for field survey....

  10. Trade-off study and computer simulation for assessing spacecraft pointing accuracy and stability capabilities

    NASA Astrophysics Data System (ADS)

    Algrain, Marcelo C.; Powers, Richard M.

    1997-05-01

    A case study, written in a tutorial manner, is presented where a comprehensive computer simulation is developed to determine the driving factors contributing to spacecraft pointing accuracy and stability. Models for major system components are described. Among them are spacecraft bus, attitude controller, reaction wheel assembly, star-tracker unit, inertial reference unit, and gyro drift estimators (Kalman filter). The predicted spacecraft performance is analyzed for a variety of input commands and system disturbances. The primary deterministic inputs are the desired attitude angles and rate set points. The stochastic inputs include random torque disturbances acting on the spacecraft, random gyro bias noise, gyro random walk, and star-tracker noise. These inputs are varied over a wide range to determine their effects on pointing accuracy and stability. The results are presented in the form of trade-off curves designed to facilitate the proper selection of subsystems so that overall spacecraft pointing accuracy and stability requirements are met.

  11. Evolving artificial metalloenzymes via random mutagenesis

    NASA Astrophysics Data System (ADS)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  12. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    PubMed

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetic and control subjects versus standard wet chemistry (hexokinase) methods in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive relationship (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy measurement set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, the linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements of the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.

  13. Pseudorandom number generation using chaotic true orbits of the Bernoulli map

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, Asaki, E-mail: saito@fun.ac.jp; Yamaguchi, Akihiro

    We devise a pseudorandom number generator that exactly computes chaotic true orbits of the Bernoulli map on quadratic algebraic integers. Moreover, we describe a way to select the initial points (seeds) for generating multiple pseudorandom binary sequences. This selection method distributes the initial points almost uniformly (equidistantly) in the unit interval, and latter parts of the generated sequences are guaranteed not to coincide. We also demonstrate through statistical testing that the generated sequences possess good randomness properties.
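
    The paper computes true orbits over quadratic algebraic integers; the sketch below illustrates the same exactness idea on Q(sqrt(2)) using Python fractions, without reproducing the paper's seed-spacing scheme. Each output bit is floor(2x), and the state update x -> 2x mod 1 is carried out with no rounding error.

    ```python
    from fractions import Fraction

    D = 2  # fixed non-square integer; states are x = a + b*sqrt(D), a and b rational

    def sign(a, b):
        """Exact sign of a + b*sqrt(D) using only rational arithmetic."""
        if b == 0:
            return (a > 0) - (a < 0)
        if a == 0:
            return (b > 0) - (b < 0)
        if a > 0 and b > 0:
            return 1
        if a < 0 and b < 0:
            return -1
        lhs, rhs = a * a, b * b * D        # compare |a| and |b|*sqrt(D), squared
        if a > 0:                          # a > 0, b < 0
            return (lhs > rhs) - (lhs < rhs)
        return (rhs > lhs) - (rhs < lhs)   # a < 0, b > 0

    def bernoulli_bits(a, b, n):
        """n output bits of the true orbit of the Bernoulli map x -> 2x mod 1,
        started at the irrational seed x0 = a + b*sqrt(D) in [0, 1)."""
        bits = []
        for _ in range(n):
            bit = 1 if sign(a - Fraction(1, 2), b) >= 0 else 0  # bit = floor(2x)
            bits.append(bit)
            a, b = 2 * a - bit, 2 * b                           # exact update
        return bits

    print(bernoulli_bits(Fraction(0), Fraction(1, 2), 32))      # seed sqrt(2)/2
    ```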

  14. Multivariate random regression analysis for body weight and main morphological traits in genetically improved farmed tilapia (Oreochromis niloticus).

    PubMed

    He, Jie; Zhao, Yunfeng; Zhao, Jingli; Gao, Jin; Han, Dandan; Xu, Pao; Yang, Runqing

    2017-11-02

    Because of their high economic importance, growth traits in fish are under continuous improvement. For growth traits that are recorded at multiple time-points in life, the use of univariate and multivariate animal models is limited because of the variable and irregular timing of these measures. Thus, the univariate random regression model (RRM) was introduced for the genetic analysis of dynamic growth traits in fish breeding. We used a multivariate random regression model (MRRM) to analyze genetic changes in growth traits recorded at multiple time-points in genetically improved farmed tilapia. Legendre polynomials of different orders were applied to characterize the influences of fixed and random effects on growth trajectories. The final MRRM was determined by optimizing the univariate RRM for each analyzed trait separately via the adaptively penalized likelihood criterion, which is superior to both the Akaike information criterion and the Bayesian information criterion. In the selected MRRM, the additive genetic effects were modeled by Legendre polynomials of order three for body weight (BWE) and body length (BL) and of order two for body depth (BD). By using the covariance functions of the MRRM, estimated heritabilities were between 0.086 and 0.628 for BWE, 0.155 and 0.556 for BL, and 0.056 and 0.607 for BD. Only heritabilities for BD measured from 60 to 140 days of age were consistently higher than those estimated by the univariate RRM. All genetic correlations between single or pairwise growth time-points exceeded 0.5. Moreover, correlations between early and late growth time-points were lower. Thus, for phenotypes that are measured repeatedly in aquaculture, an MRRM can enhance the efficiency of comprehensive selection for BWE and the main morphological traits.
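
    A small sketch of how Legendre covariates of a given order enter a random regression design matrix, with ages rescaled to [-1, 1]; the age window (60-140 days) follows the abstract, and the function name is illustrative.

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    def legendre_design(ages, order, lo=60.0, hi=140.0):
        """Columns P_0(t)..P_order(t) of a random regression design matrix,
        where t is age rescaled to [-1, 1]."""
        t = 2.0 * (np.asarray(ages, float) - lo) / (hi - lo) - 1.0
        cols = [legendre.legval(t, np.eye(order + 1)[k]) for k in range(order + 1)]
        return np.column_stack(cols)

    Z_bwe = legendre_design([60, 80, 100, 120, 140], order=3)  # e.g. order 3 for BWE
    ```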

  15. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  16. Random covering of the circle: the configuration-space of the free deposition process

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry

    2003-12-01

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard-rod and packing constraints are both fulfilled, and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting of selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Rényi's random sequential adsorption model.
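
    A Monte Carlo sketch of these configuration probabilities under one consistent reading of the constraints (the "gap" left by a rod is the spacing to the next point minus s, so packing requires spacing < 2s while hard rods require spacing >= s):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def configuration_probs(n, s, trials=20_000):
        """Estimate P(hard), P(packing), P(parking), P(covering) for n points
        dropped uniformly on a unit circle with clockwise rods of length s."""
        hits = {"hard": 0, "packing": 0, "parking": 0, "covering": 0}
        for _ in range(trials):
            x = np.sort(rng.random(n))
            spacing = np.diff(np.append(x, x[0] + 1.0))   # wraps around the circle
            hard = bool(np.all(spacing >= s))             # no rod overlap
            packing = bool(np.all(spacing < 2.0 * s))     # every uncovered gap < s
            hits["hard"] += hard
            hits["packing"] += packing
            hits["parking"] += hard and packing
            hits["covering"] += bool(np.all(spacing <= s))
        return {k: v / trials for k, v in hits.items()}

    print(configuration_probs(n=6, s=0.12))
    ```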

  17. Genetic algorithms applied to the scheduling of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Sponsler, Jeffrey L.

    1989-01-01

    A prototype system employing a genetic algorithm (GA) has been developed to support the scheduling of the Hubble Space Telescope. A non-standard knowledge structure is used and appropriate genetic operators have been created. Several different crossover styles (random point selection, evolving points, and smart point selection) are tested and the best GA is compared with a neural network (NN) based optimizer. The smart crossover operator produces the best results and the GA system is able to evolve complete schedules using it. The GA is not as time-efficient as the NN system and the NN solutions tend to be better.
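
    A sketch of the simplest of the three crossover styles; the schedule encoding is reduced to a flat sequence here, which is a simplification of the paper's non-standard knowledge structure.

    ```python
    import random

    def one_point_crossover(a, b):
        """'Random point selection' crossover: cut both parent schedules at one
        random locus and swap tails; an 'evolving points' or 'smart' variant
        would choose the cut using feedback or domain knowledge instead."""
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:], b[:cut] + a[cut:]

    child1, child2 = one_point_crossover(list("ABCDEF"), list("UVWXYZ"))
    ```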

  18. Applications of random forest feature selection for fine-scale genetic population assignment.

    PubMed

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNP) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
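
    A sketch of the plain random-forest ranking step with scikit-learn (the regularized and guided-regularized variants are not reproduced); argument names are illustrative.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def rf_snp_panel(genotypes, populations, panel_size):
        """Fit a random forest to the full SNP matrix, rank loci by impurity
        importance, and keep the top panel_size loci as the assignment panel."""
        rf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
        rf.fit(genotypes, populations)
        return np.argsort(rf.feature_importances_)[::-1][:panel_size]
    ```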

  19. Selective randomized load balancing and mesh networks with changing demands

    NASA Astrophysics Data System (ADS)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a third, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.
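
    A toy sketch contrasting the two routing ideas; the node and hub lists are placeholders, and the capacity planning that is the actual subject of the paper is out of scope.

    ```python
    import random

    def rlb_path(src, dst, nodes):
        """Valiant RLB: relay each flow through a uniformly random intermediate
        node, making the carried load oblivious to the demand matrix."""
        return [src, random.choice(nodes), dst]

    def srlb_path(src, dst, hubs):
        """Selective RLB: the same two-hop idea, but the relay is drawn only
        from a small, well-placed hub set, trading full randomization for
        lower cost, delay, and delay jitter."""
        return [src, random.choice(hubs), dst]
    ```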

  20. Habitat selection by juvenile Mojave Desert tortoises

    USGS Publications Warehouse

    Todd, Brian D; Halstead, Brian J.; Chiquoine, Lindsay P.; Peaden, J. Mark; Buhlmann, Kurt A.; Tuberville, Tracey D.; Nafus, Melia G.

    2016-01-01

    Growing pressure to develop public lands for renewable energy production places several protected species at increased risk of habitat loss. One example is the Mojave desert tortoise (Gopherus agassizii), a species often at the center of conflicts over public land development. For this species and others on public lands, a better understanding of their habitat needs can help minimize negative impacts and facilitate protection or restoration of habitat. We used radio-telemetry to track 46 neonate and juvenile tortoises in the Eastern Mojave Desert, California, USA, to quantify habitat at tortoise locations and paired random points to assess habitat selection. Tortoise locations near burrows were more likely to be under canopy cover and had greater coverage of perennial plants (especially creosote [Larrea tridentata]), more coverage by washes, a greater number of small-mammal burrows, and fewer white bursage (Ambrosia dumosa) than random points. Active tortoise locations away from burrows were closer to washes and perennial plants than were random points. Our results can help planners locate juvenile tortoises and avoid impacts to habitat critical for this life stage. Additionally, our results provide targets for habitat protection and restoration and suggest that diverse and abundant small-mammal populations and the availability of creosote bush are vital for juvenile desert tortoises in the Eastern Mojave Desert.

  1. Evaluating Concentrations of Heavy Metals in the U.S. Peanut Crop in the Presence of Detection Limits

    USDA-ARS's Scientific Manuscript database

    The concentration of mercury, cadmium, lead, and arsenic along with glyphosate and an extensive array of pesticides in the U.S. peanut crop was assessed for crop years 2013-2015. Samples were randomly selected from various buying points during the grading process. Samples were selected from the thre...

  2. The quality of reporting of randomized controlled trials of traditional Chinese medicine: a survey of 13 randomly selected journals from mainland China.

    PubMed

    Wang, Gang; Mao, Bing; Xiong, Ze-Yu; Fan, Tao; Chen, Xiao-Dong; Wang, Lei; Liu, Guan-Jian; Liu, Jia; Guo, Jia; Chang, Jing; Wu, Tai-Xiang; Li, Ting-Qian

    2007-07-01

    The number of randomized controlled trials (RCTs) of traditional Chinese medicine (TCM) is increasing. However, there have been few systematic assessments of the quality of reporting of these trials. This study was undertaken to evaluate the quality of reporting of RCTs in TCM journals published in mainland China from 1999 to 2004. Thirteen TCM journals were randomly selected by stratified sampling of the approximately 100 TCM journals published in mainland China. All issues of the selected journals published from 1999 to 2004 were hand-searched according to guidelines from the Cochrane Centre. All reviewers underwent training in the evaluation of RCTs at the Chinese Centre of Evidence-based Medicine. A comprehensive quality assessment of each RCT was completed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) checklist (total of 30 items) and the Jadad scale. Disagreements were resolved by consensus. Seven thousand four hundred twenty-two RCTs were identified. The proportion of published RCTs relative to all types of published clinical trials increased significantly over the period studied, from 18.6% in 1999 to 35.9% in 2004 (P < 0.001). The mean (SD) Jadad score was 1.03 (0.61) overall. One RCT had a Jadad score of 5 points; 14 had a score of 4 points; and 102 had a score of 3 points. The mean (SD) Jadad score was 0.85 (0.53) in 1999 (746 RCTs) and 1.20 (0.62) in 2004 (1634 RCTs). Across all trials, 39.4% of the items on the modified CONSORT checklist were reported, which was equivalent to 11.82 (5.78) of the 30 items. Some important methodologic components of RCTs were incompletely reported, such as sample-size calculation (reported in 1.1% of RCTs), randomization sequence (7.9%), allocation concealment (0.3%), implementation of the random-allocation sequence (0%), and intention-to-treat analysis (0%). The findings of this study indicate that the quality of reporting of RCTs of TCM has improved, but remains poor.

  3. The genealogy of samples in models with selection.

    PubMed

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  4. The Genealogy of Samples in Models with Selection

    PubMed Central

    Neuhauser, C.; Krone, S. M.

    1997-01-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case. PMID:9071604

  5. Line-of-sight pointing accuracy/stability analysis and computer simulation for small spacecraft

    NASA Astrophysics Data System (ADS)

    Algrain, Marcelo C.; Powers, Richard M.

    1996-06-01

    This paper presents a case study where a comprehensive computer simulation is developed to determine the driving factors contributing to spacecraft pointing accuracy and stability. The simulation is implemented using XMATH/SystemBuild software from Integrated Systems, Inc. The paper is written in a tutorial manner and models for major system components are described. Among them are spacecraft bus, attitude controller, reaction wheel assembly, star-tracker unit, inertial reference unit, and gyro drift estimators (Kalman filter). The predicted spacecraft performance is analyzed for a variety of input commands and system disturbances. The primary deterministic inputs are desired attitude angles and rate setpoints. The stochastic inputs include random torque disturbances acting on the spacecraft, random gyro bias noise, gyro random walk, and star-tracker noise. These inputs are varied over a wide range to determine their effects on pointing accuracy and stability. The results are presented in the form of trade-off curves designed to facilitate the proper selection of subsystems so that overall spacecraft pointing accuracy and stability requirements are met.

  6. Real-time measurement of quality during the compaction of subgrade soils.

    DOT National Transportation Integrated Search

    2012-12-01

    Conventional quality control of subgrade soils during their compaction is usually performed by monitoring moisture content and dry density at a few discrete locations. However, randomly selected points do not adequately represent the entire compacted...

  7. Experimental Design Considerations for Establishing an Off-Road, Habitat-Specific Bird Monitoring Program Using Point Counts

    Treesearch

    JoAnn M. Hanowski; Gerald J. Niemi

    1995-01-01

    We established bird monitoring programs in two regions of Minnesota: the Chippewa National Forest and the Superior National Forest. The experimental design defined forest cover types as strata in which samples of forest stands were randomly selected. Subsamples (3 point counts) were placed in each stand to maximize field effort and to assess within-stand and between-...

  8. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    NASA Astrophysics Data System (ADS)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
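
    A simulation sketch of the central trade-off, assuming a hypothetical per-image cover field: with total effort fixed (1000 point scores), spreading points over more images yields a more precise cover estimate than scoring many points in few images, because between-image variance dominates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cover_estimate(image_covers, n_images, n_points):
        """Score n_points random points in each of n_images sampled images and
        return the estimated mean percent cover; image_covers holds the
        hypothetical true cover fraction of the target biota per image."""
        sampled = rng.choice(image_covers, size=n_images, replace=False)
        hits = rng.binomial(n_points, sampled)     # points landing on the biota
        return (hits / n_points).mean()

    covers = rng.beta(2, 8, size=500)              # toy per-image cover field
    more_images = [cover_estimate(covers, 100, 10) for _ in range(300)]
    more_points = [cover_estimate(covers, 10, 100) for _ in range(300)]
    print(np.std(more_images), np.std(more_points))  # same effort, different precision
    ```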

  9. Speeding up Coarse Point Cloud Registration by Threshold-Independent Baysac Match Selection

    NASA Astrophysics Data System (ADS)

    Kang, Z.; Lindenbergh, R.; Pu, S.

    2016-06-01

    This paper presents an algorithm for the automatic registration of terrestrial point clouds by match selection using an efficient conditional sampling method, threshold-independent BaySAC (BAYes SAmpling Consensus), and employs the error metric of average point-to-surface residual to reduce the random measurement error and thereby approach the real registration error. BaySAC and other basic sampling algorithms usually need to artificially determine a threshold by which inlier points are identified, which leads to a threshold-dependent verification process. Therefore, we applied the LMedS method to construct the cost function that is used to determine the optimum model, to reduce the influence of human factors and improve the robustness of the model estimate. Point-to-point and point-to-surface error metrics are most commonly used. However, point-to-point error in general consists of at least two components, random measurement error and systematic error as a result of a remaining error in the found rigid body transformation. Thus we employ the measure of the average point-to-surface residual to evaluate the registration accuracy. The proposed approaches, together with a traditional RANSAC approach, are tested on four data sets acquired by three different scanners in terms of their computational efficiency and quality of the final registration. The registration results show the standard deviation of the average point-to-surface residuals is reduced from 1.4 cm (plain RANSAC) to 0.5 cm (threshold-independent BaySAC). The results also show that, compared to the performance of RANSAC, our BaySAC strategies lead to fewer iterations and cheaper computational cost when the hypothesis set is contaminated with more outliers.
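
    A sketch of the BaySAC selection and update steps; the posterior applied on hypothesis failure is one common BaySAC formulation, stated here as an assumption, and the inlier probabilities p are assumed to lie strictly between 0 and 1.

    ```python
    import numpy as np

    def baysac_pick(p, k=3):
        """Instead of RANSAC's random minimal sets, deterministically pick the
        k matches currently most likely to be inliers."""
        return np.argsort(p)[::-1][:k]

    def baysac_update_on_failure(p, chosen):
        """Bayes update when a fitted hypothesis fails, i.e. at least one chosen
        match was an outlier (assumed formulation):
        p_i <- p_i * (1 - prod_{j != i} p_j) / (1 - prod_j p_j) for i in the set."""
        old = p[chosen].copy()
        prod_all = np.prod(old)
        for idx, i in enumerate(chosen):
            prod_others = prod_all / old[idx]
            p[i] = old[idx] * (1.0 - prod_others) / (1.0 - prod_all)
        return p
    ```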

  10. Augmentation effect of acupuncture on Bi'nao for hypophasis in patients with Bell's palsy: study protocol for a randomized controlled trial.

    PubMed

    Li, Xiaoyan; Chen, Chunlan; Zhao, Chuang; Li, Zunyuan; Liang, Wei; Liu, Zhidan

    2018-06-11

    Hypophasis is one of the most frequently observed sequelae in patients with Bell's palsy who have not recovered completely, creating a clinical difficulty for physicians. Acupuncture therapy has been widely used to treat Bell's palsy as a reasonable option for the management of symptoms such as hypophasis. A number of acupuncture points (acu-points) are frequently selected in acupuncture therapy; however, their efficacy has not been proven. According to the literature review, Bi'nao was useful for treating eye and eyelid diseases, although this could be supported only by some successful cases. Thus, a randomized controlled trial was designed to evaluate the efficacy of the acu-point Bi'nao. Participants with hypophasis as the major symptom are selected among patients with Bell's palsy and randomly allocated into one of three groups at a 1:1:1 allocation ratio. All participants receive conventional acupuncture therapy; however, those assigned to the real acupuncture group will be given added acupuncture therapy on the acu-point Bi'nao, while those assigned to the sham acupuncture group will be given extra acupuncture therapy on the sham Bi'nao as a placebo. The efficacy of the acupuncture therapy on the acu-point Bi'nao for hypophasis will be evaluated by Eye Crack Width Measurement (ECWM) and Eyelid Strength Assessment (ESA) before and after therapy. This is the first study assessing the safety and efficacy of Bi'nao in treating the hypophasis of patients with Bell's palsy, which might support the application of this acupuncture therapy. However, evaluating hypophasis is challenging, and, thus, ECWM and ESA were applied to measure the eyelid movement. Chinese Clinical Trials Registry, ChiCTR-INR-17012955. Registered on 12 October 2017.

  11. TRUNCATED RANDOM MEASURES

    DTIC Science & Technology

    2018-01-12

    sequential representations, a method is required for determining which to use for the application at hand and, once a representation is selected, for... Methods, Assumptions, and Procedures. 3.1 Background. 3.1.1 CRMs and truncation. Consider a Poisson point process on R+ := [0... the heart of the study of truncated CRMs. They provide an iterative method that can be terminated at any point to yield a finite approximation to the

  12. Registration algorithm of point clouds based on multiscale normal features

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Peng, Zhongtao; Su, Hang; Xia, GuiHua

    2015-01-01

    The point cloud registration technology for obtaining a three-dimensional digital model is widely applied in many areas. To improve the accuracy and speed of point cloud registration, a registration method based on multiscale normal vectors is proposed. The proposed registration method mainly includes three parts: the selection of key points, the calculation of feature descriptors, and the determination and optimization of correspondences. First, key points are selected from the point cloud based on the changes of magnitude of multiscale curvatures obtained by using principal components analysis. Then the feature descriptor of each key point is constructed, consisting of 21 elements based on multiscale normal vectors and curvatures. The correspondences in a pair of two point clouds are determined according to the descriptor similarity of key points in the source point cloud and target point cloud. Correspondences are optimized by using a random sampling consistency algorithm and clustering technology. Finally, singular value decomposition is applied to the optimized correspondences so that the rigid transformation matrix between two point clouds is obtained. Experimental results show that the proposed point cloud registration algorithm has a faster calculation speed, higher registration accuracy, and better antinoise performance.
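
    A sketch of the PCA step behind such multiscale descriptors: for one neighborhood, the eigenvector with the smallest eigenvalue approximates the normal, and the "surface variation" ratio serves as the curvature proxy whose change across neighborhood scales drives key-point selection.

    ```python
    import numpy as np

    def normal_and_curvature(neighborhood):
        """PCA over an (N, 3) neighborhood: returns (unit normal estimate,
        surface variation lambda_min / (lambda_0 + lambda_1 + lambda_2))."""
        q = neighborhood - neighborhood.mean(axis=0)
        w, v = np.linalg.eigh(q.T @ q / len(neighborhood))  # ascending eigenvalues
        return v[:, 0], w[0] / w.sum()
    ```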

  13. Roosting habitat use and selection by northern spotted owls during natal dispersal

    USGS Publications Warehouse

    Sovern, Stan G.; Forsman, Eric D.; Dugger, Catherine M.; Taylor, Margaret

    2015-01-01

    We studied habitat selection by northern spotted owls (Strix occidentalis caurina) during natal dispersal in Washington State, USA, at both the roost site and landscape scales. We used logistic regression to obtain parameters for an exponential resource selection function based on vegetation attributes in roost and random plots in 76 forest stands that were used for roosting. We used a similar analysis to evaluate selection of landscape habitat attributes based on 301 radio-telemetry relocations and random points within our study area. We found no evidence of within-stand selection for any of the variables examined, but 78% of roosts were in stands with at least some large (>50 cm dbh) trees. At the landscape scale, owls selected for stands with high canopy cover (>70%). Dispersing owls selected vegetation types that were more similar to habitat selected by adult owls than habitat that would result from following guidelines previously proposed to maintain dispersal habitat. Our analysis indicates that juvenile owls select stands for roosting that have greater canopy cover than is recommended in current agency guidelines.

  14. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection.

    PubMed

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-06-01

    Electroencephalogram (EEG) signals are used broadly in the medical fields. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease, and sleep problems. This paper presents a new method which extracts and selects features from multi-channel EEG signals. This research focuses on three main points. Firstly, the simple random sampling (SRS) technique is used to extract features from the time domain of EEG signals. Secondly, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the selected features are forwarded to a least squares support vector machine (LS_SVM) classifier to classify the EEG signals. The LS_SVM classifier classified the features which were extracted and selected by the SRS and the SFS. The experimental results show that the method achieves 99.90%, 99.80% and 100% for classification accuracy, sensitivity and specificity, respectively.
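
    A compressed sketch of the SRS + SFS pipeline with scikit-learn, substituting a standard RBF-kernel SVC for the LS_SVM classifier; the data and sizes are toy values.

    ```python
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    def srs_features(epochs, n_keep=32):
        """Simple random sampling: keep a random subset of time-domain samples
        from every epoch as the raw feature vector."""
        idx = np.sort(rng.choice(epochs.shape[1], size=n_keep, replace=False))
        return epochs[:, idx]

    X = srs_features(rng.standard_normal((120, 1024)))   # 120 toy epochs
    y = rng.integers(0, 2, size=120)                     # toy class labels
    sfs = SequentialFeatureSelector(SVC(kernel="rbf"), n_features_to_select=8)
    X_key = sfs.fit_transform(X, y)                      # SFS keeps 8 key features
    ```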

  15. Random forest feature selection approach for image segmentation

    NASA Astrophysics Data System (ADS)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin

    2017-03-01

    In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors create their discriminative model by using many features without applying any selection criteria. A more reliable model can be built by using a framework that selects the variables that are important from the point of view of the classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an algorithm based on RF that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
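
    A sketch of importance-based dimensionality reduction with a random forest, using simple iterative elimination; the stopping rule and drop fraction are illustrative choices, not the authors'.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def rf_eliminate(X, y, target_dim, drop_frac=0.2):
        """Repeatedly refit the forest and drop the least important fraction of
        the remaining features until target_dim features are left."""
        keep = np.arange(X.shape[1])
        while len(keep) > target_dim:
            rf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
            rf.fit(X[:, keep], y)
            order = np.argsort(rf.feature_importances_)        # ascending
            n_drop = min(max(1, int(drop_frac * len(keep))), len(keep) - target_dim)
            keep = keep[order[n_drop:]]                        # drop the weakest
        return keep
    ```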

  16. A sampling strategy to estimate the area and perimeter of irregularly shaped planar regions

    Treesearch

    Timothy G. Gregoire; Harry T. Valentine

    1995-01-01

    The length of a randomly oriented ray emanating from an interior point of a planar region can be used to unbiasedly estimate the region's area and perimeter. Estimators and corresponding variance estimators under various selection strategies are presented.
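
    A Monte Carlo check of the area half of this claim for a region that is star-shaped about the interior point: with theta uniform on [0, 2*pi), E[pi * r(theta)^2] = (1/2) * integral of r(theta)^2 d(theta), which is the area. The perimeter estimator is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ray_area_estimate(r_theta, n=10_000):
        """Average of pi * r(theta)^2 over random ray orientations; unbiased for
        the area when the region is star-shaped about the ray origin."""
        theta = rng.uniform(0.0, 2.0 * np.pi, n)
        return np.mean(np.pi * r_theta(theta) ** 2)

    # Unit disc with the ray origin at its centre: true area is pi.
    print(ray_area_estimate(lambda t: np.ones_like(t)))
    ```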

  17. Stem Sinuosity, Tree Size, and Pest Injury of Machine-Planted Trees with and without Bent Taproots: A Comparison of Loblolly and Slash Pine

    Treesearch

    Jason A. Gatch; Timothy B. Harrington; Terry S. Price; M. Boyd Edwards

    1999-01-01

    Twenty-four machine-planted stands each of slash pine (Pinus elliottii Engelm.) and loblolly pine (Pinus taeda L.) (between 3 and 10 years old) were randomly selected in the Coastal Plain and Piedmont of Georgia, respectively. Ten points per site were located along a transect and two planted trees within a 10-m radius of each point were...

  18. The Aged Residential Care Healthcare Utilization Study (ARCHUS): a multidisciplinary, cluster randomized controlled trial designed to reduce acute avoidable hospitalizations from long-term care facilities.

    PubMed

    Connolly, Martin J; Boyd, Michal; Broad, Joanna B; Kerse, Ngaire; Lumley, Thomas; Whitehead, Noeline; Foster, Susan

    2015-01-01

    To assess effect of a complex, multidisciplinary intervention aimed at reducing avoidable acute hospitalization of residents of residential aged care (RAC) facilities. Cluster randomized controlled trial. RAC facilities with higher than expected hospitalizations in Auckland, New Zealand, were recruited and randomized to intervention or control. A total of 1998 residents of 18 intervention facilities and 18 control facilities. A facility-based complex intervention of 9 months' duration. The intervention comprised gerontology nurse specialist (GNS)-led staff education, facility bench-marking, GNS resident review, and multidisciplinary (geriatrician, primary-care physician, pharmacist, GNS, and facility nurse) discussion of residents selected using standard criteria. Primary end point was avoidable hospitalizations. Secondary end points were all acute admissions, mortality, and acute bed-days. Follow-up was for a total of 14 months. The intervention did not affect main study end points: number of acute avoidable hospital admissions (RR 1.07; 95% CI 0.85-1.36; P = .59) or mortality (RR 1.11; 95% CI 0.76-1.61; P = .62). This multidisciplinary intervention, packaging selected case review, and staff education had no overall impact on acute hospital admissions or mortality. This may have considerable implications for resourcing in the acute and RAC sectors in the face of population aging. Australian and New Zealand Clinical Trials Registry (ACTRN12611000187943). Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  19. Cerebral Laterality and Handedness in Aviation: Performance and Selection Implications

    DTIC Science & Technology

    1989-01-01

    population; orangutans, rhesus monkeys, and mice demonstrated this seemingly random pattern (253). Chimpanzees have recently been tested for... higher right Sylvian point in the brains of chimpanzees and orangutans (as in humans) (144), a larger right frontal lobe in the baboon (34), and the

  20. Random Time Identity Based Firewall In Mobile Ad hoc Networks

    NASA Astrophysics Data System (ADS)

    Suman; Patel, R. B.; Singh, Parvinder

    2010-11-01

    A mobile ad hoc network (MANET) is a self-organizing network of mobile routers and associated hosts connected by wireless links. MANETs are highly flexible and adaptable but at the same time are highly prone to security risks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized control. A firewall is an effective means of protecting a local network from network-based security threats and forms a key component in MANET security architecture. This paper presents a review of firewall implementation techniques in MANETs and their relative merits and demerits. A new approach is proposed to select MANET nodes at random for firewall implementation. This approach randomly selects a new node as the firewall after a fixed time, based on critical values of certain parameters such as power backup. It effectively balances power and resource utilization across the entire MANET because the responsibility of implementing the firewall is shared equally among all the nodes. At the same time it ensures improved security against outside attacks, as an intruder will not be able to find the entry point into the MANET due to the random selection of nodes for firewall implementation.
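
    A sketch of the proposed rotation rule; the node representation and the power-backup threshold are illustrative assumptions.

    ```python
    import random

    def next_firewall(nodes, min_power):
        """After a fixed interval, pick the next firewall node at random from
        the nodes whose power backup exceeds a critical threshold."""
        eligible = [n for n in nodes if n["power"] >= min_power]
        return random.choice(eligible) if eligible else None

    nodes = [{"id": i, "power": random.uniform(0.0, 1.0)} for i in range(10)]
    print(next_firewall(nodes, min_power=0.5))
    ```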

  1. Random Walk Quantum Clustering Algorithm Based on Space

    NASA Astrophysics Data System (ADS)

    Xiao, Shufen; Dong, Yumin; Ma, Hongyang

    2018-01-01

    In the random quantum walk, a quantum simulation of the classical walk, data points interact when selecting the appropriate walk strategy by taking advantage of quantum-entanglement features; thus, the results obtained with the quantum walk differ from those obtained with the classical walk. A new quantum walk clustering algorithm based on space is proposed by applying the quantum walk to clustering analysis. In this algorithm, data points are viewed as walking participants, and similar data points are clustered using the walk function in the pay-off matrix according to a certain rule. The walk process is simplified by implementing a space-combining rule. The proposed algorithm is validated by a simulation test and shown to be superior to existing clustering algorithms, namely Kmeans, PCA + Kmeans, and LDA-Km. The effects of some of the parameters in the proposed algorithm on its performance are also analyzed and discussed. Specific suggestions are provided.

  2. Minimizing Statistical Bias with Queries.

    DTIC Science & Technology

    1995-09-14

    method for optimally selecting these points would offer enormous savings in time and money. An active learning system will typically attempt to select data... research in active learning assumes that the second term of Equation 2 is approximately zero, that is, that the learner is unbiased. If this is the case... outperforms the variance-minimizing algorithm and random exploration. and effective strategy for active learning. I have given empirical evidence that, with

  3. Selective Cannabinoids for Chronic Neuropathic Pain: A Systematic Review and Meta-analysis.

    PubMed

    Meng, Howard; Johnston, Bradley; Englesakis, Marina; Moulin, Dwight E; Bhatia, Anuj

    2017-11-01

    There is a lack of consensus on the role of selective cannabinoids for the treatment of neuropathic pain (NP). Guidelines from national and international pain societies have provided contradictory recommendations. The primary objective of this systematic review and meta-analysis (SR-MA) was to determine the analgesic efficacy and safety of selective cannabinoids compared to conventional management or placebo for chronic NP. We reviewed randomized controlled trials that compared selective cannabinoids (dronabinol, nabilone, nabiximols) with conventional treatments (eg, pharmacotherapy, physical therapy, or a combination of these) or placebo in patients with chronic NP because patients with NP may be on any of these therapies or none if all standard treatments have failed to provide analgesia and or if these treatments have been associated with adverse effects. MEDLINE, EMBASE, and other major databases up to March 11, 2016, were searched. Data on scores of numerical rating scale for NP and its subtypes, central and peripheral, were meta-analyzed. The certainty of evidence was classified using the Grade of Recommendations Assessment, Development, and Evaluation approach. Eleven randomized controlled trials including 1219 patients (614 in selective cannabinoid and 605 in comparator groups) were included in this SR-MA. There was variability in the studies in quality of reporting, etiology of NP, type and dose of selective cannabinoids. Patients who received selective cannabinoids reported a significant, but clinically small, reduction in mean numerical rating scale pain scores (0-10 scale) compared with comparator groups (-0.65 points; 95% confidence interval, -1.06 to -0.23 points; P = .002, I² = 60%; Grade of Recommendations Assessment, Development, and Evaluation: weak recommendation and moderate-quality evidence). Use of selective cannabinoids was also associated with improvements in quality of life and sleep with no major adverse effects. Selective cannabinoids provide a small analgesic benefit in patients with chronic NP. There was a high degree of heterogeneity among publications included in this SR-MA. Well-designed, large, randomized studies are required to better evaluate specific dosage, duration of intervention, and the effect of this intervention on physical and psychologic function.

  4. 40 CFR 795.250 - Developmental neurotoxicity screen.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the tests when conducted at about the same age. (C) One male and one female shall be randomly selected... searching, compulsive biting or licking, self-mutilation, circling, and walking backwards. (C) The presence... reliability is required. At a minimum, the end points outlined in paragraph (c)(6)(ii) of this section shall...

  5. 40 CFR 795.250 - Developmental neurotoxicity screen.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the tests when conducted at about the same age. (C) One male and one female shall be randomly selected... searching, compulsive biting or licking, self-mutilation, circling, and walking backwards. (C) The presence... reliability is required. At a minimum, the end points outlined in paragraph (c)(6)(ii) of this section shall...

  6. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Steincamp, James; Taylor, Jaime

    2003-01-01

    A reduced-surrogate, one-point-crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.

  7. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points; hence, conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct this bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point of the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of which have more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data consisting of covariates with many split-points, based on the bootstrap cross-validated estimates of the integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for the analysis based on the nature of the covariates of the dataset in question.

  8. Optimal Strategy for Integrated Dynamic Inventory Control and Supplier Selection in Unknown Environment via Stochastic Dynamic Programming

    NASA Astrophysics Data System (ADS)

    Sutrisno; Widowati; Solikhin

    2016-06-01

    In this paper, we propose a mathematical model in stochastic dynamic optimization form to determine the optimal strategy for an integrated single-product inventory control and supplier selection problem where the demand and purchasing cost parameters are random. For each time period, using the proposed model, we determine the optimal supplier and the optimal product volume to purchase from that supplier so that the inventory level is located as close as possible to the reference point with minimal cost. We use stochastic dynamic programming to solve this problem and give several numerical experiments to evaluate the model. From the results, for each time period, the proposed model generated the optimal supplier, and the inventory level tracked the reference point well.
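
    The abstract does not give the model's equations, but the backward-recursion idea can be sketched. The toy code below, with entirely hypothetical suppliers, costs, and demand scenarios, computes a value function over discretized inventory levels, minimizing at each stage the expected purchase cost plus a penalty for deviating from the reference level.

    ```python
    import itertools

    # All numbers are hypothetical; the sketch only illustrates the recursion.
    T, REF = 4, 10                     # horizon and reference inventory level
    LEVELS = range(0, 21)              # discretized inventory states
    ORDERS = range(0, 16)              # admissible order quantities
    SUPPLIERS = {"A": [8.0, 10.0], "B": [7.0, 12.0]}  # equally likely unit costs
    DEMANDS = [3, 5, 7]                # equally likely demand scenarios
    DEV = 0.5                          # per-unit penalty for missing REF

    V = {x: 0.0 for x in LEVELS}       # terminal value function
    for t in reversed(range(T)):
        V_new = {}
        for x in LEVELS:
            best = float("inf")
            for s, u in itertools.product(SUPPLIERS, ORDERS):
                exp_cost = 0.0
                for c in SUPPLIERS[s]:         # random purchasing cost
                    for d in DEMANDS:          # random demand
                        x_next = min(max(x + u - d, 0), max(LEVELS))
                        stage = c * u + DEV * abs(x_next - REF)
                        exp_cost += (stage + V[x_next]) / (len(SUPPLIERS[s]) * len(DEMANDS))
                best = min(best, exp_cost)     # optimal supplier and volume
            V_new[x] = best
        V = V_new
    print("expected optimal cost from empty stock:", round(V[0], 2))
    ```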

  9. Vortex-Core Reversal Dynamics: Towards Vortex Random Access Memory

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Koog

    2011-03-01

    An energy-efficient, ultrahigh-density, ultrafast, and nonvolatile solid-state universal memory is a long-held dream in the field of information-storage technology. The magnetic random access memory (MRAM), together with a spin-transfer-torque switching mechanism, is a strong candidate means of realizing that dream, given its nonvolatility, infinite endurance, and fast random access. Magnetic vortices in patterned soft magnetic dots promise ground-breaking applications in information-storage devices, owing to the very stable twofold ground states of either their upward or downward core magnetization orientation and plausible core switching by in-plane alternating magnetic fields or spin-polarized currents. However, two of the most technologically important but very challenging issues --- low-power recording and reliable selection of each memory cell within already existing cross-point architectures --- have not yet been resolved for the basic operations in information storage, that is, writing (recording) and readout. Here, we experimentally demonstrate a magnetic vortex random access memory (VRAM) in the basic cross-point architecture. This unique VRAM offers reliable cell selection and low-power-consumption control of the switching of out-of-plane core magnetizations using specially designed rotating magnetic fields generated by two orthogonal and unipolar Gaussian-pulse currents with optimized pulse width and time delay. Our achievement of a new device based on a new material, that is, a medium composed of patterned vortex-state disks, together with the new physics of ultrafast vortex-core switching dynamics, can stimulate further fruitful research on MRAMs based on vortex-state dot arrays.

  10. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory is built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance in fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 allele ratio at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis for game theory that incorporates non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also shows that the number of polymorphic equilibria depends on the algebraic expression of the population structure. Copyright © 2015 Elsevier Inc. All rights reserved.
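
    For readers unfamiliar with replicator dynamics, the sketch below integrates the standard replicator equation for a two-strategy game with a hypothetical pay-off matrix; it is a generic illustration, not the sex-specific meiotic drive model of the paper.

    ```python
    import numpy as np

    def replicator(x, A, steps=2000, dt=0.01):
        """Integrate replicator dynamics x_i' = x_i((Ax)_i - x.Ax)
        by forward Euler; A is a (hypothetical) pay-off matrix."""
        x = np.array(x, dtype=float)
        for _ in range(steps):
            f = A @ x                  # fitness of each strategy
            x += dt * x * (f - x @ f)  # replicator equation
            x = np.clip(x, 0, None)
            x /= x.sum()               # stay on the simplex
        return x

    A = np.array([[1.0, 2.0], [3.0, 0.5]])   # hypothetical pay-offs
    print(replicator([0.9, 0.1], A))          # converges to the interior equilibrium
    ```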

  11. Checked Out: Ohioans' Views on Education 2009

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    In collaboration with the Thomas B. Fordham Institute and Catalyst Ohio, the FDR Group conducted a telephone survey of 1,002 randomly selected Ohio residents between April 1 and April 9, 2009 (margin of error +/- 3 percentage points). The survey--the third in a series--reports Ohioans' views on critical education issues, including school funding,…

  12. Characteristics and Professional Concerns of Organization Development Practitioners.

    ERIC Educational Resources Information Center

    Patten, Thomas H., Jr.; And Others

    A study was undertaken of organization development (OD) programs, from the point of view of the organization members who plan and conduct them, to gain information for meaningful planning by the American Society for Training and Development (ASTD). A questionnaire was returned by 103 of 450 randomly selected OD practitioners. Most respondents had…

  13. Prone position as prevention of lung injury in comatose patients: a prospective, randomized, controlled study.

    PubMed

    Beuret, Pascal; Carton, Marie-Jose; Nourdine, Karim; Kaaki, Mahmoud; Tramoni, Gerard; Ducreux, Jean-Claude

    2002-05-01

    Comatose patients frequently exhibit pulmonary function worsening, especially in cases of pulmonary infection. It appears to have a deleterious effect on neurologic outcome. We therefore conducted a randomized trial to determine whether daily prone positioning would prevent lung worsening in these patients. Prospective, randomized, controlled study. Sixteen-bed intensive care unit. Fifty-one patients who required invasive mechanical ventilation because of coma with Glasgow coma scores of 9 or less. In the prone position (PP) group: prone positioning for 4 h once daily until the patients could get up to sit in an armchair; in the supine position (SP) group: supine positioning. The primary end point was the incidence of lung worsening, defined by an increase in the Lung Injury Score of at least 1 point since the time of randomization. The secondary end point was the incidence of ventilator-associated pneumonia (VAP). A total of 25 patients were randomly assigned to the PP group and 26 patients to the SP group. The characteristics of the patients from the two groups were similar at randomization. The incidence of lung worsening was lower in the PP group (12%) than in the SP group (50%) (p=0.003). The incidence of VAP was 20% in the PP group and 38.4% in the SP group (p=0.14). There was no serious complication attributable to prone positioning; however, there was a significant increase in intracranial pressure in the PP group. In a selected population of comatose ventilated patients, daily prone positioning reduced the incidence of lung worsening.

  14. Point process statistics in atom probe tomography.

    PubMed

    Philippe, T; Duguay, S; Grancher, G; Blavette, D

    2013-09-01

    We present a review of spatial point processes as statistical models that we have designed for the analysis and treatment of atom probe tomography (APT) data. As a major advantage, these methods do not require sampling. The mean distance to nearest neighbour is an attractive approach to exhibit a non-random atomic distribution. A χ² test based on distance distributions to nearest neighbour has been developed to detect deviation from randomness. Best-fit methods based on first nearest neighbour distance (1NN method) and pair correlation function are presented and compared to assess the chemical composition of tiny clusters. Delaunay tessellation for cluster selection has been also illustrated. These statistical tools have been applied to APT experiments on microelectronics materials. Copyright © 2012 Elsevier B.V. All rights reserved.
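
    As an illustration of the nearest-neighbour approach described above, the sketch below compares the observed mean first-nearest-neighbour (1NN) distance of a 3-D point set against a Monte Carlo reference distribution generated under complete spatial randomness. The point set here is synthetic; real APT reconstructions would take its place.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def mean_nn_distance(points):
        """Mean distance to the first nearest neighbour (1NN)."""
        tree = cKDTree(points)
        d, _ = tree.query(points, k=2)   # k=2: the first hit is the point itself
        return d[:, 1].mean()

    rng = np.random.default_rng(0)
    box, n = 10.0, 2000
    observed = rng.uniform(0, box, (n, 3))       # stand-in for APT positions

    # Monte Carlo reference distribution under complete spatial randomness
    ref = [mean_nn_distance(rng.uniform(0, box, (n, 3))) for _ in range(200)]
    z = (mean_nn_distance(observed) - np.mean(ref)) / np.std(ref)
    print("z-score vs randomness:", round(z, 2))  # |z| >> 2 flags clustering
    ```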

  15. Vector control of wind turbine on the basis of the fuzzy selective neural net*

    NASA Astrophysics Data System (ADS)

    Engel, E. A.; Kovalev, I. V.; Engel, N. E.

    2016-04-01

    This article describes vector control of a wind turbine based on a fuzzy selective neural net. Based on the wind turbine system's state, the fuzzy selective neural net tracks a maximum power point under random perturbations. Numerical simulations are performed to clarify the applicability and advantages of the proposed vector control of the wind turbine on the basis of the fuzzy selective neural net. The simulation results show that the proposed intelligent control of the wind turbine achieves real-time control speed and competitive performance, as compared to a classical control model with PID controllers based on a traditional maximum torque control strategy.

  16. Distribution majorization of corner points by reinforcement learning for moving object detection

    NASA Astrophysics Data System (ADS)

    Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang

    2018-04-01

    Corner points play an important role in moving object detection, especially in the case of a free-moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, the information provided by preceding frames can also be used. We utilize this information to focus on more valuable areas and ignore less valuable ones. The proposed algorithm is based on reinforcement learning, which regards the detection of corner points as a Markov process. In the Markov model, the video to be detected is regarded as the environment, the selections of blocks for one corner point are regarded as actions, and the performance of detection is regarded as the state. Corner points are assigned to blocks separated from the original whole image. Experimentally, we select a conventional method, which uses matching and the Random Sample Consensus (RANSAC) algorithm to obtain objects, as the main framework and apply our algorithm to improve the result. The comparison between the conventional method and the same method with our algorithm shows that our algorithm reduces false detections by 70%.

  17. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    PubMed

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.

  18. Shaping Attention with Reward: Effects of Reward on Space- and Object-Based Selection

    PubMed Central

    Shomstein, Sarah; Johnson, Jacoba

    2014-01-01

    The contribution of rewarded actions to automatic attentional selection remains obscure. We hypothesized that some forms of automatic orienting, such as object-based selection, can be completely abandoned in lieu of reward maximizing strategy. While presenting identical visual stimuli to the observer, in a set of two experiments, we manipulate what is being rewarded (different object targets or random object locations) and the type of reward received (money or points). It was observed that reward alone guides attentional selection, entirely predicting behavior. These results suggest that guidance of selective attention, while automatic, is flexible and can be adjusted in accordance with external non-sensory reward-based factors. PMID:24121412

  19. Experimental Design in Clinical 'Omics Biomarker Discovery.

    PubMed

    Forshed, Jenny

    2017-11-03

    This tutorial highlights some issues in the experimental design of clinical 'omics biomarker discovery: how to avoid bias and obtain quantities from biochemical analyses that are as close to the truth as possible, and how to select samples to improve the chance of answering the clinical question at issue. This includes the importance of defining the clinical aim and end point, knowing the variability in the results, randomization of samples, sample size, statistical power, and how to avoid confounding factors by including clinical data in the sample selection, that is, how to avoid unpleasant surprises at the point of statistical analysis. The aim of this tutorial is to help translational clinical and preclinical biomarker candidate research and to improve the validity and potential of future biomarker candidate findings.
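
    As a concrete example of the sample-size step, the sketch below uses statsmodels to solve for the number of samples per group needed to detect a standardized effect at a given power; the effect size, alpha, and power are illustrative inputs, not values from the tutorial.

    ```python
    from statsmodels.stats.power import TTestIndPower

    # Sample size for a two-group biomarker comparison (hypothetical inputs):
    # effect_size is the expected mean difference divided by the pooled SD.
    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.6, alpha=0.05, power=0.8,
                                       alternative='two-sided')
    print(f"required samples per group: {n_per_group:.1f}")   # ~ 45
    ```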

  20. Influence of olfactory and visual cover on nest site selection and nest success for grassland-nesting birds.

    PubMed

    Fogarty, Dillon T; Elmore, R Dwayne; Fuhlendorf, Samuel D; Loss, Scott R

    2017-08-01

    Habitat selection by animals is influenced by and mitigates the effects of predation and environmental extremes. For birds, nest site selection is crucial to offspring production because nests are exposed to extreme weather and predation pressure. Predators that forage using olfaction often dominate nest predator communities; therefore, factors that influence olfactory detection (e.g., airflow and weather variables, including turbulence and moisture) should influence nest site selection and survival. However, few studies have assessed the importance of olfactory cover for habitat selection and survival. We assessed whether ground-nesting birds select nest sites based on visual and/or olfactory cover. Additionally, we assessed the importance of visual cover and airflow and weather variables associated with olfactory cover in influencing nest survival. In managed grasslands in Oklahoma, USA, we monitored nests of Northern Bobwhite ( Colinus virginianus ), Eastern Meadowlark ( Sturnella magna ), and Grasshopper Sparrow ( Ammodramus savannarum ) during 2015 and 2016. To assess nest site selection, we compared cover variables between nests and random points. To assess factors influencing nest survival, we used visual cover and olfactory-related measurements (i.e., airflow and weather variables) to model daily nest survival. For nest site selection, nest sites had greater overhead visual cover than random points, but no other significant differences were found. Weather variables hypothesized to influence olfactory detection, specifically precipitation and relative humidity, were the best predictors of and were positively related to daily nest survival. Selection for overhead cover likely contributed to mitigation of thermal extremes and possibly reduced detectability of nests. For daily nest survival, we hypothesize that major nest predators focused on prey other than the monitored species' nests during high moisture conditions, thus increasing nest survival on these days. Our study highlights how mechanistic approaches to studying cover informs which dimensions are perceived and selected by animals and which dimensions confer fitness-related benefits.

  1. Recursive Branching Simulated Annealing Algorithm

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew; Smith, J. Scott; Aronstein, David

    2012-01-01

    This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration. The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal solution, and the region from which new configurations can be selected shrinks as the search continues. The key difference between these algorithms is that in the SA algorithm, a single path, or trajectory, is taken in parameter space, from the starting point to the globally optimal solution, while in the RBSA algorithm, many trajectories are taken; by exploring multiple regions of the parameter space simultaneously, the algorithm has been shown to converge on the globally optimal solution about an order of magnitude faster than when using conventional algorithms. Novel features of the RBSA algorithm include: 1. More efficient searching of the parameter space due to the branching structure, in which multiple random configurations are generated and multiple promising regions of the parameter space are explored; 2. The implementation of a trust region for each parameter in the parameter space, which provides a natural way of enforcing upper- and lower-bound constraints on the parameters; and 3. The optional use of a constrained gradient-search optimization, performed on the continuous variables around each branch's configuration in parameter space to improve search efficiency by allowing for fast fine-tuning of the continuous variables within the trust region at that configuration point.
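
    A compact sketch of the conventional SA loop described above (not the RBSA variant itself) may make the acceptance rule concrete; the objective function and all parameters are hypothetical.

    ```python
    import math
    import random

    def simulated_annealing(objective, start, step, t0=1.0, cooling=0.995, iters=5000):
        """Conventional SA: always accept improvements; accept worse moves
        with probability exp(-delta / T), lowering T each iteration."""
        x, fx, t = start, objective(start), t0
        for _ in range(iters):
            cand = [xi + random.uniform(-step, step) for xi in x]
            fc = objective(cand)
            if fc < fx or random.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
            t *= cooling         # annealing schedule
            step *= cooling      # shrink the region from which moves are drawn
        return x, fx

    # Multi-modal test objective (a stand-in for a real design space)
    f = lambda p: (p[0] ** 2 - 10 * math.cos(2 * math.pi * p[0])
                   + p[1] ** 2 - 10 * math.cos(2 * math.pi * p[1]) + 20)
    print(simulated_annealing(f, [4.0, -3.0], step=0.5))
    ```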

  2. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction.

    PubMed

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John B. O.

    2008-10-29

    We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6 °C and R² of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 °C and R² of 0.54. This model outperforms a kNN model (RMSE of 48.3 °C, R² of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 °C, R² of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.

  3. Strategies for efficient resolution analysis in full-waveform inversion

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Leeuwen, T.; Trampert, J.

    2016-12-01

    Full-waveform inversion is developing into a standard method in the seismological toolbox. It combines numerical wave propagation for heterogeneous media with adjoint techniques in order to improve tomographic resolution. However, resolution becomes increasingly difficult to quantify because of the enormous computational requirements. Here we present two families of methods that can be used for efficient resolution analysis in full-waveform inversion. They are based on the targeted extraction of resolution proxies from the Hessian matrix, which is too large to store and to compute explicitly. Fourier methods rest on the application of the Hessian to Earth models with harmonic oscillations. This yields the Fourier spectrum of the Hessian for few selected wave numbers, from which we can extract properties of the tomographic point-spread function for any point in space. Random probing methods use uncorrelated, random test models instead of harmonic oscillations. Auto-correlating the Hessian-model applications for sufficiently many test models also characterises the point-spread function. Both Fourier and random probing methods provide a rich collection of resolution proxies. These include position- and direction-dependent resolution lengths, and the volume of point-spread functions as indicator of amplitude recovery and inter-parameter trade-offs. The computational requirements of these methods are equivalent to approximately 7 conjugate-gradient iterations in full-waveform inversion. This is significantly less than the optimisation itself, which may require tens to hundreds of iterations to reach convergence. In addition to the theoretical foundations of the Fourier and random probing methods, we show various illustrative examples from real-data full-waveform inversion for crustal and mantle structure.
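
    The random probing idea can be illustrated with a toy Hessian: applying the (here explicit) matrix to a few random test models and correlating inputs with outputs recovers rows of the Hessian, i.e. point-spread functions. In real full-waveform inversion the Hessian is never formed; each application costs adjoint simulations, and the matrix below is a synthetic stand-in.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    idx = np.arange(n)
    # Toy stand-in Hessian with a short correlation length; in practice far
    # fewer probes would be affordable than the 100 used here for clarity.
    H = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)

    probes = rng.choice([-1.0, 1.0], size=(100, n))   # random test models
    Hv = probes @ H.T                                  # Hessian-model applications

    # E[v_k * (H v)_j] = H_jk, so correlating probe entries with outputs
    # estimates row k of the Hessian, i.e. the point-spread function at k.
    k = 100
    psf_row = (probes[:, k, None] * Hv).mean(axis=0)
    print("PSF peak near model point", int(np.abs(psf_row).argmax()), "(true:", k, ")")
    ```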

  4. A Comparative Study on Power Point Presentation and Traditional Lecture Method in Material Understandability, Effectiveness and Attitude

    ERIC Educational Resources Information Center

    Sewasew, Daniel; Mengestle, Missaye; Abate, Gebeyehu

    2015-01-01

    The aim of this study was to compare PPT and traditional lecture method in material understandability, effectiveness and attitude among university students. Comparative descriptive survey research design was employed to answer the research questions raised. Four hundred and twenty nine participants were selected randomly using stratified sampling…

  5. A Follow-Up of Subjects Scoring above 180 IQ in Terman's "Genetic Studies of Genius."

    ERIC Educational Resources Information Center

    Feldman, David Henry

    1984-01-01

    Using the Terman files, 26 subjects with scores above 180 IQ were compared with 26 randomly selected subjects from Terman's sample. Findings were generally that the extra IQ points made little difference and that extremely high IQ does not seem to indicate "genius" in the commonly understood sense. (Author/CL)

  6. Prevalence of Attention Deficit Hyperactivity Disorder and Associated Features among Children in France

    ERIC Educational Resources Information Center

    Lecendreux, Michel; Konofal, Eric; Faraone, Stephen V.

    2011-01-01

    Background: Earlier studies point to the prevalence of attention deficit hyperactivity disorder (ADHD) being similar around the world. There is, however, a wide variety in estimates. The prevalence of ADHD in youth has never been examined in France. Method: Starting with 18 million telephone numbers, 7,912 numbers were randomly selected. Among the…

  7. Acceptability of Adaptations for Struggling Writers: A National Survey with Primary-Grade Teachers

    ERIC Educational Resources Information Center

    Graham, Steve; Harris, Karen R.; Bartlett, Brendan J.; Popadopoulou, Eleni; Santoro, Julia

    2016-01-01

    One hundred twenty-five primary-grade teachers randomly selected from across the United States indicated how frequently they made 20 instructional adaptations for the struggling writers in their classroom. The frequency options were never, several times a year, monthly, weekly, several times a week, and daily. Using a 6-point Likert-type…

  8. The African American Student Network: An Intervention for Retention

    ERIC Educational Resources Information Center

    Grier-Reed, Tabitha; Arcinue, Ferdinand; Inman, Evetta

    2016-01-01

    Comparing retention rates for 91 Black women and 56 Black men who participated in the African American Student Network with 68 women and 36 men who were randomly selected from the population of Black undergraduates at a Midwestern university, we included an analysis of covariance to control for ACT score and first-term grade point average. Results…

  9. Educational Service Quality in Zanjan University of Medical Sciences from Students' Point of View

    ERIC Educational Resources Information Center

    Mohammadi, Ali; Mohammadi, Jamshid

    2014-01-01

    This study aims at evaluating perceived service quality in Zanjan University of Medical Sciences (ZUMS). This study was cross-sectional and authors surveyed educational services at ZUMS. Through stratified random sampling, 384 students were selected and an adapted SERVQUAL instrument was used for data collection. Data analysis was performed by…

  10. Evolution of Randomized Trials in Advanced/Metastatic Soft Tissue Sarcoma: End Point Selection, Surrogacy, and Quality of Reporting.

    PubMed

    Zer, Alona; Prince, Rebecca M; Amir, Eitan; Abdul Razak, Albiruni

    2016-05-01

    Randomized controlled trials (RCTs) in soft tissue sarcoma (STS) have used varying end points. The surrogacy of intermediate end points, such as progression-free survival (PFS), response rate (RR), and 3-month and 6-month PFS (3moPFS and 6moPFS) with overall survival (OS), remains unknown. The quality of efficacy and toxicity reporting in these studies is also uncertain. A systematic review of systemic therapy RCTs in STS was performed. Surrogacy between intermediate end points and OS was explored using weighted linear regression for the hazard ratio for OS with the hazard ratio for PFS or the odds ratio for RR, 3moPFS, and 6moPFS. The quality of reporting for efficacy and toxicity was also evaluated. Fifty-two RCTs published between 1974 and 2014, comprising 9,762 patients, met the inclusion criteria. There were significant correlations between PFS and OS (R = 0.61) and between RR and OS (R = 0.51). Conversely, there were nonsignificant correlations between 3moPFS and 6moPFS with OS. A reduction in the use of RR as the primary end point was observed over time, favoring time-based events (P for trend = .02). In 14% of RCTs, the primary end point was not met, but the study was reported as being positive. Toxicity was comprehensively reported in 47% of RCTs, whereas 14% inadequately reported toxicity. In advanced STS, PFS and RR seem to be appropriate surrogates for OS. There is poor correlation between OS and both 3moPFS and 6moPFS. As such, caution is urged with the use of these as primary end points in randomized STS trials. The quality of toxicity reporting and interpretation of results is suboptimal. © 2016 by American Society of Clinical Oncology.

  11. SOP: parallel surrogate global optimization with Pareto center selection for computationally expensive single objective problems

    DOE PAGES

    Krityakierne, Tipaluck; Akhtar, Taimoor; Shoemaker, Christine A.

    2016-02-02

    This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are tabu with a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
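
    The non-dominated sorting at the heart of SOP can be sketched briefly. The code below flags the first Pareto front in the two objectives named above, using a toy expensive-function value and the negated minimum distance to previously evaluated points; front members would then serve as centers for candidate generation.

    ```python
    import numpy as np

    def nondominated(points):
        """Mask of points on the first Pareto front when minimizing both
        columns: a point is dominated if another is no worse in every
        objective and strictly better in at least one."""
        p = np.asarray(points)
        mask = np.ones(len(p), dtype=bool)
        for i in range(len(p)):
            others = np.delete(p, i, axis=0)
            dominated = np.any(np.all(others <= p[i], axis=1) &
                               np.any(others < p[i], axis=1))
            mask[i] = not dominated
        return mask

    rng = np.random.default_rng(0)
    f_vals = rng.uniform(size=20)             # toy expensive objective values
    dists = rng.uniform(size=20)              # min distance to evaluated points
    objs = np.column_stack([f_vals, -dists])  # minimize value, maximize distance
    print("candidate centers on the first front:", np.flatnonzero(nondominated(objs)))
    ```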

  12. Exploring a potential energy surface by machine learning for characterizing atomic transport

    NASA Astrophysics Data System (ADS)

    Kanamori, Kenta; Toyoura, Kazuaki; Honda, Junya; Hattori, Kazuki; Seko, Atsuto; Karasuyama, Masayuki; Shitara, Kazuki; Shiga, Motoki; Kuwabara, Akihide; Takeuchi, Ichiro

    2018-03-01

    We propose a machine-learning method for evaluating the potential barrier governing atomic transport based on the preferential selection of dominant points for atomic transport. The proposed method generates numerous random samples of the entire potential energy surface (PES) from a probabilistic Gaussian process model of the PES, which enables defining the likelihood of the dominant points. The robustness and efficiency of the method are demonstrated on a dozen model cases for proton diffusion in oxides, in comparison with a conventional nudged elastic band method.

  13. Random regression models for the prediction of days to weight, ultrasound rib eye area, and ultrasound back fat depth in beef cattle.

    PubMed

    Speidel, S E; Peel, R K; Crews, D H; Enns, R M

    2016-02-01

    Genetic evaluation research designed to reduce the required days to a specified end point has received very little attention in the pertinent scientific literature, even though its economic importance was first discussed in 1957. There are many production scenarios in today's beef industry, making a prediction of the required number of days to a single end point a suboptimal option. Random regression is an attractive alternative for calculating days to weight (DTW), days to ultrasound back fat (DTUBF), and days to ultrasound rib eye area (DTUREA) genetic predictions that could overcome the weaknesses of a single end point prediction. The objective of this study was to develop random regression approaches for the prediction of DTW, DTUREA, and DTUBF. Data were obtained from the Agriculture and Agri-Food Canada Research Centre, Lethbridge, AB, Canada. Data consisted of records on 1,324 feedlot cattle spanning 1999 to 2007. Individual animals averaged 5.77 observations, with weights, ultrasound rib eye area (UREA), ultrasound back fat depth (UBF), and ages ranging from 293 to 863 kg, 73.39 to 129.54 cm², 1.53 to 30.47 mm, and 276 to 519 d, respectively. Random regression models using Legendre polynomials were used to regress age of the individual on weight, UREA, and UBF. Fixed effects in the model included an overall fixed regression of age on end point (weight, UREA, and UBF) nested within breed to account for the mean relationship between age and weight, as well as a contemporary group effect consisting of breed of the animal (Angus, Charolais, and Charolais sired), feedlot pen, and year of measure. Likelihood ratio tests were used to determine the appropriate random polynomial order. Use of the quadratic polynomial did not account for any additional genetic variation in days for DTW (P > 0.11), for DTUREA (P > 0.18), and for DTUBF (P > 0.20) when compared with the linear random polynomial. Heritability estimates from the linear random regression for DTW ranged from 0.54 to 0.74, corresponding to end points of 293 and 863 kg, respectively. Heritability for DTUREA ranged from 0.51 to 0.34 and for DTUBF ranged from 0.55 to 0.37. These estimates correspond to UREA end points of 35 and 125 cm² and UBF end points of 1.53 and 30 mm, respectively. This range of heritability shows DTW, DTUREA, and DTUBF to be highly heritable and indicates that selection pressure aimed at reducing the number of days to reach a finish weight end point can result in genetic change given sufficient data.
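
    The Legendre basis underlying such random regression models is easy to construct. The sketch below builds the design-matrix columns for a first-order (linear) polynomial on end points standardized to [-1, 1], using illustrative weight end points; the variable names are invented for the example.

    ```python
    import numpy as np
    from numpy.polynomial.legendre import legvander

    # Weight end points (kg), standardized to [-1, 1], the natural
    # domain of Legendre polynomials
    wts = np.linspace(293, 863, 50)
    x = 2 * (wts - wts.min()) / (wts.max() - wts.min()) - 1

    # Design matrix for a first-order random regression: columns are the
    # Legendre polynomials P0 and P1 evaluated at the standardized end point.
    Z = legvander(x, 1)
    print(Z.shape)   # (50, 2); a quadratic order would add a P2 column
    ```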

  14. Using Nonlinear Stochastic Evolutionary Game Strategy to Model an Evolutionary Biological Network of Organ Carcinogenesis Under a Natural Selection Scheme

    PubMed Central

    Chen, Bor-Sen; Tsai, Kun-Wei; Li, Cheng-Wei

    2015-01-01

    Molecular biologists have long recognized carcinogenesis as an evolutionary process that involves natural selection. Cancer is driven by the somatic evolution of cell lineages. In this study, the evolution of somatic cancer cell lineages during carcinogenesis was modeled as an equilibrium point (ie, attractor phenotype) shifting process of a nonlinear stochastic evolutionary biological network. This process is subject to intrinsic random fluctuations because of somatic genetic and epigenetic variations, as well as extrinsic disturbances because of carcinogens and stressors. In order to maintain the normal function (ie, phenotype) of an evolutionary biological network subjected to random intrinsic fluctuations and extrinsic disturbances, a network robustness scheme that incorporates natural selection needs to be developed. This can be accomplished by selecting certain genetic and epigenetic variations to modify the network structure to attenuate intrinsic fluctuations efficiently and to resist extrinsic disturbances in order to maintain the phenotype of the evolutionary biological network at an equilibrium point (attractor). However, during carcinogenesis, the remaining (or neutral) genetic and epigenetic variations accumulate, and the extrinsic disturbances become too large to maintain the normal phenotype at the desired equilibrium point for the nonlinear evolutionary biological network. Thus, the network is shifted to a cancer phenotype at a new equilibrium point that begins a new evolutionary process. In this study, the natural selection scheme of an evolutionary biological network of carcinogenesis was derived from a robust negative feedback scheme based on the nonlinear stochastic Nash game strategy. The evolvability and phenotypic robustness criteria of the evolutionary cancer network were also estimated by solving a Hamilton–Jacobi-inequality-constrained optimization problem. The simulation revealed that the phenotypic shift of the lung cancer-associated cell network takes 54.5 years from a normal state to stage I cancer, 1.5 years from stage I to stage II cancer, and 2.5 years from stage II to stage III cancer, in reasonable agreement with statistics on the average age of lung cancer. These results suggest that a robust negative feedback scheme, based on a stochastic evolutionary game strategy, plays a critical role in an evolutionary biological network of carcinogenesis under a natural selection scheme. PMID:26244004

  15. Improvement of Automated POST Case Success Rate Using Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Zwack, Matthew R.; Dees, Patrick D.

    2017-01-01

    During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal [1]. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near-instantaneous throughput of vehicle cases [2]. Additional work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points [3]. The conclusion of the previous work illustrated the utility of the graph theory approach for completing a DOE through POST. However, this approach was still dependent upon the use of random repetitions to generate seed points for the graph. As noted in [3], only 8% of these random repetitions resulted in converged trajectories. This ultimately affects the ability of the random reps method to confidently approach the global optimum for a given vehicle case in a reasonable amount of time. With only an 8% pass rate, tens or hundreds of thousands of reps may be needed to be confident that the best repetition is at least close to the global optimum. However, typical design study time constraints require that fewer repetitions be attempted, sometimes resulting in seed points that have only a handful of successful completions. If a small number of successful repetitions are used to generate a seed point, the graph method may inherit some inaccuracies as it chains DOE cases from the non-global-optimal seed points. This creates inherent noise in the graph data, which can limit the accuracy of the resulting surrogate models. For this reason, the goal of this work is to improve the seed point generation method and ultimately the accuracy of the resulting POST surrogate model. The work focuses on increasing the case pass rate for seed point generation.
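
    A hedged sketch of the screening idea: train a support vector machine on past repetitions, labelled by whether they converged, and submit only repetitions predicted to converge. The features and labels below are synthetic stand-ins, not POST inputs or outputs.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Synthetic stand-in: each row is the input vector of one random
    # repetition; the label marks whether that rep converged (1) or not (0).
    X = rng.normal(size=(500, 6))
    y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=500) > 1).astype(int)

    # An RBF-kernel SVM screens candidate reps so that only those predicted
    # to converge are run, raising the effective pass rate.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
    ```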

  16. Dynamic laser speckle analyzed considering inhomogeneities in the biological sample

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Viana, Dimitri Campos; Rivera, Fernando Pujaico

    2017-04-01

    The dynamic laser speckle phenomenon allows a contactless and nondestructive way to monitor biological changes, quantified by second-order statistics applied to the images in time using a secondary matrix known as the time history of the speckle pattern (THSP). To save time, the traditional way to build the THSP restricts the data to a line or column. Our hypothesis is that this spatial restriction of the information could compromise the results, particularly when undesirable and unexpected optical inhomogeneities occur, such as in cell culture media. We tested a spatially random approach to collecting the points that form a THSP. Cells in a culture medium and drying paint, representing homogeneous samples at different levels, were tested, and a comparison with the traditional method was carried out. An alternative random selection based on a Gaussian distribution around a desired position is also presented. The results showed that the traditional protocol presented higher variation than the random method. The higher the inhomogeneity of the activity map, the higher the efficiency of the proposed method using random points. The Gaussian distribution proved useful when there was a well-defined area to monitor.
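
    The difference between the traditional and randomized THSP constructions can be sketched in a few lines of array indexing; the speckle image stack below is random noise standing in for real acquisitions, and the function names are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    frames = rng.integers(0, 256, size=(128, 64, 64))   # stack of speckle images

    def thsp_random(frames, n_points, rng):
        """Build the time history of the speckle pattern (THSP) from
        randomly scattered pixels instead of a single line or column."""
        t, h, w = frames.shape
        rows = rng.integers(0, h, n_points)
        cols = rng.integers(0, w, n_points)
        return frames[:, rows, cols].T    # shape (n_points, t)

    def thsp_line(frames, row):
        """Traditional THSP: intensities of one image row followed in time."""
        return frames[:, row, :].T        # shape (width, t)

    print(thsp_random(frames, 64, rng).shape, thsp_line(frames, 32).shape)
    ```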

  17. Reducing seed dependent variability of non-uniformly sampled multidimensional NMR data

    NASA Astrophysics Data System (ADS)

    Mobli, Mehdi

    2015-07-01

    The application of NMR spectroscopy to study the structure, dynamics and function of macromolecules requires the acquisition of several multidimensional spectra. The one-dimensional NMR time-response from the spectrometer is extended to additional dimensions by introducing incremented delays in the experiment that cause oscillation of the signal along "indirect" dimensions. For a given dimension the delay is incremented at twice the rate of the maximum frequency (Nyquist rate). To achieve high-resolution requires acquisition of long data records sampled at the Nyquist rate. This is typically a prohibitive step due to time constraints, resulting in sub-optimal data records to the detriment of subsequent analyses. The multidimensional NMR spectrum itself is typically sparse, and it has been shown that in such cases it is possible to use non-Fourier methods to reconstruct a high-resolution multidimensional spectrum from a random subset of non-uniformly sampled (NUS) data. For a given acquisition time, NUS has the potential to improve the sensitivity and resolution of a multidimensional spectrum, compared to traditional uniform sampling. The improvements in sensitivity and/or resolution achieved by NUS are heavily dependent on the distribution of points in the random subset acquired. Typically, random points are selected from a probability density function (PDF) weighted according to the NMR signal envelope. In extreme cases as little as 1% of the data is subsampled. The heavy under-sampling can result in poor reproducibility, i.e. when two experiments are carried out where the same number of random samples is selected from the same PDF but using different random seeds. Here, a jittered sampling approach is introduced that is shown to improve random seed dependent reproducibility of multidimensional spectra generated from NUS data, compared to commonly applied NUS methods. It is shown that this is achieved due to the low variability of the inherent sensitivity of the random subset chosen from a given PDF. Finally, it is demonstrated that metrics used to find optimal NUS distributions are heavily dependent on the inherent sensitivity of the random subset, and such optimisation is therefore less critical when using the proposed sampling scheme.
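
    A minimal sketch of the sampling-schedule idea, under the assumption of an exponentially decaying signal envelope: plain weighted random sampling draws all points independently from the PDF, whereas the jittered variant draws one point per equal-probability stratum of the cumulative distribution, which keeps schedules from different seeds close in inherent sensitivity. The function name and parameters are invented for illustration.

    ```python
    import numpy as np

    def nus_schedule(n_total, n_keep, t2_points=40.0, jittered=True, seed=0):
        """Pick n_keep of n_total increments from an exponentially weighted
        PDF. Jittered mode draws one sample per equal-probability stratum,
        so different seeds give schedules of similar inherent sensitivity."""
        rng = np.random.default_rng(seed)
        pdf = np.exp(-np.arange(n_total) / t2_points)  # signal-envelope weights
        cdf = np.cumsum(pdf) / pdf.sum()
        if jittered:                                   # stratified (jittered) draw
            u = (np.arange(n_keep) + rng.uniform(0, 1, n_keep)) / n_keep
        else:                                          # plain weighted random draw
            u = rng.uniform(0, 1, n_keep)
        return np.unique(np.searchsorted(cdf, u))      # duplicates are merged

    print(nus_schedule(256, 32, seed=1))
    print(nus_schedule(256, 32, seed=2))   # jitter keeps the two schedules similar
    ```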

  18. Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)

    NASA Astrophysics Data System (ADS)

    Kędra, Mariola

    2014-02-01

    Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools for studying daily river flow data gives consistent, reliable and clear-cut answers to the question. The outcomes point out that the investigated discharge dynamics is not random but deterministic. Moreover, the results completely confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge data from two selected gauging stations of a mountain river in southern Poland, the Raba River.

  19. Designs for Testing Group-Based Interventions with Limited Numbers of Social Units: The Dynamic Wait-Listed and Regression Point Displacement Designs.

    PubMed

    Wyman, Peter A; Henry, David; Knoblauch, Shannon; Brown, C Hendricks

    2015-10-01

    The dynamic wait-listed design (DWLD) and regression point displacement design (RPDD) address several challenges in evaluating group-based interventions when there is a limited number of groups. Both the DWLD and RPDD utilize efficiencies that increase statistical power and can enhance the balance between community needs and research priorities. The DWLD blocks on more time units than traditional wait-listed designs, thereby increasing the proportion of a study period during which intervention and control conditions can be compared, and can also improve the logistics of implementing an intervention across multiple sites and strengthen fidelity. We discuss the DWLD in the larger context of roll-out randomized designs and compare it with its cousin, the stepped wedge design. The RPDD uses archival data on the population of settings from which intervention unit(s) are selected to create expected posttest scores for units receiving intervention, to which actual posttest scores are compared. High pretest-posttest correlations give the RPDD statistical power for assessing intervention impact even when one or a few settings receive intervention. The RPDD works best when archival data are available over a number of years prior to and following intervention. If intervention units were not randomly selected, propensity scores can be used to control for non-random selection factors. Examples are provided of the DWLD and RPDD used to evaluate, respectively, suicide prevention training (QPR) in 32 schools and a violence prevention program (CeaseFire) in two Chicago police districts over a 10-year period. How the DWLD and RPDD address common threats to internal and external validity, as well as their limitations, are discussed.

  20. Are Prenatal Ultrasound Scans Associated with the Autism Phenotype? Follow-Up of a Randomised Controlled Trial

    ERIC Educational Resources Information Center

    Stoch, Yonit K.; Williams, Cori J.; Granich, Joanna; Hunt, Anna M.; Landau, Lou I.; Newnham, John P.; Whitehouse, Andrew J. O.

    2012-01-01

    An existing randomised controlled trial was used to investigate whether multiple ultrasound scans may be associated with the autism phenotype. From 2,834 single pregnancies, 1,415 were selected at random to receive ultrasound imaging and continuous wave Doppler flow studies at five points throughout pregnancy (Intensive) and 1,419 to receive a…

  1. Evolutionary constraints and the neutral theory. [mutation-caused nucleotide substitutions in DNA]

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.; Kimura, M.

    1984-01-01

    The neutral theory of molecular evolution postulates that nucleotide substitutions inherently take place in DNA as a result of point mutations followed by random genetic drift. In the absence of selective constraints, the substitution rate reaches the maximum value set by the mutation rate. The rate in globin pseudogenes is about 5 × 10^-9 substitutions per site per year in mammals. Rates slower than this indicate the presence of constraints imposed by negative (natural) selection, which rejects and discards deleterious mutations.

  2. High capacity low delay packet broadcasting multiaccess schemes for satellite repeater systems

    NASA Astrophysics Data System (ADS)

    Bose, S. K.

    1980-12-01

    Demand-assigned packet radio schemes using satellite repeaters can achieve high capacities but often exhibit relatively large delays under low traffic conditions when compared to random access. Several schemes which improve delay performance at low traffic but which retain high capacity are presented and analyzed. These schemes allow random access attempts by users who are waiting for channel assignments. Their performance is considered in the context of a multiple-point communication system carrying fixed-length messages between geographically distributed (ground) user terminals which are linked via a satellite repeater. Channel assignments are made following a BCC queueing discipline by a (ground) central controller on the basis of requests correctly received over a collision-type access channel. In TBACR Scheme A, some of the forward message channels are set aside for random access transmissions; the rest are used in a demand-assigned mode. Schemes B and C operate all their forward message channels in a demand assignment mode but, by means of appropriate algorithms for trailer channel selection, allow random access attempts on unassigned channels. The latter scheme also introduces framing and slotting of the time axis to implement a more efficient algorithm for trailer channel selection than the former.

  3. Fixed-Rate Compressed Floating-Point Arrays.

    PubMed

    Lindstrom, Peter

    2014-12-01

    Current compression schemes for floating-point data commonly take fixed-precision values and compress them to a variable-length bit stream, complicating memory management and random access. We present a fixed-rate, near-lossless compression scheme that maps small blocks of 4^d values in d dimensions to a fixed, user-specified number of bits per block, thereby allowing read and write random access to compressed floating-point data at block granularity. Our approach is inspired by fixed-rate texture compression methods widely adopted in graphics hardware, but has been tailored to the high dynamic range and precision demands of scientific applications. Our compressor is based on a new, lifted, orthogonal block transform and embedded coding, allowing each per-block bit stream to be truncated at any point if desired, thus facilitating bit rate selection using a single compression scheme. To avoid compression or decompression upon every data access, we employ a software write-back cache of uncompressed blocks. Our compressor has been designed with computational simplicity and speed in mind to allow for the possibility of a hardware implementation, and uses only a small number of fixed-point arithmetic operations per compressed value. We demonstrate the viability and benefits of lossy compression in several applications, including visualization, quantitative data analysis, and numerical simulation.
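
    The random-access property follows directly from the fixed rate: since every block occupies the same number of bits, the offset of any block is pure index arithmetic, and a single value can be fetched by decoding only its block. The sketch below assumes a hypothetical linear block layout and rate; it illustrates the idea, not the scheme's actual stream format.

    ```python
    # Fixed-rate layout sketch: every 4x4x4 block of a d=3 array compresses
    # to the same number of bits (hypothetical rate: 16 bits/value * 4**3).
    BITS_PER_BLOCK = 1024

    def block_bit_offset(i, j, k, shape):
        """Bit offset of the compressed block containing element (i, j, k),
        assuming an illustrative x-fastest linear ordering of blocks."""
        nbx, nby = shape[0] // 4, shape[1] // 4   # blocks per axis (x, y)
        bi, bj, bk = i // 4, j // 4, k // 4       # block coordinates
        block_index = (bk * nby + bj) * nbx + bi  # linearized block id
        return block_index * BITS_PER_BLOCK

    print(block_bit_offset(17, 5, 42, shape=(64, 64, 64)))
    ```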

  4. The coalescent of a sample from a binary branching process.

    PubMed

    Lambert, Amaury

    2018-04-25

    At time 0, start a time-continuous binary branching process, where particles give birth to a single particle independently (at a possibly time-dependent rate) and die independently (at a possibly time-dependent and age-dependent rate). A particular case is the classical birth-death process. Stop this process at time T>0. It is known that the tree spanned by the N tips alive at time T (called a reduced tree or coalescent tree) is a coalescent point process (CPP), which basically means that the depths of interior nodes are independent and identically distributed (iid). Now select each of the N tips independently with probability y (Bernoulli sample). It is known that the tree generated by the selected tips, which we will call the Bernoulli sampled CPP, is again a CPP. Now instead, select exactly k tips uniformly at random among the N tips (a k-sample). We show that the tree generated by the selected tips is a mixture of Bernoulli sampled CPPs with the same parent CPP, over some explicit distribution of the sampling probability y. An immediate consequence is that the genealogy of a k-sample can be obtained by the realization of k random variables: first the random sampling probability Y and then the k-1 node depths, which are iid conditional on Y=y. Copyright © 2018. Published by Elsevier Inc.

  5. Characteristics of health-promoting schools from Iranian adolescents' point of view.

    PubMed

    Shahhosseini, Zohreh; Simbar, Masoumeh; Ramezankhani, Ali

    2016-05-01

    Although characteristics of health-promoting schools are mentioned in the World Health Organization guidelines, different countries need to design more detailed indicators for assessing these schools according to their social and cultural context. The aim of this study was to investigate characteristics of health-promoting schools from Iranian adolescent girls' point of view. In this cross-sectional study, 2010 female middle school and high school adolescents were selected from randomly selected schools in Mazandaran province, Iran. They completed a self-administered questionnaire about their views on the characteristics of health-promoting schools. Data were analyzed using descriptive statistics and an independent t-test. From the Iranian adolescents' point of view, the most important feature of health-promoting schools was schools with no stressful exams, where students are notified kindly of their mistakes. The results suggest that there is a need for more measurable standards of health-promoting schools based on the socio-cultural context of both developing and developed countries.

  6. Photobiomodulation in the Prevention of Tooth Sensitivity Caused by In-Office Dental Bleaching. A Randomized Placebo Preliminary Study.

    PubMed

    Calheiros, Andrea Paiva Corsetti; Moreira, Maria Stella; Gonçalves, Flávia; Aranha, Ana Cecília Correa; Cunha, Sandra Ribeiro; Steiner-Oliveira, Carolina; Eduardo, Carlos de Paula; Ramalho, Karen Müller

    2017-08-01

    To analyze the effect of photobiomodulation in the prevention of tooth sensitivity after in-office dental bleaching. Tooth sensitivity is a common clinical consequence of dental bleaching, and therapies for the prevention of sensitivity have been investigated in the literature. This study was developed as a randomized, placebo-controlled, blinded clinical trial. Fifty patients were selected and randomly divided into five groups (n = 10): (1) control, (2) placebo, (3) laser before bleaching, (4) laser after bleaching, and (5) laser before and after bleaching. Irradiation was performed perpendicularly, in contact, on each tooth for 10 sec per point at two points. The first point was positioned in the middle of the tooth crown and the second in the periapical region. Photobiomodulation was applied using the following parameters: 780 nm, 40 mW, 10 J/cm², 0.4 J per point. Pain was analyzed before, immediately after, and on each of the seven days after bleaching. Patients were instructed to report pain using the scale: 0 = no tooth sensitivity, 1 = gentle sensitivity, 2 = moderate sensitivity, 3 = severe sensitivity. There were no statistical differences between groups at any time (p > 0.05). More studies, with other parameters and different methods of tooth sensitivity analysis, should be performed to complement the results found. Within the limitations of the present study, the laser parameters of photobiomodulation tested were not efficient in preventing tooth sensitivity after in-office bleaching.

  7. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
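
    To make the model structure concrete, the following sketch simulates a dynamic Poisson log-link series with first-order random-walk time-varying coefficients and scores naive one-week-ahead predictions with the mean absolute percentage error. All scales and the covariate are invented for illustration; the paper's actual MCMC fitting is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 120                                    # weeks
temp = rng.normal(size=T)                  # standardized weather covariate

# First-order random-walk time-varying coefficients (scales are invented)
alpha = np.cumsum(rng.normal(0.0, 0.05, T))        # calendar trend
beta = 0.3 + np.cumsum(rng.normal(0.0, 0.02, T))   # meteorological effect

lam = np.exp(3.0 + alpha + beta * temp)    # Poisson mean via the log link
cases = rng.poisson(lam)                   # simulated weekly dengue counts

def mape(y, yhat):
    """Mean absolute percentage error over weeks with nonzero counts."""
    m = y > 0
    return 100 * np.mean(np.abs((y[m] - yhat[m]) / y[m]))

# Naive one-week-ahead prediction: carry forward last week's fitted mean
print(f"one-week-ahead MAPE: {mape(cases[1:], lam[:-1]):.1f}%")
```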

  8. Random phase detection in multidimensional NMR.

    PubMed

    Maciejewski, Mark W; Fenwick, Matthew; Schuyler, Adam D; Stern, Alan S; Gorbatyuk, Vitaliy; Hoch, Jeffrey C

    2011-10-04

    Despite advances in resolution accompanying the development of high-field superconducting magnets, biomolecular applications of NMR require multiple dimensions in order to resolve individual resonances, and the achievable resolution is typically limited by practical constraints on measuring time. In addition to the need for measuring long evolution times to obtain high resolution, the need to distinguish the sign of the frequency constrains the ability to shorten measuring times. Sign discrimination is typically accomplished by sampling the signal with two different receiver phases or by selecting a reference frequency outside the range of frequencies spanned by the signal and then sampling at a higher rate. In the parametrically sampled (indirect) time dimensions of multidimensional NMR experiments, either method imposes an additional factor of 2 sampling burden for each dimension. We demonstrate that by using a single detector phase at each time sample point, but randomly altering the phase for different points, the sign ambiguity that attends fixed single-phase detection is resolved. Random phase detection enables a reduction in experiment time by a factor of 2 for each indirect dimension, amounting to a factor of 8 for a four-dimensional experiment, albeit at the cost of introducing sampling artifacts. Alternatively, for fixed measuring time, random phase detection can be used to double resolution in each indirect dimension. Random phase detection is complementary to nonuniform sampling methods, and their combination offers the potential for additional benefits. In addition to applications in biomolecular NMR, random phase detection could be useful in magnetic resonance imaging and other signal processing contexts.
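
    The sign ambiguity and its resolution can be checked numerically: with one fixed detector phase, +ω and -ω fit the sampled data equally well, while randomly alternating the phase between the two quadrature settings leaves only the true sign consistent with the data. A small sketch under arbitrary choices of sampling grid and frequency:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(64) * 0.01        # sample times (arbitrary grid)
w0 = 2 * np.pi * 25.0           # true (positive) frequency

def detect(phases, w):
    """One real measurement per time point, taken with receiver phase
    phases[k]: the projection Re[exp(-i*phase) * exp(i*w*t)]."""
    return np.cos(w * t - phases)

def misfit(data, phases, w):
    return np.sum((data - detect(phases, w)) ** 2)

fixed = np.zeros_like(t)                               # single fixed phase
random_ph = rng.choice([0.0, np.pi / 2], size=t.size)  # random phases

for name, ph in [("fixed", fixed), ("random", random_ph)]:
    d = detect(ph, w0)
    print(f"{name}: misfit(+w0)={misfit(d, ph, w0):.3f}, "
          f"misfit(-w0)={misfit(d, ph, -w0):.3f}")
# With a fixed phase, +w0 and -w0 fit equally well (sign ambiguity);
# with random phases, only the true sign gives zero misfit.
```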

  9. Applications of step-selection functions in ecology and conservation.

    PubMed

    Thurfjell, Henrik; Ciuti, Simone; Boyce, Mark S

    2014-01-01

    Recent progress in positioning technology facilitates the collection of massive amounts of sequential spatial data on animals. This has led to new opportunities and challenges when investigating animal movement behaviour and habitat selection. Tools like Step Selection Functions (SSFs) are relatively new powerful models for studying resource selection by animals moving through the landscape. SSFs compare environmental attributes of observed steps (the linear segment between two consecutive observations of position) with alternative random steps taken from the same starting point. SSFs have been used to study habitat selection, human-wildlife interactions, movement corridors, and dispersal behaviours in animals. SSFs also have the potential to depict resource selection at multiple spatial and temporal scales. There are several aspects of SSFs where consensus has not yet been reached such as how to analyse the data, when to consider habitat covariates along linear paths between observations rather than at their endpoints, how many random steps should be considered to measure availability, and how to account for individual variation. In this review we aim to address all these issues, as well as to highlight weak features of this modelling approach that should be developed by further research. Finally, we suggest that SSFs could be integrated with state-space models to classify behavioural states when estimating SSFs.
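
    The core construction of an SSF, pairing each observed step with random steps drawn from the same starting point, is straightforward to sketch. Here step lengths come from a gamma distribution and turning angles from a von Mises distribution, which are common but by no means mandatory choices; all parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_steps(start, heading, n, step_scale, kappa, rng):
    """Generate n random 'available' steps from one starting point by
    sampling step lengths (gamma) and turning angles (von Mises),
    as is commonly done when fitting step-selection functions."""
    lengths = rng.gamma(shape=2.0, scale=step_scale, size=n)
    turns = rng.vonmises(mu=0.0, kappa=kappa, size=n)  # relative turns
    angles = heading + turns
    dx, dy = lengths * np.cos(angles), lengths * np.sin(angles)
    return np.column_stack([start[0] + dx, start[1] + dy])

# Ten candidate end points to pair with one observed step
candidates = random_steps(start=(0.0, 0.0), heading=np.pi / 4,
                          n=10, step_scale=50.0, kappa=1.0, rng=rng)
print(candidates.round(1))
```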

  10. The National Mathematics Curriculum for BEP (Basic Education Programme) and the MDG (Millennium Development Goals) for Mathematics Teachers in Nigeria: Teachers' Perception and Readiness

    ERIC Educational Resources Information Center

    Ekwueme, Cecilia Olunwa; Meremikwu, Anne; Kalu, Nnenna

    2013-01-01

    The study used a survey design. The instruments were a teachers' questionnaire and an interview on awareness and readiness. The interview was administered to the different categories of respondents using a 4-point Likert scale. Two hundred mathematics teachers were randomly selected from 100 schools (public and private) using stratified random…

  11. The origin of biological macromolecules on the earth. The hypothesis of inorganic template

    NASA Technical Reports Server (NTRS)

    Lu, T. S.

    1977-01-01

    Studies about the origin of life are reviewed. The nonrandom organization of organelles is discussed from a structural and functional point of view. After postulating that the origin of biomacromolecules was not a random event, the paper develops the hypothesis that polypeptides and polynucleotides were formed on an inorganic template. Only information-containing structures can pass natural selection and develop through evolution.

  12. HIV Salvage Therapy Does Not Require Nucleoside Reverse Transcriptase Inhibitors: A Randomized, Controlled Trial.

    PubMed

    Tashima, Karen T; Smeaton, Laura M; Fichtenbaum, Carl J; Andrade, Adriana; Eron, Joseph J; Gandhi, Rajesh T; Johnson, Victoria A; Klingman, Karin L; Ritz, Justin; Hodder, Sally; Santana, Jorge L; Wilkin, Timothy; Haubrich, Richard H

    2015-12-15

    Nucleoside reverse transcriptase inhibitors (NRTIs) are often included in antiretroviral regimens in treatment-experienced patients in the absence of data from randomized trials. To compare treatment success between participants who omit versus those who add NRTIs to an optimized antiretroviral regimen of 3 or more agents. Multicenter, randomized, controlled trial. (ClinicalTrials.gov: NCT00537394). Outpatient HIV clinics. Treatment-experienced patients with HIV infection and viral resistance. Open-label optimized regimens (not including NRTIs) were selected on the basis of treatment history and susceptibility testing. Participants were randomly assigned to omit or add NRTIs. The primary efficacy outcome was regimen failure through 48 weeks using a noninferiority margin of 15%. The primary safety outcome was time to initial episode of a severe sign, symptom, or laboratory abnormality before discontinuation of NRTI assignment. 360 participants were randomly assigned, and 93% completed a 48-week visit. The cumulative probability of regimen failure was 29.8% in the omit-NRTIs group versus 25.9% in the add-NRTIs group (difference, 3.2 percentage points [95% CI, -6.1 to 12.5 percentage points]). No significant between-group differences were found in the primary safety end points or the proportion of participants with HIV RNA level less than 50 copies/mL. No deaths occurred in the omit-NRTIs group compared with 7 deaths in the add-NRTIs group. Limitations were the unblinded study design and potentially limited applicability to resource-poor settings. Treatment-experienced patients with HIV infection starting a new optimized regimen can safely omit NRTIs without compromising virologic efficacy. Omitting NRTIs will reduce pill burden, cost, and toxicity in this patient population. National Institute of Allergy and Infectious Diseases, Boehringer Ingelheim, Janssen, Merck, ViiV Healthcare, Roche, and Monogram Biosciences (LabCorp).

  13. Neutrality and evolvability of designed protein sequences

    NASA Astrophysics Data System (ADS)

    Bhattacherjee, Arnab; Biswas, Parbati

    2010-07-01

    The effect of foldability on a protein's evolvability is analyzed by a two-prong approach consisting of a self-consistent mean-field theory and Monte Carlo simulations. Theory and simulation models representing protein sequences with binary patterning of amino acid residues compatible with a particular foldability criterion are used. This generalized foldability criterion is derived using the high-temperature cumulant expansion approximating the free energy of folding. The effect of cumulative point mutations on these designed proteins is studied under neutral conditions. Robustness, the protein's ability to tolerate random point mutations, is determined with a selective pressure of stability (ΔΔG) for the theory-designed sequences, which are found to be more robust than the Monte Carlo and mean-field-biased Monte Carlo generated sequences. The results show that this foldability criterion selects viable protein sequences more effectively than the Monte Carlo method, which has a marked effect on how the selective pressure shapes the evolutionary sequence space. These observations may impact de novo sequence design and its applications in protein engineering.

  14. Effect of anger management education on mental health and aggression of prisoner women.

    PubMed

    Bahrami, Elaheh; Mazaheri, Maryam Amidi; Hasanzadeh, Akbar

    2016-01-01

    "Uncontrolled anger" threats the compatible and health of people as serious risk. The effects of weaknesses and shortcomings in the management of anger, from personal distress and destruction interpersonal relationships beyond and linked to the public health problems, lack of compromises, and aggressive behavior adverse outcomes. This study investigates the effects of anger management education on mental health and aggression of prisoner women in Isfahan. The single-group quasi-experimental (pretest, posttest) by prisoner women in the central prison of Isfahan was done. Multi-stage random sampling method was used. Initially, 165 women were selected randomly and completed the Buss and Perry Aggression Questionnaire and the General Health Questionnaire-28, and among these, those with scores >78 (the cut point) in aggression scale was selected and among them 70 were randomly selected. In the next step, interventions in four 90 min training sessions were conducted. Posttest was performed within 1-month after the intervention. Data were analyzed using SPSS-20 software. Data analysis showed that anger management training was effective in reducing aggression (P < 0.001) and also had a positive effect on mental health (P < 0.001). According to the importance of aggression in consistency and individual and collective health and according to findings, presented educational programs on anger management is essential for female prisoners.

  15. Species conservation profiles of a random sample of world spiders I: Agelenidae to Filistatidae

    PubMed Central

    Seppälä, Sini; Henriques, Sérgio; Draney, Michael L; Foord, Stefan; Gibbons, Alastair T; Gomez, Luz A; Kariko, Sarah; Malumbres-Olarte, Jagoba; Milne, Marc; Vink, Cor J

    2018-01-01

    Background: The IUCN Red List of Threatened Species is the most widely used information source on the extinction risk of species. One of the uses of the Red List is to evaluate and monitor the state of biodiversity and a possible approach for this purpose is the Red List Index (RLI). For many taxa, mainly hyperdiverse groups, it is not possible within available resources to assess all known species. In such cases, a random sample of species might be selected for assessment and the results derived from it extrapolated for the entire group - the Sampled Red List Index (SRLI). With the current contribution and the three following papers, we intend to create the first point in time of a future spider SRLI encompassing 200 species distributed across the world. New information: A sample of 200 species of spiders were randomly selected from the World Spider Catalogue, an updated global database containing all recognised species names for the group. The 200 selected species were divided taxonomically at the family level and the families were ordered alphabetically. In this publication, we present the conservation profiles of 46 species belonging to the families alphabetically arranged between Agelenidae and Filistatidae, which encompassed Agelenidae, Amaurobiidae, Anyphaenidae, Araneidae, Archaeidae, Barychelidae, Clubionidae, Corinnidae, Ctenidae, Ctenizidae, Cyatholipidae, Dictynidae, Dysderidae, Eresidae and Filistatidae. PMID:29725239

  16. Species conservation profiles of a random sample of world spiders I: Agelenidae to Filistatidae.

    PubMed

    Seppälä, Sini; Henriques, Sérgio; Draney, Michael L; Foord, Stefan; Gibbons, Alastair T; Gomez, Luz A; Kariko, Sarah; Malumbres-Olarte, Jagoba; Milne, Marc; Vink, Cor J; Cardoso, Pedro

    2018-01-01

    The IUCN Red List of Threatened Species is the most widely used information source on the extinction risk of species. One of the uses of the Red List is to evaluate and monitor the state of biodiversity and a possible approach for this purpose is the Red List Index (RLI). For many taxa, mainly hyperdiverse groups, it is not possible within available resources to assess all known species. In such cases, a random sample of species might be selected for assessment and the results derived from it extrapolated for the entire group - the Sampled Red List Index (SRLI). With the current contribution and the three following papers, we intend to create the first point in time of a future spider SRLI encompassing 200 species distributed across the world. A sample of 200 species of spiders were randomly selected from the World Spider Catalogue, an updated global database containing all recognised species names for the group. The 200 selected species were divided taxonomically at the family level and the families were ordered alphabetically. In this publication, we present the conservation profiles of 46 species belonging to the families alphabetically arranged between Agelenidae and Filistatidae, which encompassed Agelenidae, Amaurobiidae, Anyphaenidae, Araneidae, Archaeidae, Barychelidae, Clubionidae, Corinnidae, Ctenidae, Ctenizidae, Cyatholipidae, Dictynidae, Dysderidae, Eresidae and Filistatidae.

  17. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    PubMed Central

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John BO

    2008-01-01

    Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors. PMID:18959785

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krityakierne, Tipaluck; Akhtar, Taimoor; Shoemaker, Christine A.

    This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are tabu with a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
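
    The non-dominated sorting at the heart of SOP can be sketched directly: each evaluated point carries two objectives, its function value and the negative of its minimum distance to the other evaluated points, and the first Pareto front supplies the centers. The quadratic test function and the choice of four centers below are arbitrary stand-ins, not the paper's benchmarks.

```python
import numpy as np

def nondominated_fronts(F):
    """Sort points into Pareto fronts, minimizing every column of F
    (n_points x n_objectives). Returns a list of index arrays."""
    remaining = list(range(F.shape[0]))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                            for j in remaining if j != i)]
        fronts.append(np.array(front))
        remaining = [i for i in remaining if i not in front]
    return fronts

# SOP's two objectives: the expensive function value and the *negative*
# minimum distance to other evaluated points (both minimized).
rng = np.random.default_rng(4)
X = rng.random((30, 2))                    # previously evaluated points
f = np.sum((X - 0.5) ** 2, axis=1)         # stand-in expensive function
D = np.linalg.norm(X[:, None] - X[None], axis=-1)
np.fill_diagonal(D, np.inf)
fronts = nondominated_fronts(np.column_stack([f, -D.min(axis=1)]))
centers = fronts[0][:4]                    # pick up to P = 4 centers
print("centers:", centers)
```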

  19. Re-starting smoking in the postpartum period after receiving a smoking cessation intervention: a systematic review.

    PubMed

    Jones, Matthew; Lewis, Sarah; Parrott, Steve; Wormall, Stephen; Coleman, Tim

    2016-06-01

    In pregnant smoking cessation trial participants, to estimate (1) among women abstinent at the end of pregnancy, the proportion who re-start smoking at time-points afterwards (primary analysis) and (2) among all trial participants, the proportion smoking at the end of pregnancy and at selected time-points during the postpartum period (secondary analysis). Trials identified from two Cochrane reviews plus searches of Medline and EMBASE. Twenty-seven trials were included. The included trials were randomized or quasi-randomized trials of within-pregnancy cessation interventions given to smokers who reported abstinence both at end of pregnancy and at one or more defined time-points after birth. Outcomes were validated biochemically and self-reported continuous abstinence from smoking and 7-day point prevalence abstinence. The primary random-effects meta-analysis used longitudinal data to estimate mean pooled proportions of re-starting smoking; a secondary analysis used cross-sectional data to estimate the mean proportions smoking at different postpartum time-points. Subgroup analyses were performed on biochemically validated abstinence. The pooled mean proportion re-starting at 6 months postpartum was 43% [95% confidence interval (CI) = 16-72%, I² = 96.7%] (11 trials, 571 abstinent women). The pooled mean proportion smoking at the end of pregnancy was 87% (95% CI = 84-90%, I² = 93.2%) and 94% (95% CI = 92-96%, I² = 88%) at 6 months postpartum (23 trials, 9262 trial participants). Findings were similar when using biochemically validated abstinence. In clinical trials of smoking cessation interventions during pregnancy only 13% are abstinent at term. Of these, 43% re-start by 6 months postpartum. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
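
    For readers unfamiliar with how such pooled proportions are produced, the sketch below implements DerSimonian-Laird random-effects pooling on the logit scale; the per-trial counts are invented for illustration, and the review's exact estimator may differ in detail.

```python
import numpy as np

def pool_proportions(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the
    logit scale (a minimal sketch, not the review's exact method)."""
    p = (events + 0.5) / (totals + 1.0)          # continuity-corrected
    theta = np.log(p / (1 - p))                  # logit proportions
    v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)
    w = 1 / v
    theta_fe = np.sum(w * theta) / np.sum(w)     # fixed-effect mean
    Q = np.sum(w * (theta - theta_fe) ** 2)      # heterogeneity statistic
    df = len(theta) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)                        # random-effects weights
    theta_re = np.sum(w_re * theta) / np.sum(w_re)
    return 1 / (1 + np.exp(-theta_re))           # back-transform

# Hypothetical per-trial counts of women re-starting by 6 months
events = np.array([12, 30, 9, 25])
totals = np.array([40, 60, 20, 55])
print(f"pooled proportion: {pool_proportions(events, totals):.2f}")
```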

  20. Stress reduction in the secondary prevention of cardiovascular disease: randomized, controlled trial of transcendental meditation and health education in Blacks.

    PubMed

    Schneider, Robert H; Grim, Clarence E; Rainforth, Maxwell V; Kotchen, Theodore; Nidich, Sanford I; Gaylord-King, Carolyn; Salerno, John W; Kotchen, Jane Morley; Alexander, Charles N

    2012-11-01

    Blacks have disproportionately high rates of cardiovascular disease. Psychosocial stress may contribute to this disparity. Previous trials on stress reduction with the Transcendental Meditation (TM) program have reported improvements in cardiovascular disease risk factors, surrogate end points, and mortality in blacks and other populations. This was a randomized, controlled trial of 201 black men and women with coronary heart disease who were randomized to the TM program or health education. The primary end point was the composite of all-cause mortality, myocardial infarction, or stroke. Secondary end points included the composite of cardiovascular mortality, revascularizations, and cardiovascular hospitalizations; blood pressure; psychosocial stress factors; and lifestyle behaviors. During an average follow-up of 5.4 years, there was a 48% risk reduction in the primary end point in the TM group (hazard ratio, 0.52; 95% confidence interval, 0.29-0.92; P=0.025). The TM group also showed a 24% risk reduction in the secondary end point (hazard ratio, 0.76; 95% confidence interval, 0.51-1.13; P=0.17). There were reductions of 4.9 mmHg in systolic blood pressure (95% confidence interval -8.3 to -1.5 mmHg; P=0.01) and anger expression (P<0.05 for all scales). Adherence was associated with survival. A selected mind-body intervention, the TM program, significantly reduced risk for mortality, myocardial infarction, and stroke in coronary heart disease patients. These changes were associated with lower blood pressure and psychosocial stress factors. Therefore, this practice may be clinically useful in the secondary prevention of cardiovascular disease. Clinical Trial Registration: URL: www.clinicaltrials.gov. Unique identifier: NCT01299935.

  1. Double-Blind, Placebo-Controlled, Randomized Phase III Trial Evaluating Pertuzumab Combined With Chemotherapy for Low Tumor Human Epidermal Growth Factor Receptor 3 mRNA-Expressing Platinum-Resistant Ovarian Cancer (PENELOPE).

    PubMed

    Kurzeder, Christian; Bover, Isabel; Marmé, Frederik; Rau, Joern; Pautier, Patricia; Colombo, Nicoletta; Lorusso, Domenica; Ottevanger, Petronella; Bjurberg, Maria; Marth, Christian; Barretina-Ginesta, Pilar; Vergote, Ignace; Floquet, Anne; Del Campo, Josep M; Mahner, Sven; Bastière-Truchot, Lydie; Martin, Nicolas; Oestergaard, Mikkel Z; Kiermaier, Astrid; Schade-Brittinger, Carmen; Polleis, Sandra; du Bois, Andreas; Gonzalez-Martin, Antonio

    2016-07-20

    The AGO-OVAR 2.29/ENGOT-ov14/PENELOPE prospectively randomized phase III trial evaluated the addition of pertuzumab to chemotherapy in patients with platinum-resistant ovarian carcinoma with low tumor human epidermal growth factor receptor 3 (HER3) mRNA expression. We report the results of the primary efficacy analysis. Eligible patients had ovarian carcinoma that progressed during or within 6 months of completing four or more platinum cycles, centrally tested low tumor HER3 mRNA expression (concentration ratio ≤ 2.81 by quantitative reverse transcriptase polymerase chain reaction on cobas z480 [Roche Molecular Diagnostics, Pleasanton, CA]), and no more than two prior lines of chemotherapy. After investigators' selection of the chemotherapy backbone (single-agent topotecan, weekly paclitaxel, or gemcitabine), patients were randomly assigned to also receive either placebo or pertuzumab (840-mg loading dose followed by 420 mg every 3 weeks). Stratification factors were selected chemotherapy, prior antiangiogenic therapy, and platinum-free interval. The primary end point was independent review committee-assessed progression-free survival (PFS). Additional end points included overall survival, investigator-assessed PFS, objective response rate, safety, patient-reported outcomes, and translational research. Overall, 156 patients were randomly assigned. Adding pertuzumab to chemotherapy did not significantly improve independent review committee-assessed PFS for the primary analysis (stratified hazard ratio, 0.74; 95% CI, 0.50 to 1.11; P = .14; median PFS, 4.3 months for pertuzumab plus chemotherapy v 2.6 months for placebo plus chemotherapy). Sensitivity analyses and secondary efficacy end point results were consistent with the primary analysis. The effect on PFS favoring pertuzumab was more pronounced in the gemcitabine and paclitaxel cohorts. No new safety signals were seen. Although the primary objective was not met, subgroup analyses showed trends in PFS favoring pertuzumab in the gemcitabine and paclitaxel cohorts, meriting further exploration of pertuzumab in ovarian cancer. © 2016 by American Society of Clinical Oncology.

  2. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    NASA Astrophysics Data System (ADS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-03-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended to the full suite of experiments available to modern NMR spectroscopy, allowing resolution enhancements for all indirect dimensions; alone or in combination with NUS, RQD can be used to improve experimental resolution, or shorten experiment times, of considerable benefit to the challenging applications undertaken by modern NMR.
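
    The CS reconstruction underlying this family of methods can be illustrated with a toy problem: recover a sparse spectrum from a non-uniform subset of time samples by ℓ1-regularized least squares, solved here with plain iterative soft thresholding (ISTA). Grid size, sampling level, and regularization weight are arbitrary choices, and the published work uses more sophisticated solvers.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 256                                    # frequency-grid size
k = np.sort(rng.choice(N, size=64, replace=False))   # NUS time points

# Measurement operator: x(t_k) = sum_f s[f] * exp(2j*pi*f*k/N)
A = np.exp(2j * np.pi * np.outer(k, np.arange(N)) / N) / np.sqrt(N)

s_true = np.zeros(N, complex)
s_true[[30, 70, 200]] = [1.0, 0.6, 0.8]    # sparse "spectrum"
y = A @ s_true                             # undersampled time-domain data

def ista(A, y, lam=0.01, iters=500):
    """Iterative soft thresholding for
    min_s 0.5*||A s - y||^2 + lam*||s||_1 (complex-valued)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of gradient
    s = np.zeros(A.shape[1], complex)
    for _ in range(iters):
        g = s - (A.conj().T @ (A @ s - y)) / L    # gradient step
        mag = np.abs(g)
        s = g / np.maximum(mag, 1e-12) * np.maximum(mag - lam / L, 0)
    return s

s_hat = ista(A, y)
print("recovered peaks:", np.flatnonzero(np.abs(s_hat) > 0.1))
```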

  3. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation.

    NASA Astrophysics Data System (ADS)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.

    2016-12-01

    The impact of climate change has been observed throughout the globe. Ecosystems experience rapid changes such as vegetation shifts and species extinctions. In this context, the Species Distribution Model (SDM) is a popular method to project the impact of climate change on ecosystems. An SDM is based on the niche of a given species, which means that presence point data are essential for characterizing the species' biological niche. Running SDMs for plants requires certain considerations regarding the characteristics of vegetation. Normally, vegetation data over large areas are produced using remote sensing techniques. In other words, the exact locations of presence data carry high uncertainties, because presence data sets are selected from polygon and raster data. Thus, sampling methods for modeling vegetation presence data should be carefully selected. In this study, we used three different sampling methods for the selection of vegetation presence data: random sampling, stratified sampling, and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modeling. At the same time, we included BioCLIM variables and other environmental variables as input data. As a result, despite differences among the 10 SDMs, the sampling methods showed differences in ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. This study shows that the uncertainties arising from presence data sampling methods and SDMs can be quantified.

  4. An enhanced deterministic K-Means clustering algorithm for cancer subtype prediction from gene expression data.

    PubMed

    Nidheesh, N; Abdul Nazeer, K A; Ameer, P M

    2017-12-01

    Clustering algorithms with steps involving randomness usually give different results on different executions for the same dataset. This non-deterministic nature of algorithms such as the K-Means clustering algorithm limits their applicability in areas such as cancer subtype prediction using gene expression data. It is hard to sensibly compare the results of such algorithms with those of other algorithms. The non-deterministic nature of K-Means is due to its random selection of data points as initial centroids. We propose an improved, density-based version of K-Means, which involves a novel and systematic method for selecting initial centroids. The key idea of the algorithm is to select as the initial centroids data points which belong to dense regions and which are adequately separated in feature space. We compared the proposed algorithm to a set of eleven widely used single clustering algorithms and a prominent ensemble clustering algorithm used for cancer data classification, based on performance on ten cancer gene expression datasets. The proposed algorithm showed better overall performance than the others. There is a pressing need in the biomedical domain for simple, easy-to-use and more accurate machine learning tools for cancer subtype prediction. The proposed algorithm is simple, easy-to-use and gives stable results. Moreover, it provides comparatively better predictions of cancer subtypes from gene expression data. Copyright © 2017 Elsevier Ltd. All rights reserved.
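
    The idea of deterministic, density-based seeding can be sketched compactly: rank points by how many neighbours fall inside a tight ball around them, then greedily accept the densest points subject to a separation constraint. The radius and separation thresholds below are ad hoc choices for illustration, not the paper's exact procedure.

```python
import numpy as np

def density_based_centroids(X, k):
    """Deterministically pick k initial centroids: prefer points in
    dense regions while forcing the chosen points to be well separated.
    A simple sketch of the idea, not the published algorithm."""
    D = np.linalg.norm(X[:, None] - X[None], axis=-1)
    med = np.median(D)
    density = (D < 0.1 * med).sum(axis=1)      # neighbours in a tight ball
    order = np.argsort(-density)               # densest candidates first
    chosen = [order[0]]
    for i in order[1:]:
        if len(chosen) == k:
            break
        if D[i, chosen].min() > 0.5 * med:     # keep centroids apart
            chosen.append(i)
    return X[chosen]

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (0, 5, 10)])
print(density_based_centroids(X, k=3))   # one seed per synthetic cluster
```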

  5. Primary School Teachers' Inspection in Turkey: Primary School Teachers' Expectations about Inspectors' Guidance Roles and the Realisation Level of These Expectations

    ERIC Educational Resources Information Center

    Polat, Soner; Ugurlu, Celal Teyyar

    2008-01-01

    The aim of this research is to point out primary school teachers' expectations about inspectors' guidance roles and the realisation level of these expectations. The data for this research, which follows a descriptive survey model, were collected from the views of primary school teachers randomly selected from Balikesir, Batman and Hatay.…

  6. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km²-sized river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations was done using ESRI software (ArcGIS) extended by Hawth's Tools and later its replacement, the Geospatial Modelling Environment (GME). 88% of all desired points could actually be reached in the field and were successfully sampled. Our results indicate that the sampled calibration and validation sets are representative of each other and could be successfully used as interpolation data for spatial prediction purposes. With respect to soil textural fractions, for instance, equal multivariate means and variance homogeneity were found for the two datasets, as evidenced by nonsignificant (P > 0.05) Hotelling T²-test (2.3 with df1 = 3, df2 = 193) and Bartlett's test statistics (6.4 with df = 6). The multivariate prediction of clay, silt and sand content using a neural network residual cokriging approach reached explained variance levels of 56%, 47% and 63%. Thus, the presented case study is a successful example of considering readily available continuous information on soil forming factors such as geology and relief as stratifying variables for designing sampling schemes in digital soil mapping projects.
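
    The quantile-based stratification is easy to emulate: bin each land-surface parameter by its quantiles, cross the bins with the geological units, and draw the first-stage sample from every resulting stratum. The sketch below uses three quantile classes per parameter (giving up to 36 strata rather than the 30 of the study) and entirely synthetic covariates.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000                                   # raster cells / candidate sites
twi = rng.gamma(2.0, 2.0, n)               # topographic wetness index
rad = rng.normal(1200, 200, n)             # potential solar radiation
geo = rng.integers(0, 4, n)                # four geological units

def quantile_class(x, q=3):
    """Bin a continuous covariate into q classes by its quantiles."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1)[1:-1])
    return np.digitize(x, edges)

# Stratum = combination of the two land-surface classes and geology
strata = (quantile_class(twi) * 3 + quantile_class(rad)) * 4 + geo
print("number of strata:", np.unique(strata).size)   # up to 3*3*4 = 36

# First-stage selection: a few sites at random from every stratum
sample = [rng.choice(np.flatnonzero(strata == s), size=2, replace=False)
          for s in np.unique(strata)]
print("first-stage sample size:", sum(map(len, sample)))
```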

  7. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations.

    PubMed

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    To evaluate the benefit-risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P = 0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P = ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P = 0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P = 0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel function (68.0% vs 72.2%), reported no central nervous system side effects (91.4% vs 89.5%), or completed the 12-week evaluation period without any TEAE-related treatment discontinuations (93.0% vs 92.5%) were similar for both index medications (P = ns for each comparison). In daily practice, the BRP of OXN proved to be noninferior to that of TAP in patients with cLBP-NC, but showed a superior efficacy if stricter analgesic response definitions were evaluated.

  8. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations

    PubMed Central

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    Objective To evaluate the benefit–risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. Methods This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Results Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P = 0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P = ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P = 0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P = 0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel function (68.0% vs 72.2%), reported no central nervous system side effects (91.4% vs 89.5%), or completed the 12-week evaluation period without any TEAE-related treatment discontinuations (93.0% vs 92.5%) were similar for both index medications (P = ns for each comparison). Conclusion In daily practice, the BRP of OXN proved to be noninferior to that of TAP in patients with cLBP-NC, but showed a superior efficacy if stricter analgesic response definitions were evaluated. PMID:27881925

  9. Construction of random sheared fosmid library from Chinese cabbage and its use for Brassica rapa genome sequencing project.

    PubMed

    Park, Tae-Ho; Park, Beom-Seok; Kim, Jin-A; Hong, Joon Ki; Jin, Mina; Seol, Young-Joo; Mun, Jeong-Hwan

    2011-01-01

    As a part of the Multinational Genome Sequencing Project of Brassica rapa, linkage groups R9 and R3 were sequenced using a bacterial artificial chromosome (BAC) by BAC strategy. The current physical contigs are expected to cover approximately 90% of the euchromatin of both chromosomes. As the project progresses, BAC selection for sequence extension becomes more limited because BAC libraries are restriction enzyme-specific. To support the project, a random sheared fosmid library was constructed. The library consists of 97536 clones with an average insert size of approximately 40 kb, corresponding to seven genome equivalents, assuming a Chinese cabbage genome size of 550 Mb. The library was screened with primers designed at the ends of sequences at nine scaffold gaps where BAC clones could not be selected to extend the physical contigs. The selected positive clones were end-sequenced to check the overlap between the fosmid clones and the adjacent BAC clones. Nine fosmid clones were selected and fully sequenced. The sequences revealed two completed gap fillings and seven sequence extensions, which can be used for further selection of BAC clones, confirming that the fosmid library will facilitate the sequence completion of B. rapa. Copyright © 2011. Published by Elsevier Ltd.

  10. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    PubMed

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity for cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90 min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90 min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220 min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  11. Protocol: Testing the Relevance of Acupuncture Theory in the Treatment of Myofascial Pain in the Upper Trapezius Muscle.

    PubMed

    Elsdon, Dale S; Spanswick, Selina; Zaslawski, Chris; Meier, Peter C

    2017-01-01

    A protocol for a prospective single-blind parallel four-arm randomized placebo-controlled trial with repeated measures was designed to test the effects of various acupuncture methods compared with sham. Eighty self-selected participants with myofascial pain in the upper trapezius muscle were randomized into four groups. Group 1 received acupuncture to a myofascial trigger point (MTrP) in the upper trapezius. Group 2 received acupuncture to the MTrP in addition to relevant distal points. Group 3 received acupuncture to the relevant distal points only. Group 4 received a sham treatment to both the MTrP and distal points using a deactivated acupuncture laser device. Treatment was applied four times within 2 weeks with outcomes measured throughout the trial and at 2 weeks and 4 weeks posttreatment. Outcome measurements were a 100-mm visual analog pain scale, SF-36, pressure pain threshold, Neck Disability Index, the Upper Extremity Functional Index, lateral flexion in the neck, McGill Pain Questionnaire, Massachusetts General Hospital Acupuncture Sensation Scale, Working Alliance Inventory (short form), and the Credibility Expectance Questionnaire. Two-way analysis of variance (ANOVA) with repeated measures was used to assess the differences between groups. Copyright © 2017 Medical Association of Pharmacopuncture Institute. Published by Elsevier B.V. All rights reserved.

  12. Microhabitat selection of the Virginia Northern Flying Squirrel (Glaucomys sabrinus fuscus Miller) in the central Appalachians

    USGS Publications Warehouse

    Diggins, Corinne A.; Ford, W. Mark

    2017-01-01

    Glaucomys sabrinus fuscus (Virginia Northern Flying Squirrel; VNFS) is a rare sciurid that occurs in the Allegheny Mountains of eastern West Virginia and northwest Virginia. Previous work on this subspecies has confirmed close associations with Picea rubens (Red Spruce) at the landscape and stand levels in the region. However, ongoing Red Spruce restoration actions using canopy-gap creation to release single or small groups of trees require a better understanding of within-stand habitat selection by VNFS to assess potential short- and medium-term impacts. To address these questions, we conducted a microhabitat study using radio-collared squirrels in montane conifer and mixed conifer-hardwood stands. We used points obtained from telemetry surveys and randomly generated points within each squirrel's home range to compare microhabitat variables for 13 individuals. We found that VNFS preferentially selected plots with conifer-dominant overstories and deep organic-soil horizons. VNFS avoided plots with dense Red Spruce regeneration in the understory in stands with hardwood-dominated overstories, the types of areas targeted for Red Spruce restoration. We also opportunistically searched for hypogeal fungi at telemetry points and found 3 species of Elaphomyces during our surveys. Our results indicate that microhabitat selection is associated with Red Spruce-dominant forests. Efforts to restore Red Spruce where hardwoods dominate in the central Appalachians may improve the connectivity and extent of VNFS habitat.

  13. Research on sparse feature matching of improved RANSAC algorithm

    NASA Astrophysics Data System (ADS)

    Kong, Xiangsi; Zhao, Xian

    2018-04-01

    In this paper, a sparse feature matching method based on a modified RANSAC algorithm is proposed to improve precision and speed. Firstly, the feature points of the images are extracted using the SIFT algorithm. Then, the image pair is matched roughly by generating SIFT feature descriptors. At last, the precision of image matching is optimized by the modified RANSAC algorithm. The RANSAC algorithm is improved in three aspects: instead of the homography matrix, the fundamental matrix generated by the 8-point algorithm is used as the model; the sample is selected by a random block selection method, which ensures uniform distribution and accuracy; and a sequential probability ratio test (SPRT) is added on top of standard RANSAC, which cuts down the overall running time of the algorithm. The experimental results show that this method can not only achieve higher matching accuracy, but also greatly reduce computation and improve matching speed.
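
    A conventional version of this pipeline, SIFT features, ratio-test matching, and a RANSAC-estimated fundamental matrix as the model, can be assembled from standard OpenCV calls. The block-based sampling and SPRT modifications described above are not part of stock OpenCV, so the sketch below shows only the baseline; the image file names are hypothetical.

```python
import cv2
import numpy as np

img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
raw = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in raw if m.distance < 0.75 * n.distance]  # ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Fundamental-matrix model inside a RANSAC loop (threshold 1.0 px,
# confidence 0.99); `mask` flags the surviving inlier matches.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
print(f"{int(mask.sum())} inliers of {len(good)} tentative matches")
```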

  14. Random forests ensemble classifier trained with data resampling strategy to improve cardiac arrhythmia diagnosis.

    PubMed

    Ozçift, Akin

    2011-05-01

    Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling-strategy-based Random Forests (RF) ensemble classifier to improve the diagnosis of cardiac arrhythmia. Random Forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree from a classification performance point of view. In general, multiclass datasets having unbalanced distributions of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset, with multiple classes of small sample sizes, and it is therefore adequate for testing our resampling-based training strategy. The dataset contains 452 samples in fourteen types of arrhythmias, and eleven of these classes have sample sizes of less than 15. Our diagnosis strategy consists of two parts: (i) a correlation-based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset; (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling, to assess the efficiency of the proposed training strategy. The resultant accuracy of the classifier is 90.0%, which is a quite high diagnosis performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of the experiments demonstrate the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
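
    Both parts of the strategy, correlation-based feature selection and resampled RF training, can be sketched with scikit-learn on synthetic unbalanced data. The correlation ranking and per-class oversampling below approximate, rather than reproduce, the paper's exact procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# Synthetic unbalanced multiclass data as a stand-in for the arrhythmia set
X, y = make_classification(n_samples=450, n_features=40, n_informative=10,
                           n_classes=3, weights=[0.7, 0.2, 0.1],
                           random_state=0)

# (i) keep the features most correlated with the class label
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.argsort(-corr)[:15]
X = X[:, keep]

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# (ii) simple random resampling so every class reaches the majority size
parts = [resample(Xtr[ytr == c], ytr[ytr == c],
                  n_samples=np.bincount(ytr).max(), random_state=0)
         for c in np.unique(ytr)]
Xb = np.vstack([p[0] for p in parts])
yb = np.concatenate([p[1] for p in parts])

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xb, yb)
print(f"test accuracy: {rf.score(Xte, yte):.3f}")
```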

  15. Mapping risk for nest predation on a barrier island

    USGS Publications Warehouse

    Hackney, Amanda D.; Baldwin, Robert F.; Jodice, Patrick G.R.

    2013-01-01

    Barrier islands and coastal beach systems provide nesting habitat for marine and estuarine turtles. Densely settled coastal areas may subsidize nest predators. Our purpose was to inform conservation by providing a greater understanding of habitat-based risk factors for nest predation, for an estuarine turtle. We expected that habitat conditions at predated nests would differ from random locations at two spatial extents. We developed and validated an island-wide model for the distribution of predated Diamondback terrapin nests using locations of 198 predated nests collected during exhaustive searches at Fisherman Island National Wildlife Refuge, USA. We used aerial photographs to identify all areas of possible nesting habitat and searched each and surrounding environments for nests, collecting location and random-point microhabitat data. We built models for the probability of finding a predated nest using an equal number of random points and validated them with a reserve set (N = 67). Five variables in 9 a priori models were used and the best selected model (AIC weight 0.98) reflected positive associations with sand patches near marshes and roadways. Model validation had an average capture rate of predated nests of 84.14 % (26.17–97.38 %, Q1 77.53 %, median 88.07 %, Q3 95.08 %). Microhabitat selection results suggest that nests placed at the edges of sand patches adjacent to upland shrub/forest and marsh systems are vulnerable to predation. Forests and marshes provide cover and alternative resources for predators and roadways provide access; a suggestion is to focus nest protection efforts on the edges of dunes, near dense vegetation and roads.

  16. Purchasing Behavior and Calorie Information at Fast-Food Chains in New York City, 2007

    PubMed Central

    Bassett, Mary T.; Dumanovsky, Tamara; Huang, Christina; Silver, Lynn D.; Young, Candace; Nonas, Cathy; Matte, Thomas D.; Chideya, Sekai; Frieden, Thomas R.

    2008-01-01

    We surveyed 7318 customers from 275 randomly selected restaurants of 11 fast-food chains. Participants purchased a mean of 827 calories, with 34% purchasing 1000 calories or more. Unlike other chains, Subway posted calorie information at the point of purchase, and its patrons more often reported seeing calorie information than patrons of other chains (32% vs 4%; P<.001); Subway patrons who saw calorie information purchased 52 fewer calories than did other Subway patrons (P<.01). Fast-food chains should display calorie information prominently at the point of purchase, where it can be seen and used to inform purchases. PMID:18556597

  17. Prophylactic Cranial Irradiation in Extensive Disease Small-Cell Lung Cancer: Short-Term Health-Related Quality of Life and Patient Reported Symptoms—Results of an International Phase III Randomized Controlled Trial by the EORTC Radiation Oncology and Lung Cancer Groups

    PubMed Central

    Slotman, Berend J.; Mauer, Murielle E.; Bottomley, Andrew; Faivre-Finn, Corinne; Kramer, Gijs W.P.M.; Rankin, Elaine M.; Snee, Michael; Hatton, Matthew; Postmus, Pieter E.; Collette, Laurence; Senan, Suresh

    2009-01-01

    Purpose Prophylactic cranial irradiation (PCI) in patients with extensive-disease small-cell lung cancer (ED-SCLC) leads to significantly fewer symptomatic brain metastases and improved survival. Detailed effects of PCI on health-related quality of life (HRQOL) are reported here. Patients and Methods Patients (age, 18 to 75 years; WHO ≤ 2) with ED-SCLC, and any response to chemotherapy, were randomly assigned to either observation or PCI. Health-related quality of life (HRQOL) and patient-reported symptoms were secondary end points. The European Organisation for the Research and Treatment of Cancer core HRQOL tool (Quality of Life Questionnaire C30) and brain module (Quality of Life Questionnaire Brain Cancer Module) were used to collect self-reported patient data. Six HRQOL scales were selected as primary HRQOL end points: global health status; hair loss; fatigue; and role, cognitive and emotional functioning. Assessments were performed at random assignment, 6 weeks, 3 months, and then 3-monthly up to 1 year and 6-monthly thereafter. Results Compliance with the HRQOL assessment was 93.7% at baseline and dropped to 60% at 6 weeks. Short-term results up to 3 months showed a negative impact of PCI on selected HRQOL scales. The largest mean difference between the two arms was observed for fatigue and hair loss. The impact of PCI on global health status as well as on functioning scores was more limited. For global health status, the observed mean difference was eight points on a scale of 0 to 100 at 6 weeks (P = .018) and 3 months (P = .055). Conclusion PCI should be offered to all responding ED-SCLC patients. Patients should be informed of the potential adverse effects from PCI. Clinicians should be alert to these, monitor their patients, and offer appropriate support and clinical and psychosocial care. PMID:19047288

  18. Consumer assessment of beef tenderloin steaks from various USDA quality grades at 3 degrees of doneness.

    PubMed

    O'Quinn, Travis G; Brooks, J Chance; Miller, Markus F

    2015-02-01

    A consumer study was conducted to determine palatability ratings of beef tenderloin steaks from USDA Choice, USDA Select, and USDA Select with marbling scores from Slight 50 to 100 (USDA High Select) cooked to various degrees of doneness. Steaks were randomly assigned to 1 of 3 degree of doneness categories: very-rare, medium-rare, or well-done. Consumers (N = 315) were screened for preference of degree of doneness and fed 4 samples of their preferred doneness (a warm-up and one from each USDA quality grade treatment in a random order). Consumers evaluated steaks on an 8-point verbally anchored hedonic scale for tenderness, juiciness, flavor, and overall like as well as rated steaks as acceptable or unacceptable for all palatability traits. Quality grade had no effect (P > 0.05) on consumer ratings for tenderness, juiciness, flavor, and overall like scores, with all traits averaging above a 7 ("like very much") on the 8-point scale. In addition, no differences (P > 0.05) were found in the percentage of samples rated as acceptable for all palatability traits, with more than 94% of samples rated acceptable for each trait in all quality grades evaluated. Steaks cooked to well-done had lower (P < 0.05) juiciness scores than steaks cooked to very-rare or medium-rare and were rated lower for tenderness (P < 0.05) than steaks cooked to a very-rare degree of doneness. Results indicate consumers were not able to detect differences in tenderness, juiciness, flavor, or overall like among beef tenderloin steaks from USDA Choice and Select quality grades. © 2015 Institute of Food Technologists®

  19. Inference from clustering with application to gene-expression microarrays.

    PubMed

    Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M

    2002-01-01

    There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
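
    A compact sketch of the evaluation loop the toolbox performs, assuming NumPy and scikit-learn in place of the authors' toolbox: sample profiles as process means plus independent noise, cluster them, and count points assigned inconsistently with their generating process (minimized over label matchings).

      import numpy as np
      from itertools import permutations
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      means = np.array([[0, 1, 2, 1, 0],      # hypothetical profile of process A
                        [2, 1, 0, 1, 2]])     # hypothetical profile of process B
      labels_true = np.repeat([0, 1], 50)
      X = means[labels_true] + rng.normal(scale=0.8, size=(100, 5))

      pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

      # Clustering error: minimum misclassifications over all label matchings.
      err = min(np.sum(pred != np.array([p[t] for t in labels_true]))
                for p in permutations(range(2)))
      print("clustering error rate: %.2f" % (err / len(labels_true)))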

  20. Advanced analysis of forest fire clustering

    NASA Astrophysics Data System (ADS)

    Kanevski, Mikhail; Pereira, Mario; Golay, Jean

    2017-04-01

    Analysis of point pattern clustering is an important topic in spatial statistics and for many applications: biodiversity, epidemiology, natural hazards, geomarketing, etc. There are several fundamental approaches used to quantify spatial data clustering using topological, statistical and fractal measures. In the present research, the recently introduced multi-point Morisita index (mMI) is applied to study the spatial clustering of forest fires in Portugal. The data set consists of more than 30,000 fire events covering the time period from 1975 to 2013. The distribution of forest fires is very complex and highly variable in space. mMI is a multi-point extension of the classical two-point Morisita index. In essence, mMI is estimated by covering the region under study with a grid and by computing how many times more likely it is that m points selected at random will be from the same grid cell than it would be for a completely random Poisson process. By changing the number of grid cells (the size of the grid cells), mMI characterizes the scaling properties of spatial clustering. From mMI, the intrinsic dimension (fractal dimension) of the point distribution can be estimated as well. In this study, the mMI of forest fires is compared with the mMI of random patterns (RPs) generated within the validity domain defined as the forest area of Portugal. It turns out that the forest fires are highly clustered inside the validity domain in comparison with the RPs. Moreover, they demonstrate different scaling properties at different spatial scales. The results obtained from the mMI analysis are also compared with those of fractal measures of clustering, namely the box counting and sand box counting approaches. References: Golay J., Kanevski M., Vega Orozco C., Leuenberger M., 2014: The multipoint Morisita index for the analysis of spatial patterns. Physica A, 406, 191-202. Golay J., Kanevski M., 2015: A new estimator of intrinsic dimension based on the multipoint Morisita index. Pattern Recognition, 48, 4070-4081.
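
    The grid-based estimate described here is short enough to sketch. The code below follows the m-point formula of Golay et al. (2014), with a synthetic point pattern rather than the fire data.

      import numpy as np

      def multipoint_morisita(points, n_cells_per_axis, m=2):
          """m-point Morisita index on a square grid with Q cells."""
          counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                        bins=n_cells_per_axis)
          n = counts.ravel()
          N, Q = n.sum(), n.size
          # Falling factorials n(n-1)...(n-m+1) summed over cells, normalized
          # by N(N-1)...(N-m+1), scaled by Q^(m-1).
          num = sum(np.prod([n - k for k in range(m)], axis=0))
          den = np.prod([N - k for k in range(m)])
          return Q ** (m - 1) * num / den

      rng = np.random.default_rng(1)
      clustered = rng.normal([0.3, 0.7], 0.05, size=(2000, 2))  # clustered pattern
      uniform = rng.random((2000, 2))                           # Poisson-like
      for delta in (4, 8, 16, 32):   # grid resolutions probe the scaling
          print(delta, multipoint_morisita(clustered, delta, m=3),
                multipoint_morisita(uniform, delta, m=3))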

  1. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely, a landslide inventory and the factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslides, the nature of the triggers, and the LIF, the accuracy of QLSM methods differs. Moreover, how to balance the number of 0s (nonoccurrence) and 1s (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1s and 0s to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been investigated extensively in the literature, the challenge of training set construction has not been adequately investigated. To tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances, to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1s and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses, with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-Whole Data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.
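
    A hedged sketch of the three strategies on synthetic coordinates; the sample sizes, radius, and landslide prevalence are illustrative stand-ins, not the study's parameters.

      import numpy as np

      rng = np.random.default_rng(0)
      coords = rng.random((5000, 2))               # candidate cell centres
      landslide = rng.random(5000) < 0.05          # 1 = occurrence, 0 = absence

      def prs(n):
          """Proportional random sampling: keep the observed 1/0 ratio."""
          ones, zeros = np.where(landslide)[0], np.where(~landslide)[0]
          k1 = int(round(n * float(landslide.mean())))
          return np.concatenate([rng.choice(ones, k1, replace=False),
                                 rng.choice(zeros, n - k1, replace=False)])

      def nearby(n_seeds, radius=0.02, landslides_only=False):
          """NNS (any random sites) or SNS (landslide sites only): random
          seeds plus every candidate within `radius` of a seed."""
          pool = np.where(landslide)[0] if landslides_only else np.arange(5000)
          seeds = rng.choice(pool, n_seeds, replace=False)
          d = np.linalg.norm(coords[:, None] - coords[seeds], axis=2)
          return np.unique(np.where((d < radius).any(axis=1))[0])

      print(len(prs(500)), len(nearby(50)), len(nearby(50, landslides_only=True)))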

  2. Presenting Cost and Efficiency Measures That Support Consumers to Make High-Value Health Care Choices.

    PubMed

    Greene, Jessica; Sacks, Rebecca M

    2018-02-25

    To identify approaches to presenting cost and resource use measures that support consumers in selecting high-value hospitals. Survey data were collected from U.S. employees of Analog Devices (n = 420). In two online experiments, participants viewed comparative data on four hospitals. In one experiment, participants were randomized to view one of five versions of the same comparative cost data, and in the other experiment they viewed different versions of the same readmissions data. Bivariate and multivariate analyses examined whether presentation approach was related to selecting the high-value hospital. Consumers were approximately 16 percentage points more likely to select a high-value hospital when cost data were presented using actual dollar amounts or using the word "affordable" to describe low-cost hospitals, compared to when the Hospital Compare spending ratio was used. Consumers were 33 points more likely to select the highest performing hospital when readmission performance was shown using word icons rather than percentages. Presenting cost and resource use measures effectively to consumers is challenging. This study suggests using actual dollar amounts for cost, but presenting performance on readmissions using evaluative symbols. © Health Research and Educational Trust.

  3. Recent wetland land loss due to hurricanes: improved estimates based upon multiple source images

    USGS Publications Warehouse

    Kranenburg, Christine J.; Palaseanu-Lovejoy, Monica; Barras, John A.; Brock, John C.; Wang, Ping; Rosati, Julie D.; Roberts, Tiffany M.

    2011-01-01

    The objective of this study was to provide a moderate resolution 30-m fractional water map of the Chenier Plain for 2003, 2006 and 2009 by using information contained in high-resolution satellite imagery of a subset of the study area. Indices and transforms pertaining to vegetation and water were created using the high-resolution imagery, and a threshold was applied to obtain a categorical land/water map. The high-resolution data was used to train a decision-tree classifier to estimate percent water in a lower resolution (Landsat) image. Two new water indices based on the tasseled cap transformation were proposed for IKONOS imagery in wetland environments and more than 700 input parameter combinations were considered for each Landsat image classified. Final selection and thresholding of the resulting percent water maps involved over 5,000 unambiguous classified random points using corresponding 1-m resolution aerial photographs, and a statistical optimization procedure to determine the threshold at which the maximum Kappa coefficient occurs. Each selected dataset has a Kappa coefficient, percent correctly classified (PCC) water, land and total greater than 90%. An accuracy assessment using 1,000 independent random points was performed. Using the validation points, the PCC values decreased to around 90%. The time series change analysis indicated that due to Hurricane Rita, the study area lost 6.5% of marsh area, and transient changes were less than 3% for either land or water. Hurricane Ike resulted in an additional 8% land loss, although not enough time has passed to discriminate between persistent and transient changes.
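
    The threshold-selection step lends itself to a short sketch, assuming scikit-learn for the Kappa coefficient; the reference points and percent-water predictions below are synthetic stand-ins.

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(0)
      truth = rng.random(5000) < 0.4                 # reference land/water points
      # Noisy percent-water predictions correlated with the reference labels.
      pct_water = np.clip(truth * 0.6 + rng.random(5000) * 0.5, 0, 1)

      # Sweep thresholds and keep the one maximizing the Kappa coefficient.
      best = max((cohen_kappa_score(truth, pct_water >= t), t)
                 for t in np.linspace(0.05, 0.95, 19))
      print("max Kappa %.3f at threshold %.2f" % best)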

  4. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  5. Effect of anger management education on mental health and aggression of prisoner women

    PubMed Central

    Bahrami, Elaheh; Mazaheri, Maryam Amidi; Hasanzadeh, Akbar

    2016-01-01

    Background and Purpose: Uncontrolled anger poses a serious threat to people's adjustment and health. The effects of weaknesses and shortcomings in anger management extend from personal distress and the destruction of interpersonal relationships to public health problems, failure to compromise, and the adverse outcomes of aggressive behavior. This study investigates the effects of anger management education on the mental health and aggression of imprisoned women in Isfahan. Materials and Methods: A single-group quasi-experimental (pretest-posttest) study was conducted with women in the central prison of Isfahan. Multi-stage random sampling was used. Initially, 165 women were selected randomly and completed the Buss and Perry Aggression Questionnaire and the General Health Questionnaire-28; among these, those with scores >78 (the cut point) on the aggression scale were identified, and 70 of them were randomly selected. In the next step, the intervention was delivered in four 90-min training sessions. The posttest was performed 1 month after the intervention. Data were analyzed using SPSS-20 software. Results: Data analysis showed that anger management training was effective in reducing aggression (P < 0.001) and also had a positive effect on mental health (P < 0.001). Conclusion: Given the importance of aggression to individual and collective adjustment and health, and in light of these findings, educational programs on anger management are essential for female prisoners. PMID:27512697

  6. Randomized pilot trial of gene expression profiling versus heart biopsy in the first year after heart transplant: early invasive monitoring attenuation through gene expression trial.

    PubMed

    Kobashigawa, Jon; Patel, Jignesh; Azarbal, Babak; Kittleson, Michelle; Chang, David; Czer, Lawrence; Daun, Tiffany; Luu, Minh; Trento, Alfredo; Cheng, Richard; Esmailian, Fardad

    2015-05-01

    The endomyocardial biopsy (EMB) is considered the gold standard in rejection surveillance post cardiac transplant, but is invasive, with risk of complications. A previous trial suggested that the gene expression profiling (GEP) blood test was noninferior to EMB between 6 and 60 months post transplant. As most rejections occur in the first 6 months, we conducted a single-center randomized trial of GEP versus EMB starting at 55 days post transplant (when GEP is valid). Sixty heart transplant patients meeting inclusion criteria were randomized beginning at 55 days post transplant to either GEP or EMB arms. A positive GEP ≥30 between 2 and 6 months, or ≥34 after 6 months, prompted a follow-up biopsy. The primary end point included a composite of death/retransplant, rejection with hemodynamic compromise or graft dysfunction at 18 months post transplant. A coprimary end point included change in first-year maximal intimal thickness by intravascular ultrasound, a recognized surrogate for long-term outcome. Corticosteroid weaning was assessed in both the groups. The composite end point was similar between the GEP and EMB groups (10% versus 17%; log-rank P=0.44). The coprimary end point of first-year intravascular ultrasound change demonstrated no difference in mean maximal intimal thickness (0.35±0.36 versus 0.36±0.26 mm; P=0.944). Steroid weaning was successful in both the groups (91% versus 95%). In this pilot study, GEP starting at 55 days post transplant seems comparable with EMB for rejection surveillance in selected heart transplant patients and does not result in increased adverse outcomes. GEP also seems useful to guide corticosteroid weaning. Larger randomized trials are required to confirm these findings. URL: http://www.clinicaltrials.gov. Unique identifier: NCT014182482377. © 2015 American Heart Association, Inc.

  7. Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations

    NASA Astrophysics Data System (ADS)

    Mirloo, Mahsa; Ebrahimnezhad, Hosein

    2018-03-01

    In this paper, a novel method is proposed to detect 3D object salient points that is robust to isometric variations and stable against scaling and noise. Salient points can be used as representative points of an object's protrusion parts in order to improve object matching and retrieval algorithms. The proposed algorithm starts by determining the first salient point of the model based on the average geodesic distance of several random points. Then, given the previous salient points, a new point is added to this set in each iteration. Each time a salient point is added, the decision function is updated; this creates a condition for selecting the next point which prevents it from being extracted from the same protrusion part, so that a representative point is guaranteed to be drawn from every protrusion part. The method is stable against model variations under isometric transformations, scaling, and noise of different strengths, because it uses a feature robust to isometric variations and considers the relation between the salient points. In addition, the number of points used in the averaging process is decreased, which leads to lower computational complexity in comparison with other salient point detection algorithms.

  8. Variation in the annual unsatisfactory rates of selected pathogens and indicators in ready-to-eat food sampled from the point of sale or service in Wales, United Kingdom.

    PubMed

    Meldrum, R J; Garside, J; Mannion, P; Charles, D; Ellis, P

    2012-12-01

    The Welsh Food Microbiological Forum "shopping basket" survey is a long-running, structured surveillance program examining ready-to-eat food randomly sampled from the point of sale or service in Wales, United Kingdom. The annual unsatisfactory rates for selected indicators and pathogens for 1998 through 2008 were examined. All the annual unsatisfactory rates for the selected pathogens were <0.5%, and no pattern in the annual rate was observed. There was also no discernible trend in the annual rates of Listeria spp. (not monocytogenes), with all rates <0.5%. However, a trend was observed for Escherichia coli, with a decrease in rate between 1998 and 2003, rapid in the first few years, and then a gradual increase in rate up to 2008. It was concluded that there was no discernible pattern to the annual unsatisfactory rates for Listeria spp. (not monocytogenes), L. monocytogenes, Staphylococcus aureus, and Bacillus cereus, but that a definite trend was observed for E. coli.

  9. Bayesian dynamic modeling of time series of dengue disease case counts

    PubMed Central

    López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-01-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model’s short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models with constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model includes first-order random walk time-varying coefficients for the calendar trend and first-order random walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage error for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941

  10. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just a few hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
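
    An illustrative sketch of the idea, assuming NumPy: a spectral-representation sample whose phase angles all derive from two elementary random variables through a deterministic mapping. The particular PSD and mapping are assumptions for illustration, not the paper's random functions.

      import numpy as np

      def srm_sample(theta1, theta2, n_freq=256, dw=0.1,
                     t=np.linspace(0, 60, 2048)):
          w = (np.arange(n_freq) + 0.5) * dw
          S = 1.0 / (1.0 + w**4)                  # hypothetical target PSD
          # "Random function": map (theta1, theta2) to n_freq phase angles.
          phi = 2 * np.pi * np.modf(np.arange(1, n_freq + 1) * theta1 + theta2)[0]
          amp = np.sqrt(2 * S * dw)
          return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

      rng = np.random.default_rng(0)
      # Each sample function is fully determined by two elementary variables.
      samples = [srm_sample(rng.random(), rng.random()) for _ in range(8)]
      print(np.var(samples[0]))  # roughly the integral of the PSD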

  11. Transcranial direct current stimulation to primary motor area improves hand dexterity and selective attention in chronic stroke.

    PubMed

    Au-Yeung, Stephanie S Y; Wang, Juliana; Chen, Ye; Chua, Eldrich

    2014-12-01

    The aim of this study was to determine whether transcranial direct current stimulation (tDCS) applied to the primary motor hand area modulates hand dexterity and selective attention after stroke. This study was a double-blind, placebo-controlled, randomized crossover trial involving subjects with chronic stroke. Ten stroke survivors with some pinch strength in the paretic hand received three different tDCS interventions assigned in random order in separate sessions-anodal tDCS targeting the primary motor area of the lesioned hemisphere (M1lesioned), cathodal tDCS applied to the contralateral hemisphere (M1nonlesioned), and sham tDCS-each for 20 mins. The primary outcome measures were Purdue pegboard test scores for hand dexterity and response time in the color-word Stroop test for selective attention. Pinch strength of the paretic hand was the secondary outcome. Cathodal tDCS to M1nonlesioned significantly improved affected hand dexterity (by 1.1 points on the Purdue pegboard unimanual test, P = 0.014) and selective attention (0.6 secs faster response time on the level 3 Stroop interference test for response inhibition, P = 0.017), but not pinch strength. The outcomes were not improved with anodal tDCS to M1lesioned or sham tDCS. Twenty minutes of cathodal tDCS to M1nonlesioned can promote both paretic hand dexterity and selective attention in people with chronic stroke.

  12. Academic Freedom in Al Al-Bayt University and the Level of Practicing It from the View Point of the Faculty Members Based on Some Variables

    ERIC Educational Resources Information Center

    Al-Madi, Bayan

    2013-01-01

    The purpose of this study is to identify the level of practicing academic freedom by the faculty members of Al al-Bayt University. The study population included all the faculty members (297) of Al al-Bayt University, during the academic year, 2010/2011. The study sample was randomly selected and included 250 faculty members. To achieve the aims of…

  13. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR, Vol. 1 (2010-10-01), Telecommunication. Section 1.1602, Designation for random selection. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1602 Designation for random selection...

  14. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    47 CFR, Vol. 1 (2011-10-01), Telecommunication. Section 1.1602, Designation for random selection. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1602 Designation for random selection...

  15. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR, Vol. 1 (2010-10-01), Telecommunication. Section 1.1603, Conduct of random selection. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1603 Conduct of random selection. The...

  16. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    47 CFR, Vol. 1 (2011-10-01), Telecommunication. Section 1.1603, Conduct of random selection. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1603 Conduct of random selection. The...

  17. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    PubMed

    Bullen, A; Patel, S S; Saggau, P

    1997-07-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging.
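
    A back-of-envelope check of the quoted rates: with microsecond repositioning and a per-point dwell chosen for signal-to-noise, the frame rate over N selected points is 1/(N * (t_position + t_dwell)); the dwell value below is an assumption.

      # Illustrative values: 5 us AOD repositioning (text quotes 3-5 us) and a
      # hypothetical 15 us dwell per point for adequate signal-to-noise.
      t_pos, t_dwell = 5e-6, 15e-6
      for n_points in (10, 50, 100):
          rate = 1 / (n_points * (t_pos + t_dwell))
          print(n_points, "points ->", round(rate), "frames/s")
      # 10 points at 20 us each gives 5000 frames/s, consistent with the
      # "frame rates up to 5 kHz" quoted above.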

  18. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    PubMed Central

    Bullen, A; Patel, S S; Saggau, P

    1997-01-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging. PMID:9199810

  19. Learning a constrained conditional random field for enhanced segmentation of fallen trees in ALS point clouds

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2018-06-01

    In this study, we present a method for improving the quality of automatic single fallen tree stem segmentation in ALS data by applying a specialized constrained conditional random field (CRF). The entire processing pipeline is composed of two steps. First, short stem segments of equal length are detected and a subset of them is selected for further processing, while in the second step the chosen segments are merged to form entire trees. The first step is accomplished using the specialized CRF defined on the space of segment labelings, capable of finding segment candidates which are easier to merge subsequently. To achieve this, the CRF considers not only the features of every candidate individually, but incorporates pairwise spatial interactions between adjacent segments into the model. In particular, pairwise interactions include a collinearity/angular deviation probability which is learned from training data as well as the ratio of spatial overlap, whereas unary potentials encode a learned probabilistic model of the laser point distribution around each segment. Each of these components enters the CRF energy with its own balance factor. To process previously unseen data, we first calculate the subset of segments for merging on a grid of balance factors by minimizing the CRF energy. Then, we perform the merging and rank the balance configurations according to the quality of their resulting merged trees, obtained from a learned tree appearance model. The final result is derived from the top-ranked configuration. We tested our approach on 5 plots from the Bavarian Forest National Park using reference data acquired in a field inventory. Compared to our previous segment selection method without pairwise interactions, an increase in detection correctness and completeness of up to 7 and 9 percentage points, respectively, was observed.

  20. National Emphysema Treatment Trial redux: accentuating the positive.

    PubMed

    Sanchez, Pablo Gerardo; Kucharczuk, John Charles; Su, Stacey; Kaiser, Larry Robert; Cooper, Joel David

    2010-09-01

    Under the Freedom of Information Act, we obtained the follow-up data of the National Emphysema Treatment Trial (NETT) to determine the long-term outcome for "a heterogeneous distribution of emphysema with upper lobe predominance," postulated by the NETT hypothesis to be optimal candidates for lung volume reduction surgery. Using the NETT database, we identified patients with heterogeneous distribution of emphysema with upper lobe predominance and analyzed for the first time follow-up data for those receiving lung volume reduction surgery and those receiving medical management. Furthermore, we compared the results of the NETT reduction surgery group with a previously reported consecutive case series of 250 patients undergoing bilateral lung volume reduction surgery using similar selection criteria. Of the 1218 patients enrolled, 511 (42%) conformed to the NETT hypothesis selection criteria and received the randomly assigned surgical or medical treatment (surgical = 261; medical = 250). Lung volume reduction surgery resulted in a 5-year survival benefit (70% vs 60%; P = .02). Results at 3 years compared with baseline data favored surgical reduction in terms of residual volume reduction (25% vs 2%; P < .001), University of California San Diego dyspnea score (16 vs 0 points; P < .001), and improved St George Respiratory Questionnaire quality of life score (12 points vs 0 points; P < .001). For the 513 patients with a homogeneous pattern of emphysema randomized to surgical or medical treatment, lung volume reduction surgery produced no survival advantage and very limited functional benefit. Patients most likely to benefit from lung volume reduction surgery have heterogeneously distributed emphysema involving the upper lung zones predominantly. Such patients in the NETT trial had results nearly identical to those previously reported in a nonrandomized series of similar patients undergoing lung volume reduction surgery. 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  1. Mixing rates and limit theorems for random intermittent maps

    NASA Astrophysics Data System (ADS)

    Bahsoun, Wael; Bose, Christopher

    2016-04-01

    We study random transformations built from intermittent maps on the unit interval that share a common neutral fixed point. We focus mainly on random selections of Pomeau-Manneville-type maps T_α using, in general, the full parameter range 0 < α < ∞. We derive a number of results around a common theme that illustrates in detail how the constituent map that is fastest mixing (i.e. smallest α), combined with details of the randomizing process, determines the asymptotic properties of the random transformation. Our key result (theorem 1.1) establishes sharp estimates on the position of return time intervals for the quenched dynamics. The main applications of this estimate are to limit laws (in particular, CLT and stable laws, depending on the parameters chosen in the range 0 < α < 1) for the associated skew product; these are detailed in theorem 3.2. Since our estimates in theorem 1.1 also hold for 1 ≤ α < ∞, we study a second class of random transformations derived from piecewise affine Gaspard-Wang maps, prove existence of an infinite (σ-finite) invariant measure and study the corresponding correlation asymptotics. To the best of our knowledge, this latter kind of result is completely new in the setting of random transformations.
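
    A small numerical illustration of such random compositions, using the Liverani-Saussol-Vaienti form of the Pomeau-Manneville map and an arbitrary parameter pair; it only visualizes the intermittent behavior, not the paper's theorems.

      import numpy as np

      def lsv(x, alpha):
          """LSV intermittent map with a neutral fixed point at 0."""
          return x * (1 + (2 * x) ** alpha) if x < 0.5 else 2 * x - 1

      rng = np.random.default_rng(0)
      alphas = (0.2, 0.9)                 # hypothetical parameter pair
      x, orbit = 0.3, []
      for _ in range(100000):
          x = lsv(x, rng.choice(alphas))  # quenched random composition
          orbit.append(x)

      # Long excursions near the neutral fixed point pile up mass near 0;
      # the smaller (faster-mixing) alpha governs the asymptotics.
      hist, _ = np.histogram(orbit, bins=20, range=(0, 1), density=True)
      print(np.round(hist, 2))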

  2. Enhancing local health department disaster response capacity with rapid community needs assessments: validation of a computerized program for binary attribute cluster sampling.

    PubMed

    Groenewold, Matthew R

    2006-01-01

    Local health departments are among the first agencies to respond to disasters or other mass emergencies. However, they often lack the ability to handle large-scale events. Plans including locally developed and deployed tools may enhance local response. Simplified cluster sampling methods can be useful in assessing community needs after a sudden-onset, short duration event. Using an adaptation of the methodology used by the World Health Organization Expanded Programme on Immunization (EPI), a Microsoft Access-based application for two-stage cluster sampling of residential addresses in Louisville/Jefferson County Metro, Kentucky was developed. The sampling frame was derived from geographically referenced data on residential addresses and political districts available through the Louisville/Jefferson County Information Consortium (LOJIC). The program randomly selected 30 clusters, defined as election precincts, from within the area of interest, and then, randomly selected 10 residential addresses from each cluster. The program, called the Rapid Assessment Tools Package (RATP), was tested in terms of accuracy and precision using data on a dichotomous characteristic of residential addresses available from the local tax assessor database. A series of 30 samples were produced and analyzed with respect to their precision and accuracy in estimating the prevalence of the study attribute. Point estimates with 95% confidence intervals were calculated by determining the proportion of the study attribute values in each of the samples and compared with the population proportion. To estimate the design effect, corresponding simple random samples of 300 addresses were taken after each of the 30 cluster samples. The sample proportion fell within +/-10 absolute percentage points of the true proportion in 80% of the samples. In 93.3% of the samples, the point estimate fell within +/-12.5%, and 96.7% fell within +/-15%. All of the point estimates fell within +/-20% of the true proportion. Estimates of the design effect ranged from 0.926 to 1.436 (mean = 1.157, median = 1.170) for the 30 samples. Although prospective evaluation of its performance in field trials or a real emergency is required to confirm its utility, this study suggests that the RATP, a locally designed and deployed tool, may provide population-based estimates of community needs or the extent of event-related consequences that are precise enough to serve as the basis for the initial post-event decisions regarding relief efforts.
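
    A sketch of the EPI-style two-stage draw on a synthetic address frame, with the cluster and sample sizes from the text (30 clusters, 10 addresses each); the probability-proportional-to-size first stage and the attribute prevalences are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n_precincts = 600
      # Per-precinct binary attribute over a synthetic residential frame.
      frame = {p: rng.random(rng.integers(200, 2000)) < rng.random() * 0.4
               for p in range(n_precincts)}

      # Stage 1: 30 clusters, systematic sampling proportional to size.
      sizes = np.array([len(v) for v in frame.values()])
      cum = np.cumsum(sizes)
      step = cum[-1] / 30
      picks = np.searchsorted(cum, rng.random() * step + step * np.arange(30))

      # Stage 2: 10 random addresses per selected precinct.
      sample = np.concatenate([rng.choice(frame[p], 10, replace=False)
                               for p in picks])
      p_hat = sample.mean()
      p_true = np.concatenate(list(frame.values())).mean()
      print("estimate %.3f vs population %.3f" % (p_hat, p_true))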

  3. Generation of kth-order random toposequences

    NASA Astrophysics Data System (ADS)

    Odgers, Nathan P.; McBratney, Alex. B.; Minasny, Budiman

    2008-05-01

    The model presented in this paper derives toposequences from a digital elevation model (DEM). It is written in ArcInfo Macro Language (AML). The toposequences are called kth-order random toposequences, because they take a random path uphill to the top of a hill and downhill to a stream or valley bottom from a randomly selected seed point, and they are located in a streamshed of order k according to a particular stream-ordering system. We define a kth-order streamshed as the area of land that drains directly to a stream segment of stream order k. The model attempts to optimise the spatial configuration of a set of derived toposequences iteratively by using simulated annealing to maximise the total sum of distances between each toposequence hilltop in the set. The user is able to select the order, k, of the derived toposequences. Toposequences are useful for determining soil sampling locations for use in collecting soil data for digital soil mapping applications. Sampling locations can be allocated according to equal elevation or equal-distance intervals along the length of the toposequence, for example. We demonstrate the use of this model for a study area in the Hunter Valley of New South Wales, Australia. Of the 64 toposequences derived, 32 were first-order random toposequences according to Strahler's stream-ordering system, and 32 were second-order random toposequences. The model that we present in this paper is an efficient method for sampling soil along soil toposequences. The soils along a toposequence are related to each other by the topography they are found in, so soil data collected by this method is useful for establishing soil-landscape rules for the preparation of digital soil maps.
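
    A minimal sketch of the random uphill/downhill walk that yields a toposequence from a seed cell, on a toy DEM; the stream-order constraint and the simulated-annealing placement of seeds are not reproduced.

      import numpy as np

      rng = np.random.default_rng(0)
      x, y = np.meshgrid(np.linspace(0, 4, 80), np.linspace(0, 4, 80))
      dem = np.sin(x) * np.cos(y) + 0.05 * rng.random((80, 80))  # toy elevations

      def walk(seed, uphill=True):
          """Random monotone path: repeatedly step to a random strictly
          higher (or lower) 8-neighbour until none exists."""
          path, (i, j) = [seed], seed
          while True:
              nbrs = [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if (di or dj) and 0 <= i + di < 80 and 0 <= j + dj < 80]
              better = [n for n in nbrs
                        if (dem[n] > dem[i, j]) == uphill and dem[n] != dem[i, j]]
              if not better:
                  return path
              i, j = better[rng.integers(len(better))]
              path.append((i, j))

      seed = (rng.integers(80), rng.integers(80))
      topo = walk(seed, uphill=False)[::-1] + walk(seed, uphill=True)[1:]
      print(len(topo), "cells from valley bottom to hilltop")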

  4. A nonparametric significance test for sampled networks.

    PubMed

    Elliott, Andrew; Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Our work is motivated by an interest in constructing a protein-protein interaction network that captures key features associated with Parkinson's disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real world protein-protein interaction network. The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
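
    A sketch of the Monte Carlo null model, assuming NetworkX (which the paper's software also uses); the graph, seed list, degree tolerance, and test statistic below are illustrative stand-ins, not the published procedure's details.

      import random
      import networkx as nx

      G = nx.barabasi_albert_graph(2000, 3, seed=0)   # stand-in PPI network
      seed_list = [0, 1, 2, 5, 8]                     # hypothetical minimal seeds

      def degree_matched(seeds, tol=0.25):
          """Random nodes with degree within +/- tol of each seed's degree."""
          out = []
          for s in seeds:
              d = G.degree(s)
              pool = [n for n in G if abs(G.degree(n) - d) <= tol * d]
              out.append(random.choice(pool))
          return out

      def statistic(seeds):
          # Example statistic: edge count of the seeds-plus-neighbours subnetwork.
          sub = G.subgraph(seeds + [v for s in seeds for v in G[s]])
          return sub.number_of_edges()

      obs = statistic(seed_list)
      null = [statistic(degree_matched(seed_list)) for _ in range(200)]
      p = (1 + sum(n >= obs for n in null)) / (1 + len(null))
      print("observed %d, empirical p = %.3f" % (obs, p))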

  5. A nonparametric significance test for sampled networks

    PubMed Central

    Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Abstract Motivation Our work is motivated by an interest in constructing a protein–protein interaction network that captures key features associated with Parkinson’s disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. Results We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real world protein–protein interaction network. Availability and implementation The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. Contact ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk Supplementary information Supplementary data are available at Bioinformatics online. PMID:29036452

  6. Toward Large-Graph Comparison Measures to Understand Internet Topology Dynamics

    DTIC Science & Technology

    2013-09-01

    ...continuously from randomly selected vantage points in these monitors to destination IP addresses. From each IPv4 /24 prefix on the Internet, a destination is... expected to be more similar. This was verified when the esd and vsd measures applied to this dataset gave a low reading. (An IPv4 address is a 32-bit integer value; /24 is the prefix of the IPv4 network starting at a given address, having 24 bits allocated for the network prefix.)

  7. Antagonistic effect of disulfide-rich peptide aptamers selected by cDNA display on interleukin-6-dependent cell proliferation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nemoto, Naoto, E-mail: nemoto@fms.saitama-u.ac.jp; Innovation Center for Startups, National Institute of Advanced Industrial Science and Technology, 2-2-2 Marunouchi, Chiyoda-ku, Tokyo 100-0005; Janusys Corporation, 508, Saitama Industrial Technology Center, Skip City, 3-12-18 Kami-Aoki, Kawaguchi, Saitama 333-0844

    2012-04-27

    Highlights: • Disulfide-rich peptide aptamer inhibits IL-6-dependent cell proliferation. • Disulfide bond of peptide aptamer is essential for its affinity to IL-6R. • Inhibitory effect of peptide depends on number and pattern of its disulfide bonds. -- Abstract: Several engineered protein scaffolds have been developed recently to circumvent particular disadvantages of antibodies such as their large size and complex composition, low stability, and high production costs. We previously identified peptide aptamers containing one or two disulfide bonds as an alternative ligand to the interleukin-6 receptor (IL-6R). Peptide aptamers (32 amino acids in length) were screened from a random peptide library by in vitro peptide selection using the evolutionary molecular engineering method 'cDNA display'. In this report, the antagonistic activity of the peptide aptamers was examined by an in vitro competition enzyme-linked immunosorbent assay (ELISA) and an IL-6-dependent cell proliferation assay. The results revealed that a disulfide-rich peptide aptamer inhibited IL-6-dependent cell proliferation with similar efficacy to an anti-IL-6R monoclonal antibody.

  8. Ultrafast random-access scanning in two-photon microscopy using acousto-optic deflectors.

    PubMed

    Salomé, R; Kremer, Y; Dieudonné, S; Léger, J-F; Krichevsky, O; Wyart, C; Chatenay, D; Bourdieu, L

    2006-06-30

    Two-photon scanning microscopy (TPSM) is a powerful tool for imaging deep inside living tissues with sub-cellular resolution. The temporal resolution of TPSM is, however, strongly limited by the galvanometric mirrors used to steer the laser beam. Fast physiological events can therefore only be followed by repeatedly scanning a single line within the field of view. Because acousto-optic deflectors (AODs) are non-mechanical devices, they allow access to any point within the field of view on a microsecond time scale and are therefore excellent candidates to improve the temporal resolution of TPSM. However, the use of AOD-based scanners with femtosecond pulses raises several technical difficulties. In this paper, we describe an all-digital TPSM setup based on two crossed AODs. It includes in particular an acousto-optic modulator (AOM) placed at 45 degrees with respect to the AODs to pre-compensate for the large spatial distortions of femtosecond pulses occurring in the AODs, in order to optimize the spatial resolution and the fluorescence excitation. Our setup allows recording from freely selectable points of interest at high speed (1 kHz). By maximizing the time spent on points of interest, random-access TPSM (RA-TPSM) constitutes a promising method for multiunit recordings with millisecond resolution in biological tissues.

  9. Comparison of cutting and pencil-point spinal needle in spinal anesthesia regarding postdural puncture headache

    PubMed Central

    Xu, Hong; Liu, Yang; Song, WenYe; Kan, ShunLi; Liu, FeiFei; Zhang, Di; Ning, GuangZhi; Feng, ShiQing

    2017-01-01

    Abstract Background: Postdural puncture headache (PDPH), mainly resulting from the loss of cerebral spinal fluid (CSF), is a well-known iatrogenic complication of spinal anesthesia and diagnostic lumbar puncture. Spinal needles have been modified to minimize complications. Modifiable risk factors for PDPH mainly include needle size and needle shape. However, whether the incidence of PDPH differs significantly between cutting-point and pencil-point needles has been controversial. We therefore performed a meta-analysis to assess the incidence of PDPH with cutting and pencil-point spinal needles. Methods: We included as eligible studies all randomized trials assessing clinical outcomes in patients given elective spinal anesthesia or diagnostic lumbar puncture with either a cutting or a pencil-point spinal needle. All selected studies and their risk of bias were assessed by 2 investigators. Clinical outcomes including success rates, frequency of PDPH, reported severe PDPH, and the use of epidural blood patch (EBP) were recorded as primary results. Results were evaluated using the risk ratio (RR) with 95% confidence interval (CI) for dichotomous variables. RevMan software (version 5.3) was used to analyze all appropriate data. Results: Twenty-five randomized controlled trials (RCTs) were included in our study. The analysis revealed that pencil-point spinal needles resulted in a lower rate of PDPH (RR 2.50; 95% CI [1.96, 3.19]; P < 0.00001) and severe PDPH (RR 3.27; 95% CI [2.15, 4.96]; P < 0.00001). Furthermore, EBP was used less often in the pencil-point spinal needle group (RR 3.69; 95% CI [1.96, 6.95]; P < 0.0001). Conclusions: Current evidence suggests that pencil-point spinal needles are significantly superior to cutting spinal needles regarding the frequency of PDPH, PDPH severity, and the use of EBP. In view of this, we recommend the use of pencil-point spinal needles in spinal anesthesia and lumbar puncture. PMID:28383416
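
    For the effect measure used throughout this analysis, a worked example of a risk ratio with a 95% confidence interval from two arms' event counts (illustrative numbers, not the meta-analysis data):

      import math

      def risk_ratio(events_a, n_a, events_b, n_b):
          # RR and its log-scale standard error, then exponentiate the CI.
          rr = (events_a / n_a) / (events_b / n_b)
          se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
          lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
          return rr, lo, hi

      print("RR %.2f (95%% CI %.2f-%.2f)" % risk_ratio(30, 300, 12, 300))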

  10. Habitat classification modeling with incomplete data: Pushing the habitat envelope

    USGS Publications Warehouse

    Zarnetske, P.L.; Edwards, T.C.; Moisen, Gretchen G.

    2007-01-01

    Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical techniques for HCMs. One option is to generate pseudo-absence points so that the many available statistical modeling tools can be used. Traditional techniques generate pseudo-absence points at random across broadly defined species ranges, often failing to include biological knowledge concerning the species-habitat relationship. We incorporated biological knowledge of the species-habitat relationship into pseudo-absence points by creating habitat envelopes that constrain the region from which points were randomly selected. We define a habitat envelope as an ecological representation of a species, or species feature's (e.g., nest) observed distribution (i.e., realized niche) based on a single attribute, or the spatial intersection of multiple attributes. We created HCMs for Northern Goshawk (Accipiter gentilis atricapillus) nest habitat during the breeding season across Utah forests with extant nest presence points and ecologically based pseudo-absence points using logistic regression. Predictor variables were derived from 30-m USDA Landfire and 250-m Forest Inventory and Analysis (FIA) map products. These habitat-envelope-based models were then compared to null envelope models which use traditional practices for generating pseudo-absences. Models were assessed for fit and predictive capability using metrics such as kappa, threshold-independent receiver operating characteristic (ROC) plots, adjusted deviance (D²adj), and cross-validation, and were also assessed for ecological relevance. For all cases, habitat-envelope-based models outperformed null envelope models and were more ecologically relevant, suggesting that incorporating biological knowledge into pseudo-absence point generation is a powerful tool for species habitat assessments. Furthermore, given some a priori knowledge of the species-habitat relationship, ecologically based pseudo-absence points can be applied to any species, ecosystem, data resolution, and spatial extent. © 2007 by the Ecological Society of America.
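
    A hedged sketch of envelope-constrained pseudo-absence generation on synthetic attribute rasters; the envelope rule (attribute ranges spanned by presences) and the exclusion buffer are simplifying assumptions, not the published workflow.

      import numpy as np

      rng = np.random.default_rng(0)
      elev = rng.uniform(1000, 3500, size=(200, 200))   # toy attribute rasters
      canopy = rng.uniform(0, 100, size=(200, 200))
      pres = np.column_stack([rng.integers(0, 200, 40),
                              rng.integers(0, 200, 40)])  # nest presence cells

      # Habitat envelope: attribute ranges spanned by the presence points.
      pe = elev[pres[:, 0], pres[:, 1]]
      pc = canopy[pres[:, 0], pres[:, 1]]
      env = ((elev >= pe.min()) & (elev <= pe.max()) &
             (canopy >= pc.min()) & (canopy <= pc.max()))

      # Pseudo-absences: random cells inside the envelope, not adjacent to a
      # presence cell (Chebyshev distance > 2).
      cells = np.argwhere(env)
      far = ~(np.abs(cells[:, None] - pres[None]).max(axis=2) <= 2).any(axis=1)
      pseudo = cells[far][rng.choice(far.sum(), size=40, replace=False)]
      print(len(cells), "envelope cells ->", len(pseudo), "pseudo-absences")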

  11. Designs for testing group-based interventions with limited numbers of social units: The dynamic wait-listed and regression point displacement designs

    PubMed Central

    Wyman, Peter A.; Brown, C. Hendricks

    2015-01-01

    The dynamic wait-listed design (DWLD) and regression point displacement design (RPDD) address several challenges in evaluating group-based interventions when there is a limited number of groups. Both DWLD and RPDD utilize efficiencies that increase statistical power and can enhance balance between community needs and research priorities. The DWLD blocks on more time units than traditional wait-listed designs, thereby increasing the proportion of a study period during which intervention and control conditions can be compared, and can also improve logistics of implementing intervention across multiple sites and strengthen fidelity. We discuss DWLDs in the larger context of roll-out randomized designs and compare it with its cousin the Stepped Wedge design. The RPDD uses archival data on the population of settings from which intervention unit(s) are selected to create expected posttest scores for units receiving intervention, to which actual posttest scores are compared. High pretest-posttest correlations give the RPDD statistical power for assessing intervention impact even when one or a few settings receive intervention. RPDD works best when archival data are available over a number of years prior to and following intervention. If intervention units were not randomly selected, propensity scores can be used to control for nonrandom selection factors. Examples are provided of the DWLD and RPDD used to evaluate, respectively, suicide prevention training (QPR) in 32 schools and a violence prevention program (CeaseFire) in 2 Chicago police districts over a 10-year period. How DWLD and RPDD address common threats to internal and external validity, as well as their limitations, are discussed. PMID:25481512
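
    A minimal sketch of the RPDD comparison, assuming NumPy: regress posttest on pretest over the untreated units and standardize the treated unit's displacement from that line. The simple residual-based scaling is a crude stand-in for the design's full inference; all data are synthetic.

      import numpy as np

      rng = np.random.default_rng(0)
      pre = rng.normal(50, 10, 40)                 # archival pretest, 40 districts
      post = 0.9 * pre + rng.normal(0, 2, 40)      # strongly correlated posttest
      treated_pre, treated_post = 55.0, 42.0       # hypothetical treated district

      b, a = np.polyfit(pre, post, 1)              # control regression line
      resid = post - (a + b * pre)
      displacement = treated_post - (a + b * treated_pre)
      t = displacement / resid.std(ddof=2)         # crude standardized displacement
      print("displacement %.2f, t ~ %.2f" % (displacement, t))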

  12. A Test for Anchoring and Yea-Saying in Experimental Consumption Data.

    PubMed

    van Soest, Arthur; Hurd, Michael

    2008-01-01

    We analyze experimental survey data, with a random split into respondents who get an open-ended question on the amount of total family consumption (with follow-up unfolding brackets of the form "Is consumption $X or more?" for those who answer "don't know" or "refuse") and respondents who are immediately directed to unfolding brackets. In both cases, the entry point of the unfolding bracket sequence is randomized. Allowing for any type of selection into answering the open-ended or bracket questions, a nonparametric test is developed for errors in the answers to the first bracket question that are different from the usual reporting errors that will also affect open-ended answers. Two types of errors are considered explicitly: anchoring and yea-saying. Data are collected in the 1995 wave of the Assets and Health Dynamics survey, which is representative of the population in the United States that is 70 years and older. We reject the joint hypothesis of no anchoring and no yea-saying. Once yea-saying is taken into account, we find no evidence of anchoring at the entry point.

  13. A Test for Anchoring and Yea-Saying in Experimental Consumption Data

    PubMed Central

    van Soest, Arthur; Hurd, Michael

    2017-01-01

    We analyze experimental survey data, with a random split into respondents who get an open-ended question on the amount of total family consumption (with follow-up unfolding brackets of the form “Is consumption $X or more?” for those who answer “don’t know” or “refuse”) and respondents who are immediately directed to unfolding brackets. In both cases, the entry point of the unfolding bracket sequence is randomized. Allowing for any type of selection into answering the open-ended or bracket questions, a nonparametric test is developed for errors in the answers to the first bracket question that are different from the usual reporting errors that will also affect open-ended answers. Two types of errors are considered explicitly: anchoring and yea-saying. Data are collected in the 1995 wave of the Assets and Health Dynamics survey, which is representative of the population in the United States that is 70 years and older. We reject the joint hypothesis of no anchoring and no yea-saying. Once yea-saying is taken into account, we find no evidence of anchoring at the entry point. PMID:29056797

  14. Efficacy and safety of pimecrolimus cream 1% in mild-to-moderate chronic hand dermatitis: a randomized, double-blind trial.

    PubMed

    Hordinsky, Maria; Fleischer, Alan; Rivers, Jason K; Poulin, Yves; Belsito, Donald; Hultsch, Thomas

    2010-08-01

    Chronic hand dermatitis is common and difficult to treat. Our aim was to assess the efficacy of pimecrolimus cream 1% in mild-to-moderate chronic hand dermatitis. Adult patients (n = 652) were randomized to pimecrolimus 1% or vehicle cream twice daily with overnight occlusion for 6 weeks, followed by a 6-week open-label pimecrolimus treatment. The primary efficacy measure was a 5-point Investigators' Global Assessment (IGA) of a prospectively selected 'target hand', with treatment success defined as an IGA score of 0 or 1. Pruritus relief was also assessed. Following the double-blind treatment phase, target-hand treatment success was achieved in 29.8% and 23.2% of patients in the pimecrolimus and vehicle groups, respectively (p = 0.057). The proportion of patients experiencing pruritus relief was significantly higher in the pimecrolimus group than in the vehicle group at all time points throughout the double-blind phase. The groups were thus comparable with respect to improvement in disease signs, whereas pruritus relief was significantly greater in the pimecrolimus group. Copyright 2010 S. Karger AG, Basel.

  15. Blessing of dimensionality: mathematical foundations of the statistical physics of data.

    PubMed

    Gorban, A N; Tyukin, I Y

    2018-04-28

    The concentration-of-measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of Fisher's linear discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
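
    The separability claim can be checked numerically in a few lines. This editorial sketch (not from the paper) uses points on a high-dimensional sphere and the inner-product functional to which a Fisher-type discriminant reduces for isotropic data:

      import numpy as np

      rng = np.random.default_rng(1)
      d, n = 200, 10_000          # dimension, size of the random set

      # i.i.d. Gaussian points normalized to the unit sphere, where the
      # measure concentrates in high dimension.
      X = rng.standard_normal((n, d))
      X /= np.linalg.norm(X, axis=1, keepdims=True)
      x = rng.standard_normal(d)
      x /= np.linalg.norm(x)

      # For isotropic data a Fisher-type linear functional separating x from
      # the set reduces to the inner product with x: x is separated if
      # <x, y> < theta for every y in the set, while <x, x> = 1 > theta.
      margins = X @ x
      theta = 0.5
      print(f"largest overlap <x,y> = {margins.max():.3f}; "
            f"separated at theta={theta}: {bool(margins.max() < theta)}")

    With d = 200 the largest inner product is typically around 0.3, so a single random point is linearly separable from ten thousand others with large margin.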

  16. Effect of LED photobiomodulation on analgesia during labor: Study protocol for a randomized clinical trial.

    PubMed

    Traverzim, Maria Aparecida Dos Santos; Makabe, Sergio; Silva, Daniela Fátima Teixeira; Pavani, Christiane; Bussadori, Sandra Kalil; Fernandes, Kristianne Santos Porta; Motta, Lara Jansiski

    2018-06-01

    Labor pain is one of the most intense pains experienced by women, which leads to an increase in the number of women opting to undergo a cesarean delivery. Pharmacological and nonpharmacological analgesia methods are used to control labor pain. Epidural analgesia is the most commonly used pharmacological analgesia method. However, it may have side effects on the fetus and the mother. Light-emitting diode (LED) photobiomodulation is an effective and noninvasive alternative to pharmacological methods. To evaluate the effects of LED photobiomodulation on analgesia during labor. In total, 60 women in labor admitted to a public maternity hospital will be selected for a randomized controlled trial. The participants will be randomized into 2 groups: intervention group [analgesia with LED therapy (n = 30)] and control group [analgesia with bath therapy (n = 30)]. The perception of pain will be assessed using the visual analogue scale (VAS), with a score from 0 to 10 at baseline, that is, before the intervention. In both the groups, the procedures will last 10 minutes and will be performed at 3 time points during labor: during cervical dilation of 4 to 5 cm, 6 to 7 cm, and 8 to 9 cm. At all 3 time points, pain perception will be evaluated using VAS shortly after the intervention. In addition, the evaluation of membrane characteristics (intact or damaged), heart rate, uterine dynamics, and cardiotocography will be performed at all time points. It is hypothesized that LED photobiomodulation will have an analgesic effect superior to that of bath therapy.

  17. Blessing of dimensionality: mathematical foundations of the statistical physics of data

    NASA Astrophysics Data System (ADS)

    Gorban, A. N.; Tyukin, I. Y.

    2018-04-01

    The concentration-of-measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of Fisher's linear discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue `Hilbert's sixth problem'.

  18. A systematic study of acupuncture practice: acupoint usage in an outpatient setting in Beijing, China.

    PubMed

    Napadow, Vitaly; Liu, Jing; Kaptchuk, Ted J

    2004-12-01

    Acupuncture textbooks mention a wide assortment of indications for each acupuncture point and, conversely, each disease or indication can be treated by a wide assortment of acupoints. However, little systematic information exists on how acupuncture is actually used in practice: i.e. which points are actually selected and for which conditions. This study prospectively gathered data on acupuncture point usage in two hospital-based acupuncture clinics in Beijing, China. Of the more than 150 unique acupoints, the 30 most commonly used points represented 68% of the total number of acupoints needled at the first clinic, and 63% of points needled at the second clinic. While acupuncturists use a similar set of most prevalent points, such as LI-4 (used in >65% of treatments at both clinic sites), this core of points only partially overlaps. These results support the hypothesis that while the most commonly used points are similar from one acupuncturist to another, each practitioner tends to favor certain acupoints, either as core points or to round out the point prescription. In addition, the results of this study are consistent with the recent development of "manualized" protocols in randomized controlled trials of acupuncture, in which a fixed set of acupoints is augmented depending on individualized signs and symptoms (TCM patterns).

  19. Response rate differences between web and alternative data collection methods for public health research: a systematic review of the literature.

    PubMed

    Blumenberg, Cauane; Barros, Aluísio J D

    2018-07-01

    To systematically review the literature and compare response rates (RRs) of web surveys to alternative data collection methods in the context of epidemiologic and public health studies. We reviewed the literature using the PubMed, LILACS, SciELO, WebSM, and Google Scholar databases. We selected epidemiologic and public health studies that considered the general population and used two parallel data collection methods, one of which was web-based. RR differences were analyzed using the two-sample test of proportions and pooled using random effects. We investigated agreement using Bland-Altman analysis, and correlation using Pearson's coefficient. We selected 19 studies (nine randomized trials). The RR of web-based data collection was 12.9 percentage points (p.p.) lower (95% CI = -19.0, -6.8) than that of the alternative methods, and 15.7 p.p. lower (95% CI = -24.2, -7.3) considering only randomized trials. Monetary incentives did not reduce the RR differences. A strong positive correlation (r = 0.83) between the RRs was observed. Web-based data collection presents lower RRs than alternative methods. However, this should not be interpreted as meta-analytic evidence, owing to the high heterogeneity of the studies.
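
    For illustration only (the rates and sample sizes below are invented, not the review's data), the two analysis steps named above, per-study RR differences and random-effects pooling, can be sketched with the DerSimonian-Laird estimator:

      import numpy as np

      # Per-study response rates (web vs. alternative) and sample sizes.
      p_web = np.array([0.35, 0.28, 0.41, 0.22])
      p_alt = np.array([0.52, 0.39, 0.50, 0.38])
      n_web = np.array([400, 250, 600, 300])
      n_alt = np.array([420, 240, 580, 310])

      d = p_web - p_alt                                     # RR difference per study
      v = p_web * (1 - p_web) / n_web + p_alt * (1 - p_alt) / n_alt

      # DerSimonian-Laird random-effects pooling.
      w = 1 / v
      d_fixed = np.sum(w * d) / w.sum()
      Q = np.sum(w * (d - d_fixed) ** 2)
      k = len(d)
      tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
      w_re = 1 / (v + tau2)
      pooled = np.sum(w_re * d) / w_re.sum()
      se = np.sqrt(1 / w_re.sum())
      print(f"pooled RR difference = {pooled:.3f} (95% CI half-width {1.96 * se:.3f})")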

  20. Construction and identification of a D-Vine model applied to the probability distribution of modal parameters in structural dynamics

    NASA Astrophysics Data System (ADS)

    Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.

    2018-01-01

    This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameter probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of considered modes in our context. To this end, a mode selection preprocessing step is proposed; it allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared: the first is based on the context of the study, whereas the second is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.
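
    As a generic illustration of a purely statistical ordering strategy for the first D-Vine tree (not the authors' identification procedure), one can chain variables greedily by the strength of empirical Kendall's tau between pairs:

      import numpy as np
      from scipy.stats import kendalltau

      rng = np.random.default_rng(2)
      # Simulated modal parameters (e.g., natural frequencies), one column each.
      samples = rng.multivariate_normal(
          [0, 0, 0, 0],
          [[1.0, 0.6, 0.2, 0.1],
           [0.6, 1.0, 0.5, 0.2],
           [0.2, 0.5, 1.0, 0.4],
           [0.1, 0.2, 0.4, 1.0]], size=500)

      d = samples.shape[1]
      taus = np.zeros((d, d))
      for i in range(d):
          for j in range(i + 1, d):
              taus[i, j] = taus[j, i] = abs(
                  kendalltau(samples[:, i], samples[:, j])[0])

      # Greedy chain from the strongest pair; a full implementation would
      # extend both ends of the path rather than one.
      order = [int(np.unravel_index(taus.argmax(), taus.shape)[0])]
      while len(order) < d:
          rest = [k for k in range(d) if k not in order]
          order.append(max(rest, key=lambda k: taus[order[-1], k]))
      print("first-tree ordering:", order)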

  1. A randomized, Phase IIb study investigating oliceridine (TRV130), a novel µ-receptor G-protein pathway selective (μ-GPS) modulator, for the management of moderate to severe acute pain following abdominoplasty.

    PubMed

    Singla, Neil; Minkowitz, Harold S; Soergel, David G; Burt, David A; Subach, Ruth Ann; Salamea, Monica Y; Fossler, Michael J; Skobieranda, Franck

    2017-01-01

    Oliceridine (TRV130), a novel μ-receptor G-protein pathway selective (μ-GPS) modulator, was designed to improve the therapeutic window of conventional opioids by activating G-protein signaling while causing low β-arrestin recruitment to the μ receptor. This randomized, double-blind, patient-controlled analgesia Phase IIb study was conducted to investigate the efficacy, safety, and tolerability of oliceridine compared with morphine and placebo in patients with moderate to severe pain following abdominoplasty (NCT02335294; oliceridine is an investigational agent not yet approved by the US Food and Drug Administration). Patients were randomized to receive postoperative regimens of intravenous oliceridine (loading/patient-controlled demand doses [mg/mg]: 1.5/0.10 [regimen A]; 1.5/0.35 [regimen B]), morphine (4.0/1.0), or placebo with treatment initiated within 4 hours of surgery and continued as needed for 24 hours. Two hundred patients were treated (n=39, n=39, n=83, and n=39 in the oliceridine regimen A, oliceridine regimen B, morphine, and placebo groups, respectively). Patients were predominantly female (n=198 [99%]) and had a mean age of 38.2 years, weight of 71.2 kg, and baseline pain score of 7.7 (on an 11-point numeric pain rating scale). Patients receiving the oliceridine regimens had reductions in average pain scores (model-based change in time-weighted average versus placebo over 24 hours) of 2.3 and 2.1 points, respectively (P = 0.0001 and P = 0.0005 versus placebo); patients receiving morphine had a similar reduction (2.1 points; P < 0.0001 versus placebo). A lower prevalence of adverse events (AEs) related to nausea, vomiting, and respiratory function was observed with the oliceridine regimens than with morphine (P < 0.05). Other AEs with oliceridine were generally dose-related and similar in nature to those observed with conventional opioids; no serious AEs were reported with oliceridine. These results suggest that oliceridine may provide effective, rapid analgesia in patients with moderate to severe postoperative pain, with an acceptable safety/tolerability profile and potentially wider therapeutic window than morphine.

  2. Comparison of the effects of firocoxib, carprofen and vedaprofen in a sodium urate crystal induced synovitis model of arthritis in dogs.

    PubMed

    Hazewinkel, Herman A W; van den Brom, Walter E; Theyse, Lars F H; Pollmeier, Matthias; Hanson, Peter D

    2008-02-01

    A randomized, placebo-controlled, four-period cross-over laboratory study involving eight dogs was conducted to confirm the effective analgesic dose of firocoxib, a selective COX-2 inhibitor, in a synovitis model of arthritis. Firocoxib was compared to vedaprofen and carprofen, and the effect, defined as a change in weight bearing measured via peak ground reaction, was evaluated at treatment dose levels. A lameness score on a five point scale was also assigned to the affected limb. Peak vertical ground reaction force was considered to be the most relevant measurement in this study. The firocoxib treatment group performed significantly better than placebo at the 3 h post-treatment time point and significantly better than placebo and carprofen at the 7 h post-treatment time point. Improvement in lameness score was also significantly better in the dogs treated with firocoxib than placebo and carprofen at both the 3 and 7 h post-treatment time points.

  3. Feature extraction and descriptor calculation methods for automatic georeferencing of Philippines' first microsatellite imagery

    NASA Astrophysics Data System (ADS)

    Tupas, M. E. A.; Dasallas, J. A.; Jiao, B. J. D.; Magallon, B. J. P.; Sempio, J. N. H.; Ramos, M. K. F.; Aranas, R. K. D.; Tamondong, A. M.

    2017-10-01

    The FAST-SIFT corner detector and descriptor extractor combination was used to automatically georeference DIWATA-1 Spaceborne Multispectral Imager (SMI) images. The Features from Accelerated Segment Test (FAST) algorithm detects corners or keypoints in an image, and these robustly detected keypoints have well-defined positions. Descriptors were computed using the Scale-Invariant Feature Transform (SIFT) extractor. The FAST-SIFT method effectively matched SMI same-subscene images detected by the NIR sensor, and was also tested in stitching NIR images with varying subscenes swept by the camera. The slave images were matched to the master image, and the keypoints served as the ground control points. Keypoints are matched based on their descriptor vectors: nearest-neighbor matching is employed based on a metric distance between the descriptors, such as the Euclidean or city-block distance. Rough matching outputs not only correct matches but also faulty ones. A previous work in automatic georeferencing incorporated a geometric restriction; in this work, we applied a simplified version of that method. Random sample consensus (RANSAC) was used to eliminate fall-out matches and ensure the accuracy of the feature points from which the transformation parameters were derived: it identifies whether a point fits the transformation function and returns the inlier matches. The transformation matrix was solved using affine, projective, and polynomial models. The accuracy of the automatic georeferencing method was determined by calculating the RMSE of randomly selected interest points between the master image and the transformed slave image.
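
    A minimal sketch of this matching pipeline with OpenCV (file names are placeholders, and a projective homography stands in for the affine/projective/polynomial models compared in the paper):

      import cv2
      import numpy as np

      # Placeholder file names; any pair of overlapping grayscale scenes works.
      master = cv2.imread("master_nir.png", cv2.IMREAD_GRAYSCALE)
      slave = cv2.imread("slave_nir.png", cv2.IMREAD_GRAYSCALE)

      # FAST keypoints, SIFT descriptors (the FAST-SIFT combination).
      fast = cv2.FastFeatureDetector_create(threshold=25)
      sift = cv2.SIFT_create()
      kp1, des1 = sift.compute(master, fast.detect(master, None))
      kp2, des2 = sift.compute(slave, fast.detect(slave, None))

      # Nearest-neighbor matching on Euclidean descriptor distance + ratio test.
      matcher = cv2.BFMatcher(cv2.NORM_L2)
      matches = matcher.knnMatch(des1, des2, k=2)
      good = [m for m, n in matches if m.distance < 0.75 * n.distance]

      # RANSAC rejects fall-out matches while estimating the transformation.
      src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
      dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
      H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
      print(f"{int(inliers.sum())} inlier matches of {len(good)}")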

  4. Microbial contamination of meat during the skinning of beef carcass hindquarters at three slaughtering plants.

    PubMed

    Gill, C O; McGinnis, J C; Bryant, J

    1998-07-21

    The microbiological effects on the product of the series of operations used to skin the hindquarters of beef carcasses were assessed at three packing plants. Samples were obtained at each plant from randomly selected carcasses, by swabbing specified sites related to opening cuts, rump skinning or flank skinning operations, randomly selected sites along the lines of the opening cuts, or randomly selected sites on the skinned hindquarters of carcasses. A set of 25 samples of each type was collected at each plant, with the collection of a single sample from each selected carcass. Aerobic counts, coliforms and Escherichia coli were enumerated in each sample, and a log mean value was estimated for each set of 25 counts on the assumption of a log normal distribution of the counts. The data indicated that the hindquarters skinning operations at plant A were hygienically inferior to those at the other two plants, with mean numbers of coliforms and E. coli being about two orders of magnitude greater, and aerobic counts being an order of magnitude greater, on the skinned hindquarters of carcasses from plant A than on those from plants B or C. The data further indicated that the operation for cutting open the skin at plant C was hygienically superior to the equivalent operation at plant B, but that the operations for skinning the rump and flank at plant B were hygienically superior to the equivalent operations at plant C. The findings suggest that objective assessment of the microbiological effects of beef carcass dressing processes on carcasses will be required to ensure that Hazard Analysis Critical Control Point (HACCP) and quality management systems are operated to control the microbiological condition of carcasses.

  5. Does sex induce a phase transition?

    NASA Astrophysics Data System (ADS)

    de Oliveira, P. M. C.; Moss de Oliveira, S.; Stauffer, D.; Cebrat, S.; Pękalski, A.

    2008-05-01

    We discovered a dynamic phase transition induced by sexual reproduction. The dynamics is a pure Darwinian rule applied to diploid bit-strings with both fundamental ingredients to drive Darwin's evolution: (1) random mutations and crossings which act in the sense of increasing the entropy (or diversity); and (2) selection which acts in the opposite sense by limiting the entropy explosion. Selection wins this competition if mutations performed at birth are few enough, and thus the wild genotype dominates the steady-state population. By slowly increasing the average number m of mutations, however, the population suddenly undergoes a mutational degradation precisely at a transition point mc. Above this point, the “bad” alleles (represented by 1-bits) spread over the genetic pool of the population, overcoming the selection pressure. Individuals become selectively alike, and evolution stops. Only below this point, m < mc, is evolutionary life possible. The finite-size-scaling behaviour of this transition is exhibited for large enough “chromosome” lengths L, through lengthy computer simulations. One important and surprising observation is the L-independence of the transition curves for large L. They are also independent of the population size. Another is that mc is near unity, i.e. life cannot be stable with much more than one mutation per diploid genome, independent of the chromosome length, in agreement with reality. One possible consequence is that an eventual evolutionary jump towards larger L enabling the storage of more genetic information would demand an improved DNA copying machinery in order to keep the same total number of mutations per offspring.

  6. Capture-SELEX: Selection of DNA Aptamers for Aminoglycoside Antibiotics

    PubMed Central

    2012-01-01

    Small organic molecules are challenging targets for aptamer selection using the SELEX technology (SELEX: Systematic Evolution of Ligands by EXponential enrichment). Often they are not suitable for immobilization on solid surfaces, which is a common procedure in known aptamer selection methods. The Capture-SELEX procedure allows the selection of DNA aptamers for solute targets. A special SELEX library was constructed with the aim of immobilizing this library on magnetic beads or other surfaces. For this purpose a docking sequence was incorporated into the random region of the library, enabling hybridization to a complementary oligo fixed on magnetic beads. Oligonucleotides of the library which exhibit high affinity to the target and a secondary structure fitting the target are released from the beads for binding to the target during the aptamer selection process. The oligonucleotides of these binding complexes were amplified, purified, and immobilized via the docking sequence to the magnetic beads as the starting point of the following selection round. Based on this Capture-SELEX procedure, the successful DNA aptamer selection for the aminoglycoside antibiotic kanamycin A as a small-molecule target is described. PMID:23326761

  7. Stepwise magnetic-geochemical approach for efficient assessment of heavy metal polluted sites

    NASA Astrophysics Data System (ADS)

    Appel, E.; Rösler, W.; Ojha, G.

    2012-04-01

    Previous studies have shown that magnetometry can outline the distribution of fly ash deposition in the surroundings of coal-burning power plants and steel industries. The easy-to-measure magnetic susceptibility (MS), in particular, can act as a proxy for heavy metal (HM) pollution from such point sources. Here we present a demonstration project around the coal-burning power plant complex "Schwarze Pumpe" in eastern Germany. Before the reunification of West and East Germany, huge amounts of HM pollutants were emitted from the "Schwarze Pumpe" into the environment, both as fly ash emissions and as dumped clinker. The project was conducted as part of the TASK Centre of Competence, which aims to bring innovative techniques closer to the market. Our project combines in situ and laboratory MS measurements and HM analyses in order to demonstrate the efficiency of a stepwise approach for site assessment of HM pollution around point sources of fly-ash emission and deposition into soil. The following scenario is played through: We assume that the "true" spatial distribution of HM pollution (given by the pollution load index PLI comprising Fe, Zn, Pb, and Cu) is represented by our entire set of 85 measured samples (XRF analyses) from forest sites around the "Schwarze Pumpe". Surface MS data (collected with a Bartington MS2D) and in situ vertical MS sections (logged by an SM400 instrument) are used to obtain a qualitative overview of potentially higher and lower polluted areas. A suite of spatial HM distribution maps obtained by random selections of 30 out of the 85 analysed sites is compared to the HM map obtained from a targeted 30-site selection based on pre-information from the MS results. The PLI distribution map obtained from the targeted 30-site selection shows all essential details of the "true" pollution map, while the random 30-site selections miss important features. This comparison shows that, for the same cost investment, a stepwise combined magnetic-geochemical site assessment leads to a clearly more significant characterization of soil pollution than a conventional approach with exclusively random sampling for geochemical analysis, or alternatively to a result of equal quality at lower cost.

  8. Entanglement entropy at infinite-randomness fixed points in higher dimensions.

    PubMed

    Lin, Yu-Cheng; Iglói, Ferenc; Rieger, Heiko

    2007-10-05

    The entanglement entropy of the two-dimensional random transverse Ising model is studied with a numerical implementation of the strong-disorder renormalization group. The asymptotic behavior of the entropy per surface area diverges at, and only at, the quantum phase transition that is governed by an infinite-randomness fixed point. Here we identify a double-logarithmic multiplicative correction to the area law for the entanglement entropy. This contrasts with the pure area law valid at the infinite-randomness fixed point in the diluted transverse Ising model in higher dimensions.

  9. Time-lapse culture with morphokinetic embryo selection improves pregnancy and live birth chances and reduces early pregnancy loss: a meta-analysis.

    PubMed

    Pribenszky, Csaba; Nilselid, Anna-Maria; Montag, Markus

    2017-11-01

    Embryo evaluation and selection is fundamental in clinical IVF. Time-lapse follow-up of embryo development comprises undisturbed culture and the application of the visual information to support embryo evaluation. A meta-analysis of randomized controlled trials was carried out to study whether time-lapse monitoring with the prospective use of a morphokinetic algorithm for selection of embryos improves overall clinical outcome (pregnancy, early pregnancy loss, stillbirth and live birth rate) compared with embryo selection based on single time-point morphology in IVF cycles. The meta-analysis of five randomized controlled trials (n = 1637) showed that the application of time-lapse monitoring was associated with a significantly higher ongoing clinical pregnancy rate (51.0% versus 39.9%), with a pooled odds ratio of 1.542 (P < 0.001), significantly lower early pregnancy loss (15.3% versus 21.3%; OR: 0.662; P = 0.019) and a significantly increased live birth rate (44.2% versus 31.3%; OR 1.668; P = 0.009). Difference in stillbirth was not significant between groups (4.7% versus 2.4%). Quality of the evidence was moderate to low owing to inconsistencies across the studies. Selective application and variability were also limitations. Although time-lapse is shown to significantly improve overall clinical outcome, further high-quality evidence is needed before universal conclusions can be drawn. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  10. Circulating intact and cleaved forms of the urokinase-type plasminogen activator receptor: biological variation, reference intervals and clinical useful cut-points.

    PubMed

    Thurison, Tine; Christensen, Ib J; Lund, Ida K; Nielsen, Hans J; Høyer-Hansen, Gunilla

    2015-01-15

    High levels of circulating forms of the urokinase-type plasminogen activator receptor (uPAR) are significantly associated with poor prognosis in cancer patients. Our aim was to determine the biological variations and reference intervals of the uPAR forms in blood and, in addition, to test the clinical relevance of using these as cut-points in colorectal cancer (CRC) prognosis. uPAR forms were measured in citrated and EDTA plasma samples using time-resolved fluorescence immunoassays. Diurnal, intra- and inter-individual variations were assessed in plasma samples from cohorts of healthy individuals. Reference intervals were determined in plasma from healthy individuals randomly selected from a Danish multi-center cross-sectional study. A cohort of CRC patients was selected from the same cross-sectional study. The reference intervals showed a slight increase with age, and women had ~20% higher levels. The intra- and inter-individual variations were ~10% and ~20-30%, respectively, and the measured levels of the uPAR forms were within the determined 95% reference intervals. No diurnal variation was found. Applying the upper normal limit of the reference intervals as a cut-point for dichotomizing CRC patients revealed significantly decreased overall survival for patients with levels of any uPAR form above the cut-point. The reference intervals for the different uPAR forms are valid, and the upper normal limits are clinically relevant cut-points for CRC prognosis. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. 3D reconstruction of laser projective point with projection invariant generated from five points on 2D target.

    PubMed

    Xu, Guan; Yuan, Jing; Li, Xiaotao; Su, Jian

    2017-08-01

    Vision measurement based on structured light plays a significant role in optical inspection research. A 2D target fixed with a line laser projector is designed to realize the transformations among the world coordinate system, the camera coordinate system and the image coordinate system. The laser projective point and five non-collinear points that are randomly selected from the target are adopted to construct a projection invariant. The closed-form solutions of the 3D laser points are solved from the homogeneous linear equations generated from the projection invariants. The optimization function is constructed from the parameterized re-projection errors of the laser points and the target points in the image coordinate system. The nonlinear optimization solutions for the world coordinates of the projection points, the camera parameters and the lens distortion coefficients are then obtained by minimizing the optimization function. The accuracy of the 3D reconstruction is evaluated by comparing the displacements of the reconstructed laser points with the actual displacements. The effects of image quantity, lens distortion and noise are investigated in the experiments, which demonstrate that the reconstruction approach provides accurate measurements for the measurement system.
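
    A generic sketch of the final refinement step, minimizing a parameterized re-projection error with nonlinear least squares (here only a camera pose is refined against known target points; the paper's optimization also refines the laser points' world coordinates and the lens distortion coefficients):

      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      K = np.array([[800.0, 0.0, 320.0],
                    [0.0, 800.0, 240.0],
                    [0.0, 0.0, 1.0]])          # assumed-known intrinsics

      def project(pts_world, rvec, tvec):
          """Pinhole projection of Nx3 world points under pose (rvec, tvec)."""
          cam = pts_world @ Rotation.from_rotvec(rvec).as_matrix().T + tvec
          uvw = cam @ K.T
          return uvw[:, :2] / uvw[:, 2:3]

      rng = np.random.default_rng(0)
      target = rng.uniform([-1, -1, 0], [1, 1, 0.2], size=(20, 3))
      rvec_true = np.array([0.1, -0.2, 0.05])
      tvec_true = np.array([0.0, 0.0, 5.0])
      observed = project(target, rvec_true, tvec_true) + rng.normal(0, 0.3, (20, 2))

      def residuals(x):
          # Re-projection error, the quantity the optimization minimizes.
          return (project(target, x[:3], x[3:]) - observed).ravel()

      sol = least_squares(residuals, x0=np.array([0, 0, 0, 0, 0, 4.0]))
      rmse = np.sqrt(np.mean(sol.fun ** 2))
      print("recovered pose:", np.round(sol.x, 3), f"RMSE = {rmse:.2f} px")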

  12. Different treatment modalities of fusiform basilar trunk aneurysm: study on computational hemodynamics.

    PubMed

    Wu, Chen; Xu, Bai-Nan; Sun, Zheng-Hui; Wang, Fu-Yu; Liu, Lei; Zhang, Xiao-Jun; Zhou, Ding-Biao

    2012-01-01

    Unclippable fusiform basilar trunk aneurysms are a formidable condition for surgical treatment. The aim of this study was to establish a computational model and to investigate the hemodynamic characteristics of a fusiform basilar trunk aneurysm. The three-dimensional digital model of a fusiform basilar trunk aneurysm was constructed using the MIMICS, ANSYS and CFX software. Different hemodynamic modalities and boundary conditions were assigned to the model. Thirty points were selected randomly on the wall and within the aneurysm. Wall total pressure (WTP), wall shear stress (WSS), and blood flow velocity at each point were calculated, and hemodynamic status was compared between the different modalities. The quantitative average values of the 30 points on the wall and within the aneurysm were obtained by computational calculation, point by point. The velocity and WSS in modalities A and B were different from those of the remaining 5 modalities, and the WTP in modalities A, E and F was higher than in the remaining 4 modalities. The digital model of a fusiform basilar artery aneurysm is feasible and reliable, and could provide important information for clinical treatment options.

  13. Effect of Electroacupuncture at The Zusanli Point (Stomach-36) on Dorsal Random Pattern Skin Flap Survival in a Rat Model.

    PubMed

    Wang, Li-Ren; Cai, Le-Yi; Lin, Ding-Sheng; Cao, Bin; Li, Zhi-Jie

    2017-10-01

    Random skin flaps are commonly used for wound repair and reconstruction. Electroacupuncture at The Zusanli point could enhance microcirculation and blood perfusion in random skin flaps. To determine whether electroacupuncture at The Zusanli point can improve the survival of random skin flaps in a rat model. Thirty-six male Sprague Dawley rats were randomly divided into 3 groups: control group (no electroacupuncture), Group A (electroacupuncture at a nonacupoint near The Zusanli point), and Group B (electroacupuncture at The Zusanli point). McFarlane flaps were established. On postoperative Day 2, malondialdehyde (MDA) and superoxide dismutase were detected. The flap survival rate was evaluated, inflammation was examined in hematoxylin and eosin-stained slices, and the expression of vascular endothelial growth factor (VEGF) was measured immunohistochemically on Day 7. The mean survival area of the flaps in Group B was significantly larger than that in the control group and Group A. Superoxide dismutase activity and VEGF expression level were significantly higher in Group B than those in the control group and Group A, whereas MDA and inflammation levels in Group B were significantly lower than those in the other 2 groups. Electroacupuncture at The Zusanli point can effectively improve the random flap survival.

  14. A hybrid CS-SA intelligent approach to solve uncertain dynamic facility layout problems considering dependency of demands

    NASA Astrophysics Data System (ADS)

    Moslemipour, Ghorbanali

    2018-07-01

    This paper proposes a quadratic assignment-based mathematical model to deal with the stochastic dynamic facility layout problem. In this problem, product demands are assumed to be dependent, normally distributed random variables with known probability density function and covariance that change from period to period at random. To solve the proposed model, a novel hybrid intelligent algorithm is developed by combining the simulated annealing and clonal selection algorithms. The proposed model and the hybrid algorithm are verified and validated using design-of-experiment and benchmark methods. The results show that the hybrid algorithm performs well in terms of both solution quality and computational time. In addition, the proposed model can be used in both stochastic and deterministic situations.
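
    A bare-bones sketch of the simulated annealing half of such a hybrid, applied to a small random quadratic assignment instance (the clonal selection component and the demand-dependence modeling are omitted here):

      import math
      import random

      random.seed(7)
      n = 8                                   # departments / locations
      flow = [[random.randint(0, 9) for _ in range(n)] for _ in range(n)]
      dist = [[abs(i - j) for j in range(n)] for i in range(n)]

      def cost(perm):
          # Quadratic assignment objective: sum of flow * distance over pairs.
          return sum(flow[i][j] * dist[perm[i]][perm[j]]
                     for i in range(n) for j in range(n))

      perm = list(range(n))
      best, best_cost, T = perm[:], cost(perm), 100.0
      while T > 0.01:
          i, j = random.sample(range(n), 2)
          cand = perm[:]
          cand[i], cand[j] = cand[j], cand[i]          # swap two facilities
          delta = cost(cand) - cost(perm)
          if delta < 0 or random.random() < math.exp(-delta / T):
              perm = cand
              if cost(perm) < best_cost:
                  best, best_cost = perm[:], cost(perm)
          T *= 0.995                                   # geometric cooling
      print(best, best_cost)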

  15. A Comparison of the Hot Spot and the Average Cancer Cell Counting Methods and the Optimal Cutoff Point of the Ki-67 Index for Luminal Type Breast Cancer.

    PubMed

    Arima, Nobuyuki; Nishimura, Reiki; Osako, Tomofumi; Nishiyama, Yasuyuki; Fujisue, Mamiko; Okumura, Yasuhiro; Nakano, Masahiro; Tashima, Rumiko; Toyozumi, Yasuo

    2016-01-01

    In this case-control study, we investigated the most suitable cell counting area and the optimal cutoff point of the Ki-67 index. Thirty recurrent cases were selected among hormone receptor (HR)-positive/HER2-negative breast cancer patients. As controls, 90 nonrecurrent cases were randomly selected by allotting 3 controls to each recurrent case based on the following criteria: age, nodal status, tumor size, and adjuvant endocrine therapy alone. Both the hot spot and the average area of the tumor were evaluated on a Ki-67 immunostaining slide. The median Ki-67 index value at the hot spot and average area were 25.0 and 14.5%, respectively. Irrespective of the area counted, the Ki-67 index value was significantly higher in all of the recurrent cases (p < 0.0001). The multivariate analysis revealed that the Ki-67 index value of 20% at the hot spot was the most suitable cutoff point for predicting recurrence. Moreover, a higher ΔKi-67 index value (the difference between the hot spot and the average area, ≥10%) and lower progesterone receptor expression (<20%) were significantly correlated with recurrence. A higher Ki-67 index value at the hot spot strongly correlated with recurrence, and the optimal cutoff point was found to be 20%. © 2015 S. Karger AG, Basel.

  16. Random Walks in a One-Dimensional Lévy Random Environment

    NASA Astrophysics Data System (ADS)

    Bianchi, Alessandra; Cristadoro, Giampaolo; Lenci, Marco; Ligabò, Marilena

    2016-04-01

    We consider a generalization of a one-dimensional stochastic process known in the physical literature as Lévy-Lorentz gas. The process describes the motion of a particle on the real line in the presence of a random array of marked points, whose nearest-neighbor distances are i.i.d. and long-tailed (with finite mean but possibly infinite variance). The motion is a continuous-time, constant-speed interpolation of a symmetric random walk on the marked points. We first study the quenched random walk on the point process, proving the CLT and the convergence of all the accordingly rescaled moments. Then we derive the quenched and annealed CLTs for the continuous-time process.

  17. [Plaque segmentation of intracoronary optical coherence tomography images based on K-means and improved random walk algorithm].

    PubMed

    Wang, Guanglei; Wang, Pengyu; Han, Yechen; Liu, Xiuling; Li, Yan; Lu, Qian

    2017-06-01

    In recent years, optical coherence tomography (OCT) has developed into a popular coronary imaging technology worldwide. The segmentation of plaque regions in coronary OCT images has great significance for vulnerable plaque recognition and research. In this paper, a new algorithm based on K-means clustering and an improved random walk is proposed, and semi-automated segmentation of calcified plaque, fibrotic plaque and lipid pools was achieved. The weight function of the random walk is improved: the distance between pixel edges in the image and the seed points is added to the definition of the weight function, which increases weak edge weights and prevents over-segmentation. Based on the above methods, OCT images of 9 coronary atherosclerotic patients were selected for plaque segmentation. Comparison with physicians' manual segmentation results showed that the method has good robustness and accuracy. It is hoped that this method can be helpful for the clinical diagnosis of coronary heart disease.
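
    An illustrative form of such a seed-distance-damped edge weight (the paper's exact formula may differ; beta and sigma are free parameters chosen here for demonstration):

      import numpy as np

      def edge_weight(g_i, g_j, d_to_seed, beta=90.0, sigma=10.0):
          """Random-walk edge weight: the usual intensity-difference term
          damped by the pixel's distance to the nearest seed point, which
          strengthens weak edges far from seeds and discourages
          over-segmentation (illustrative form only)."""
          intensity_term = np.exp(-beta * (g_i - g_j) ** 2)
          distance_term = np.exp(-d_to_seed / sigma)
          return intensity_term * distance_term

      print(edge_weight(0.40, 0.42, d_to_seed=5.0))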

  18. Congruence analysis of point clouds from unstable stereo image sequences

    NASA Astrophysics Data System (ADS)

    Jepping, C.; Bethmann, F.; Luhmann, T.

    2014-06-01

    This paper deals with the correction of exterior orientation parameters of stereo image sequences over deformed free-form surfaces without control points. Such an imaging situation can occur, for example, during photogrammetric car crash test recordings where onboard high-speed stereo cameras are used to measure 3D surfaces. As a result of such measurements, 3D point clouds of deformed surfaces are generated for a complete stereo sequence. The first objective of this research focuses on the development and investigation of methods for the detection of corresponding spatial and temporal tie points within the stereo image sequences (by stereo image matching and 3D point tracking) that are robust enough for a reliable handling of occlusions and other disturbances that may occur. The second objective is the analysis of object deformations in order to detect stable areas (congruence analysis). For this purpose a RANSAC-based method for congruence analysis has been developed. This process is based on the sequential transformation of randomly selected point groups from one epoch to another using a 3D similarity transformation. The paper gives a detailed description of the congruence analysis. The approach has been tested successfully on synthetic and real image data.
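
    An editorial sketch of the congruence analysis idea: a closed-form 3D similarity transform (Umeyama/Horn) inside a RANSAC loop that labels points consistent with a single transform between epochs as stable, leaving the rest as deformation candidates. Sizes, tolerances, and the demo data are invented.

      import numpy as np

      def similarity_transform(A, B):
          """Least-squares 3D similarity (scale, rotation, translation)
          mapping point set A onto B (Umeyama closed form)."""
          muA, muB = A.mean(0), B.mean(0)
          Ac, Bc = A - muA, B - muB
          U, S, Vt = np.linalg.svd(Bc.T @ Ac / len(A))
          D = np.eye(3)
          if np.linalg.det(U @ Vt) < 0:
              D[2, 2] = -1
          R = U @ D @ Vt
          s = np.trace(np.diag(S) @ D) / Ac.var(0).sum()
          t = muB - s * R @ muA
          return s, R, t

      def ransac_congruence(A, B, n_iter=500, tol=0.01, rng=None):
          """Largest subset of points consistent with one similarity
          transform between epochs A and B."""
          rng = rng or np.random.default_rng(0)
          best = np.zeros(len(A), dtype=bool)
          for _ in range(n_iter):
              idx = rng.choice(len(A), size=4, replace=False)  # point group
              s, R, t = similarity_transform(A[idx], B[idx])
              resid = np.linalg.norm(B - (s * (R @ A.T).T + t), axis=1)
              inliers = resid < tol
              if inliers.sum() > best.sum():
                  best = inliers
          return best

      # Demo: B is a transformed copy of A; the first 20 points "deform".
      A = np.random.default_rng(1).uniform(-1, 1, (100, 3))
      theta = 0.3
      R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                         [np.sin(theta),  np.cos(theta), 0],
                         [0, 0, 1]])
      B = 1.2 * (A @ R_true.T) + np.array([0.5, -0.2, 0.1])
      B[:20] += 0.3
      stable = ransac_congruence(A, B, tol=0.02)
      print(f"{stable.sum()} points identified as stable")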

  19. Effects on readiness to change of an educational intervention on depressive disorders for general physicians in primary care based on a modified Prochaska model--a randomized controlled study.

    PubMed

    Shirazi, M; Zeinaloo, A A; Parikh, S V; Sadeghi, M; Taghva, A; Arbabi, M; Kashani, A Sabouri; Alaeddini, F; Lonka, K; Wahlström, R

    2008-04-01

    The Prochaska model of readiness to change has been proposed for use in educational interventions to improve medical care. To evaluate the impact on readiness to change of an educational intervention on the management of depressive disorders, based on a modified version of the Prochaska model, in comparison with a standard programme of continuing medical education (CME). This is a randomized controlled trial within primary care practices in southern Tehran, Iran. The participants were 192 general physicians working in primary care (GPs), recruited after random selection and randomized to intervention (n = 96) and control (n = 96). The intervention consisted of interactive, learner-centred educational methods in large and small group settings depending on the GPs' stages of readiness to change. Change in stage of readiness to change, measured by the modified version of the Prochaska questionnaire, was the main outcome measure. The final number of participants was 78 (81%) in the intervention arm and 81 (84%) in the control arm. Significantly more GPs (57/96 = 59% versus 12/96 = 12%; P < 0.01) in the intervention group changed to higher stages of readiness to change. The intervention effect was 46 percentage points (P < 0.001) and 50 percentage points (P < 0.001) in the large and small group settings, respectively. Educational formats that suit different stages of learning can support primary care doctors in reaching higher stages of behavioural change in the topic of depressive disorders. Our findings have practical implications for conducting CME programmes in Iran and are possibly also applicable in other parts of the world.

  20. Evaluation of wet cupping therapy on the arterial and venous blood parameters in healthy Arabian horses

    PubMed Central

    Shawaf, Turke; El-Deeb, Wael; Hussen, Jamal; Hendi, Mahmoud; Al-Bulushi, Shahab

    2018-01-01

    Aim: Recently, complementary therapies such as cupping and acupuncture have been used in veterinary medicine. This research was carried out to determine the effects of wet cupping therapy (Hijama) on the hematological and biochemical parameters of healthy Arabian horses for the first time. Materials and Methods: In this study, seven clinically healthy Arabian horses were randomly selected. Four points on the animal's body were selected for the cupping therapy: two points at the back just behind the scapula on the left and right sides, and another two points located on the rump. Narrow-mouthed cups of 4 oz (125 ml) were used. A manual pump (sucking cups) was used to create the negative pressure within the cups during cupping. Arterial and venous blood parameters and serum cortisol concentration were measured before cupping and 3 days and 2, 4, and 8 weeks after cupping. Results: No significant differences were found in most hematological and biochemical parameters after cupping. A significant decrease in the concentration of serum cortisol was observed at 3 and 14 days after cupping. Conclusions: Cupping induced minor changes in the hematological and biochemical parameters of Arabian horses. This is the first trial on the effects of wet cupping therapy on these parameters in Arabian horses and should be useful for further investigations of the role of complementary therapies in horses; our further studies will include different disease models.

  1. Branching random walk with step size coming from a power law

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ayan; Subhra Hazra, Rajat; Roy, Parthanil

    2015-09-01

    In their seminal work, Brunet and Derrida made predictions on the random point configurations associated with branching random walks. We shall discuss the limiting behavior of such point configurations when the displacement random variables come from a power law. In particular, we establish that two of their predictions remain valid in this setup, and we investigate various other issues mentioned in their paper.

  2. Vitamin-mineral intake and intelligence: a macrolevel analysis of randomized controlled trials.

    PubMed

    Schoenthaler, S J; Bier, I D

    1999-04-01

    Two independent groups suspected that poor diets in school children might impair intelligence. Because dietary changes produce psychological effects, both groups conducted randomized trials in which children were challenged with placebo or vitamin-mineral tablets. Both reported significantly greater gains in intelligence among the actives. The findings were important because of the apparent inadequacy of diet they revealed, and the magnitude of the potential for increased intelligence. However, 5 of 11 replications were not significant, leaving the issue in doubt. To determine if school children who receive low-dose vitamin-mineral tablets produce significantly higher IQ scores than children who receive placebo. A macrolevel analysis of the 13 known randomized, double-blind trials was undertaken. A total of 15 public schools in Arizona, California, Missouri, Oklahoma, Belgium, England, Scotland, and Wales participated, with 1477 school children, aged 6 to 17 years, and 276 young adult males, aged 18 to 25 years, in 2 American correctional facilities. All studies used 1 of 3 standardized tests of nonverbal intelligence: the Wechsler Intelligence Scale for Children-Revised, the Wechsler Adult Intelligence Scale, or the Calvert Non-verbal test. The actives in each study performed better, on average, than placebo in nonverbal IQ, regardless of formula, location, age, race, gender, or research team composition. The probability of 13 randomly selected experimental groups always performing better than 13 randomly selected independent control groups is one-half to the 13th power (p = 0.000122). The mean difference across all studies is 3.2 IQ points. Furthermore, the standard deviation in the variable "IQ change" was also consistently larger in each active group when compared to its controls. This confirms that a few children in each study, presumably the poorly nourished minority, were producing large differences, rather than a 3.2 point gain in all active children. There are important health risks when school children's dietary habits depart substantially from government guidelines; poor dietary habits may lead to impaired intelligence. Low-dose vitamin-mineral supplementation may restore the cognitive abilities of these children by raising low blood nutrient concentrations. However, there is also evidence that supplementation has no measurable effect on the intelligence of well-nourished children with normal blood nutrient concentrations.

  3. The Effect of Random Error on Diagnostic Accuracy Illustrated with the Anthropometric Diagnosis of Malnutrition

    PubMed Central

    2016-01-01

    Background: It is often thought that random measurement error has a minor effect upon the results of an epidemiological survey. Theoretically, errors of measurement should always increase the spread of a distribution. Defining an illness by having a measurement outside an established healthy range will lead to an inflated prevalence of that condition if there are measurement errors. Methods and results: A Monte Carlo simulation was conducted of anthropometric assessment of children with malnutrition. Random errors of increasing magnitude were imposed upon the populations and showed that there was an increase in the standard deviation with each of the errors that became exponentially greater with the magnitude of the error. The potential magnitude of the resulting error in reported malnutrition prevalence was compared with published international data and found to be sufficient to make a number of surveys, and the numerous reports and analyses that used these data, unreliable. Conclusions: The effect of random error in public health surveys and the data upon which diagnostic cut-off points are derived to define “health” has been underestimated. Even quite modest random errors can more than double the reported prevalence of conditions such as malnutrition. Increasing sample size does not address this problem, and may even result in less accurate estimates. More attention needs to be paid to the selection, calibration and maintenance of instruments, measurer selection, training and supervision, routine estimation of the likely magnitude of errors using standardization tests, use of statistical likelihood of error to exclude data from analysis, and full reporting of these procedures in order to judge the reliability of survey reports. PMID:28030627
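
    The mechanism is easy to reproduce. In this editorial sketch (invented error magnitudes, not the paper's simulation), zero-mean error added to standard-normal z-scores inflates the apparent prevalence of "z < -2" well above the true 2.3%:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      # True weight-for-height z-scores of a healthy population; malnutrition
      # is defined as z < -2, so the true prevalence is about 2.3%.
      true_z = rng.standard_normal(n)
      print(f"true prevalence: {np.mean(true_z < -2):.3%}")

      # Adding random measurement error widens the distribution and inflates
      # the apparent prevalence, even though the error has zero mean.
      for err_sd in (0.2, 0.4, 0.8):
          observed = true_z + rng.normal(0, err_sd, n)
          print(f"error SD {err_sd}: observed prevalence "
                f"{np.mean(observed < -2):.3%}")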

  4. Evaluation of a novel antiplatelet agent for secondary prevention in patients with a history of atherosclerotic disease: design and rationale for the Thrombin-Receptor Antagonist in Secondary Prevention of Atherothrombotic Ischemic Events (TRA 2 degrees P)-TIMI 50 trial.

    PubMed

    Morrow, David A; Scirica, Benjamin M; Fox, Keith A A; Berman, Gail; Strony, John; Veltri, Enrico; Bonaca, Marc P; Fish, Polly; McCabe, Carolyn H; Braunwald, Eugene

    2009-09-01

    Thrombin potently activates platelets via interaction with the protease-activated receptor 1. SCH 530348 is a novel antiplatelet agent that selectively inhibits the cellular actions of thrombin via antagonism of the protease-activated receptor 1. Because SCH 530348 does not interfere with other pathways for hemostasis, it is possible that SCH 530348 reduces thrombosis with less increase in bleeding than do other potent antiplatelet agents. TRA 2 degrees P-TIMI 50 is a phase III, randomized, double-blind, placebo-controlled, multinational clinical trial designed to evaluate the efficacy and safety of SCH 530348 during long-term treatment of patients with established atherosclerotic disease receiving standard therapy (up to 27,000 patients). Eligible patients with a history of myocardial infarction, ischemic stroke, or peripheral arterial disease are randomized 1:1 to SCH 530348 2.5 mg daily or matched placebo until the end of study. Randomization is stratified by the qualifying disease and planned use of a thienopyridine. The primary end point is the composite of cardiovascular death, myocardial infarction, stroke, or urgent coronary revascularization. The major secondary end point is the composite of cardiovascular death, myocardial infarction, or stroke. The evaluation of long-term safety includes bleeding defined by the GUSTO and TIMI criteria. Recruitment began in September 2007. The trial will continue until 2,279 primary end points and 1,400 secondary end points are recorded, with expected completion in 36 to 44 months from first enrollment. TRA 2 degrees P-TIMI 50 is evaluating whether a new approach to platelet inhibition via interruption of thrombin-mediated platelet activation reduces major cardiovascular events with a favorable safety profile in patients with established atherosclerosis.

  5. Reduced duration of dual antiplatelet therapy using an improved drug-eluting stent for percutaneous coronary intervention of the left main artery in a real-world, all-comer population: Rationale and study design of the prospective randomized multicenter IDEAL-LM trial.

    PubMed

    Lemmert, Miguel E; Oldroyd, Keith; Barragan, Paul; Lesiak, Maciej; Byrne, Robert A; Merkulov, Evgeny; Daemen, Joost; Onuma, Yoshinobu; Witberg, Karen; van Geuns, Robert-Jan

    2017-05-01

    Continuous improvements in stent technology make percutaneous coronary intervention (PCI) a potential alternative to surgery in selected patients with unprotected left main coronary artery (uLMCA) disease. The optimal duration of dual antiplatelet therapy (DAPT) in these patients remains undetermined, and in addition, new stent designs using a bioabsorbable polymer might allow shorter duration of DAPT. IDEAL-LM is a prospective, randomized, multicenter study that will enroll 818 patients undergoing uLMCA PCI. Patients will be randomized in a 1:1 fashion to intravascular ultrasound-guided PCI with the novel everolimus-eluting platinum-chromium Synergy stent with a biodegradable polymer (Boston Scientific, Natick, MA) followed by 4 months of DAPT or the everolimus-eluting cobalt-chromium Xience stent (Abbott Vascular, Santa Clara, CA) followed by 12 months of DAPT. The total follow-up period will be 5 years. A subset of 100 patients will undergo optical coherence tomography at 3 months. The primary end point will be major adverse cardiovascular events (composite of all-cause mortality, myocardial infarction, and ischemia-driven target vessel revascularization) at 2 years. Secondary end points will consist of the individual components of the primary end point, procedural success, a device-oriented composite end point, stent thrombosis as per Academic Research Consortium criteria, and bleeding as per Bleeding Academic Research Consortium criteria. IDEAL-LM is designed to assess the safety and efficacy of the novel Synergy stent followed by 4 months of DAPT vs the Xience stent followed by 12 months of DAPT in patients undergoing uLMCA PCI. The study will provide novel insights regarding optimal treatment strategy for patients undergoing PCI of uLMCA disease (www.clinicaltrials.gov, NCT 02303717). Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  6. LQTA-QSAR: a new 4D-QSAR methodology.

    PubMed

    Martins, João Paulo A; Barbosa, Euzébio G; Pasqualoto, Kerly F M; Ferreira, Márcia M C

    2009-06-01

    A novel 4D-QSAR approach which makes use of the molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package is presented in this study. This new methodology, named LQTA-QSAR (LQTA, Laboratório de Quimiometria Teórica e Aplicada), has a module (LQTAgrid) that calculates intermolecular interaction energies at each grid point considering probes and all aligned conformations resulting from MD simulations. These interaction energies are the independent variables or descriptors employed in a QSAR analysis. The comparison of the proposed methodology to other 4D-QSAR and CoMFA formalisms was performed using a set of forty-seven glycogen phosphorylase b inhibitors (data set 1) and a set of forty-four MAP p38 kinase inhibitors (data set 2). The QSAR models for both data sets were built using the ordered predictor selection (OPS) algorithm for variable selection. Model validation was carried out applying y-randomization and leave-N-out cross-validation in addition to the external validation. PLS models for data sets 1 and 2 provided the following statistics: q² = 0.72, r² = 0.81 for 12 variables selected and 2 latent variables, and q² = 0.82, r² = 0.90 for 10 variables selected and 5 latent variables, respectively. Visualization of the descriptors in 3D space was successfully interpreted from the chemical point of view, supporting the applicability of this new approach in rational drug design.
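
    A compact sketch of the y-randomization check on a synthetic descriptor matrix (scikit-learn's PLS stands in for the PLS engine; the data and dimensions are invented, loosely mirroring data set 1):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)
      X = rng.standard_normal((47, 12))     # 47 compounds x 12 descriptors
      y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.3, 47)  # synthetic activity

      def q2(X, y, n_comp=2):
          """Cross-validated q2 for a PLS model with n_comp latent variables."""
          pred = cross_val_predict(PLSRegression(n_components=n_comp), X, y, cv=7)
          pred = np.asarray(pred).ravel()
          return 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

      print(f"q2 (real y):     {q2(X, y):.2f}")
      # y-randomization: shuffle the response; a sound model's q2 collapses.
      scores = [q2(X, rng.permutation(y)) for _ in range(20)]
      print(f"q2 (shuffled y): mean {np.mean(scores):.2f}")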

  7. Validation of the Edinburgh Postnatal Depression Scale (EPDS) for screening of major depressive episode among adults from the general population.

    PubMed

    Matijasevich, Alicia; Munhoz, Tiago N; Tavares, Beatriz Franck; Barbosa, Ana Paula Pereira Neto; da Silva, Diego Mello; Abitante, Morgana Sonza; Dall'Agnol, Tatiane Abreu; Santos, Iná S

    2014-10-08

    Standardized questionnaires designed for the identification of depression are useful for monitoring individual as well as population mental health. The Edinburgh Postnatal Depression Scale (EPDS) was originally developed to assist primary care health professionals in detecting postnatal depression, but several authors recommend its use outside of the postpartum period. In Brazil, the use of the EPDS for screening depression outside the postpartum period and among non-selected populations has not been validated. The present study aimed to assess the validity of the EPDS as a screening instrument for major depressive episode (MDE) among adults from the general population. This is a validation study that used a population-based sampling technique to select the participants. The study was conducted in the city of Pelotas, Brazil. Households were randomly selected by two-stage cluster sampling with probability proportional to size. The EPDS was administered to 447 adults (≥20 years). Approximately 17 days later, participants were reinterviewed by psychiatrists and psychologists using a structured diagnostic interview (Mini International Neuropsychiatric Interview, MINI). We calculated the sensitivity and specificity of each cutoff point of the EPDS, and values were plotted as a receiver operator characteristic curve. The best cutoff point for screening depression was ≥8, with 80.0% (64.4 - 90.9%) sensitivity and 87.0% (83.3 - 90.1%) specificity. Among women, the best cutoff point was also ≥8, with sensitivity and specificity of 84.4% (67.2 - 94.7%) and 81.3% (75.5 - 86.1%), respectively. Among men, the best cutoff point was ≥7 (75% sensitivity and 89% specificity). The EPDS was shown to be suitable for screening MDE among adults in the community.
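
    The cutoff selection described in this record, trading sensitivity against specificity along the receiver operating characteristic curve, is conventionally implemented by maximizing Youden's J statistic. A minimal sketch on simulated stand-in scores (the prevalence and score distributions are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins: EPDS-like scores (0-30) and MINI-based MDE diagnoses.
cases = rng.normal(12, 4, 60).round().clip(0, 30)       # cases tend to score higher
controls = rng.normal(5, 3, 400).round().clip(0, 30)
scores = np.concatenate([cases, controls])
truth = np.concatenate([np.ones(60, bool), np.zeros(400, bool)])

best = (-1.0, None, None, None)
for cut in range(int(scores.max()) + 2):
    pred = scores >= cut                                # screen-positive at this cutoff
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    j = sens + spec - 1                                 # Youden's J statistic
    if j > best[0]:
        best = (j, cut, sens, spec)

print(f"best cutoff >= {best[1]}: sensitivity {best[2]:.2f}, specificity {best[3]:.2f}")
```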

  8. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment not having replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2^4 experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.

  9. Forty cases of gastrointestinal neurosis treated by acupuncture.

    PubMed

    Zhao, Yaping; Ding, Min; Wang, Yanjun

    2008-03-01

    To compare the therapeutic effect of acupuncture for gastrointestinal neurosis with that of oral remedy. Eighty cases were randomly divided into the following 2 groups. In the treatment group, acupuncture was given for one month at the main points of Zhongwan (CV 12), Zusanli (ST 36), Taichong (LR 3) and Shenmen (HT 7), with the auxiliary points selected according to TCM differentiation. In the control group, Domperidone was orally administered for one month. The total effective rate was 92.5% in the treatment group and 75.0% in the control group, with a significant difference between the 2 groups (χ² = 4.423, P < 0.05). Acupuncture was superior to the oral remedy in therapeutic effects. Acupuncture may produce better results for gastrointestinal neurosis, with fewer toxic side effects.

  10. Nest placement of the giant Amazon river turtle, Podocnemis expansa, in the Araguaia River, Goiás State, Brazil.

    PubMed

    Ferreira, Paulo Dias Júnior; Castro, Paulo de Tarso Amorim

    2005-05-01

    The giant Amazon river turtle (Podocnemis expansa) nests on extensive sand bars on the margins and interior of the channel during the dry season. The high concentration of nests in specific points of certain beaches indicates that the selection of nest placement is not random but is related to some geological aspects, such as bar margin inclination and presence of a high, sandy platform. The presence of access channels to high platform points or of ramp morphology is a decisive factor in the choice of nesting areas. The eroded and escarped margins of the beaches hinder the Amazon river turtles from reaching the most suitable places for nesting. Through the years, changes in beach morphology can alter nest distribution.

  11. Random regression analysis for body weights and main morphological traits in genetically improved farmed tilapia (Oreochromis niloticus).

    PubMed

    He, Jie; Zhao, Yunfeng; Zhao, Jingli; Gao, Jin; Xu, Pao; Yang, Runqing

    2018-02-01

    To genetically analyse growth traits in genetically improved farmed tilapia (GIFT), the body weight (BWE) and main morphological traits, including body length (BL), body depth (BD), body width (BWI), head length (HL) and length of the caudal peduncle (CPL), were measured six times during the growth period in 1451 fish from 45 mixed families of full and half sibs. A random regression model (RRM) was used to model genetic changes of the growth traits with days of age and estimate the heritability for any growth point and genetic correlations between pairwise growth points. Using the covariance function based on optimal RRMs, the heritabilities were estimated to be from 0.102 to 0.662 for BWE, 0.157 to 0.591 for BL, 0.047 to 0.621 for BD, 0.018 to 0.577 for BWI, 0.075 to 0.597 for HL and 0.032 to 0.610 for CPL between 60 and 140 days of age. All genetic correlations exceeded 0.5 between pairwise growth points. Moreover, the traits at initial days of age showed less correlation with those at later days of age. For the repeatedly observed phenotypes, model comparison showed that the optimal RRMs predicted breeding values at a specific growth time more precisely than repeatability models or multiple-trait animal models, which enhanced the efficiency of selection for BWE and the main morphological traits.

  12. Psychosocial education improves low back pain beliefs: results from a cluster randomized clinical trial (NCT00373009) in a primary prevention setting.

    PubMed

    George, Steven Z; Teyhen, Deydre S; Wu, Samuel S; Wright, Alison C; Dugan, Jessica L; Yang, Guijun; Robinson, Michael E; Childs, John D

    2009-07-01

    The general population has a pessimistic view of low back pain (LBP), and evidence-based information has been used to positively influence LBP beliefs in previously reported mass media studies. However, there is a lack of randomized trials investigating whether LBP beliefs can be modified in primary prevention settings. This cluster randomized clinical trial investigated the effect of an evidence-based psychosocial educational program (PSEP) on LBP beliefs for soldiers completing military training. A military setting was selected for this clinical trial, because LBP is a common cause of soldier disability. Companies of soldiers (n = 3,792) were recruited, and cluster randomized to receive a PSEP or no education (control group, CG). The PSEP consisted of an interactive seminar, and soldiers were issued the Back Book for reference material. The primary outcome measure was the back beliefs questionnaire (BBQ), which assesses inevitable consequences of and ability to cope with LBP. The BBQ was administered before randomization and 12 weeks later. A linear mixed model was fitted for the BBQ at the 12-week follow-up, and a generalized linear mixed model was fitted for the dichotomous outcomes on BBQ change of greater than two points. Sensitivity analyses were performed to account for drop out. BBQ scores (potential range: 9-45) improved significantly from baseline of 25.6 +/- 5.7 (mean +/- SD) to 26.9 +/- 6.2 for those receiving the PSEP, while there was a significant decline from 26.1 +/- 5.7 to 25.6 +/- 6.0 for those in the CG. The adjusted mean BBQ score at follow-up for those receiving the PSEP was 1.49 points higher than those in the CG (P < 0.0001). The adjusted odds ratio of BBQ improvement of greater than two points for those receiving the PSEP was 1.51 (95% CI = 1.22-1.86) times that of those in the CG. BBQ improvement was also mildly associated with race and college education. Sensitivity analyses suggested minimal influence of drop out. In conclusion, soldiers that received the PSEP had an improvement in their beliefs related to the inevitable consequences of and ability to cope with LBP. This is the first randomized trial to show positive influence on LBP beliefs in a primary prevention setting, and these findings have potentially important public health implications for prevention of LBP.

  13. Insensitivity of the octahedral spherical hohlraum to power imbalance, pointing accuracy, and assemblage accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huo, Wen Yi; Zhao, Yiqing; Zheng, Wudi

    2014-11-15

    The random radiation asymmetry in the octahedral spherical hohlraum [K. Lan et al., Phys. Plasmas 21, 010704 (2014)] arising from the power imbalance, pointing accuracy of laser quads, and the assemblage accuracy of capsule is investigated by using the 3-dimensional view factor model. From our study, for the spherical hohlraum, the random radiation asymmetry arising from the power imbalance of the laser quads is about half of that in the cylindrical hohlraum; the random asymmetry arising from the pointing error is about one order of magnitude lower than that in the cylindrical hohlraum; and the random asymmetry arising from the assemblage error of capsule is about one third of that in the cylindrical hohlraum. Moreover, the random radiation asymmetry in the spherical hohlraum is also less than the amount in the elliptical hohlraum. The results indicate that the spherical hohlraum is less sensitive to random variations than the cylindrical hohlraum and the elliptical hohlraum. Hence, the spherical hohlraum can relax the requirements on the power imbalance and pointing accuracy of the laser facility and the assemblage accuracy of the capsule.

  14. A nonparametric method to generate synthetic populations to adjust for complex sampling design features.

    PubMed

    Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E

    2014-06-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex sample data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs.
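
    As a rough illustration of the finite population Bayesian bootstrap idea this work extends, the sketch below generates one synthetic population from a weighted sample via a Pólya urn. It is a simplified single-stage version under assumed integer-summing weights, not the authors' full method:

```python
import numpy as np

def fpbb_synthetic_population(ids, weights, rng):
    """One synthetic population via a weighted Polya urn.

    ids     : indices of the n sampled units
    weights : design weights (each >= 1), assumed to sum to the population size N
    """
    n, N = len(ids), int(round(weights.sum()))
    counts = weights - 1.0              # pseudo-counts seeded by the weights
    draws = []
    for _ in range(N - n):              # fill out the unobserved part of the population
        i = rng.choice(n, p=counts / counts.sum())
        counts[i] += 1.0                # urn reinforcement: drawn unit becomes more likely
        draws.append(ids[i])
    return np.concatenate([ids, np.array(draws, dtype=ids.dtype)])

rng = np.random.default_rng(1)
ids = np.arange(5)
w = np.array([4.0, 2.0, 6.0, 3.0, 5.0])     # weights summing to N = 20
pop = fpbb_synthetic_population(ids, w, rng)
print(np.bincount(pop, minlength=5))        # synthetic population counts per unit
```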

  15. A nonparametric method to generate synthetic populations to adjust for complex sampling design features

    PubMed Central

    Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex sample data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs. PMID:29200608

  16. The tree balance signature of mass extinction is erased by continued evolution in clades of constrained size with trait-dependent speciation

    PubMed Central

    Yang, Guan-Dong; Agapow, Paul-Michael

    2017-01-01

    The kind and duration of phylogenetic topological “signatures” left in the wake of macroevolutionary events remain poorly understood. To this end, we examined a broad range of simulated phylogenies generated using trait-biased, heritable speciation probabilities and mass extinction that could be either random or selective on trait value, but also using background extinction and diversity-dependence to constrain clade sizes. In keeping with prior results, random mass extinction increased imbalance of clades that recovered to pre-extinction size, but was a relatively weak effect. Mass extinction that was selective on trait values tended to produce clades of similar or greater balance compared to random extinction or controls. Allowing evolution to continue past the point of clade-size recovery resulted in erosion and eventual erasure of this signal, with all treatments converging on similar values of imbalance, except for very intense extinction regimes targeted at taxa with high speciation rates. Return to a more balanced state with extended post-extinction evolution was also associated with loss of the previous phylogenetic root in most treatments. These results further demonstrate that while a mass extinction event can produce a recognizable phylogenetic signal, its effects become increasingly obscured the further an evolving clade gets from that event, with any sharp imbalance due to unrelated evolutionary factors. PMID:28644846

  17. COMPARISON OF RANDOM AND SYSTEMATIC SITE SELECTION FOR ASSESSING ATTAINMENT OF AQUATIC LIFE USES IN SEGMENTS OF THE OHIO RIVER

    EPA Science Inventory

    This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...

  18. Standardized Uptake Decrease on [18F]-Fluorodeoxyglucose Positron Emission Tomography After Neoadjuvant Chemotherapy Is a Prognostic Classifier for Long-Term Outcome After Multimodality Treatment: Secondary Analysis of a Randomized Trial for Resectable Stage IIIA/B Non-Small-Cell Lung Cancer.

    PubMed

    Pöttgen, Christoph; Gauler, Thomas; Bellendorf, Alexander; Guberina, Maja; Bockisch, Andreas; Schwenzer, Nina; Heinzelmann, Frank; Cordes, Sebastian; Schuler, Martin H; Welter, Stefan; Stamatis, Georgios; Friedel, Godehard; Darwiche, Kaid; Jöckel, Karl-Heinz; Eberhardt, Wilfried; Stuschke, Martin

    2016-07-20

    A confirmatory analysis was performed to determine the prognostic value of metabolic response during induction chemotherapy followed by bimodality/trimodality treatment of patients with operable locally advanced non-small-cell lung cancer. Patients with potentially operable stage IIIA(N2) or selected stage IIIB non-small-cell lung cancer received three cycles of cisplatin/paclitaxel (induction chemotherapy) followed by neoadjuvant radiochemotherapy (RCT) to 45 Gy (1.5 Gy twice per day concurrent cisplatin/vinorelbine) within the ESPATUE (Phase III Study of Surgery Versus Definitive Concurrent Chemoradiotherapy Boost in Patients With Resectable Stage IIIA[N2] and Selected IIIB Non-Small-Cell Lung Cancer After Induction Chemotherapy and Concurrent Chemoradiotherapy) trial. Positron emission tomography scans were recommended before (t0) and after (t2) induction chemotherapy. Patients who were eligible for surgery after neoadjuvant RCT were randomly assigned to definitive RCT or surgery. The prognostic value of the percentage of maximum standardized uptake value (%SUVmax) remaining in the primary tumor after induction chemotherapy, %SUVremaining = SUVmax(t2)/SUVmax(t0), was assessed by proportional hazard analysis and receiver operating characteristic analysis. Overall, 161 patients were randomly assigned (155 from the Essen and Tübingen centers), and 124 of these received positron emission tomography scans at t0 and t2. %SUVremaining as a continuous variable was prognostic for the three end points of overall survival, progression-free survival, and freedom from extracerebral progression in univariable and multivariable analysis (P < .016). The respective hazard ratios per 50% increase in %SUVremaining from multivariable analysis were 2.3 (95% CI, 1.5 to 3.4; P < .001), 1.8 (95% CI, 1.3 to 2.5; P < .001), and 1.8 (95% CI, 1.2 to 2.7; P = .006) for the three end points. %SUVremaining dichotomized at the cut point maximizing the sum of sensitivity and specificity from receiver operating characteristic analysis at 36 months was also prognostic. Exploratory analysis revealed that %SUVremaining was likewise prognostic for overall survival in both treatment arms and was more closely associated with extracerebral distant metastases (P = .016) than with isolated locoregional relapses (P = .97). %SUVremaining is a predictor for survival and other end points after multimodality treatment and can serve as a parameter for treatment stratification after induction chemotherapy or for evaluation of adjuvant new systemic treatment options for high-risk patients. © 2016 by American Society of Clinical Oncology.

  19. Computer simulation of the probability that endangered whales will interact with oil spills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, M.; Jayko, K.; Bowles, A.

    1987-03-01

    A numerical model system was developed to assess quantitatively the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. Bowhead and gray whale migration and diving-surfacing models, and an oil-spill trajectory model comprise the system. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The movement of a whale point is governed by a random walk algorithm which stochastically follows a migratory pathway. The oil-spill model, developed under a series of other contracts, accounts for transport and spreading behavior in open water and in the presence of sea ice. Historical wind records and heavy, normal, or light ice cover data sets are selected at random to provide stochastic oil-spill scenarios for whale-oil interaction simulations.
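
    The whale movement component, "a random walk algorithm which stochastically follows a migratory pathway," can be illustrated with a biased random walk toward successive waypoints. The waypoints, step length, and heading noise below are invented for illustration, not the model's calibrated parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

waypoints = np.array([[0.0, 0.0], [50.0, 20.0], [120.0, 10.0]])  # hypothetical pathway (km)
step_km, heading_sd = 5.0, 0.6       # assumed step length and turning noise (radians)

pos = waypoints[0].copy()
target = 1
track = [pos.copy()]
for _ in range(200):
    to_wp = waypoints[target] - pos
    if np.hypot(*to_wp) < step_km:   # close enough: head for the next waypoint
        target = min(target + 1, len(waypoints) - 1)
        to_wp = waypoints[target] - pos
    # Heading biased toward the waypoint, perturbed by random turning noise
    heading = np.arctan2(to_wp[1], to_wp[0]) + rng.normal(0.0, heading_sd)
    pos = pos + step_km * np.array([np.cos(heading), np.sin(heading)])
    track.append(pos.copy())

track = np.array(track)
print(track[-1])   # end position after 200 stochastic steps along the pathway
```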

  20. Efficacy of low interatrial septum and right atrial appendage pacing for prevention of permanent atrial fibrillation in patients with sinus node disease: results from the electrophysiology-guided pacing site selection (EPASS) study.

    PubMed

    Verlato, Roberto; Botto, Giovanni Luca; Massa, Riccardo; Amellone, Claudia; Perucca, Antonello; Bongiorni, Maria Grazia; Bertaglia, Emanuele; Ziacchi, Vigilio; Piacenti, Marcello; Del Rosso, Attilio; Russo, Giovanni; Baccillieri, Maria Stella; Turrini, Pietro; Corbucci, Giorgio

    2011-12-01

    The role of pacing sites and atrial electrophysiology in the progression of atrial fibrillation (AF) to the permanent form in patients with sinus node dysfunction (SND) has never been investigated. The aim of the study was to investigate the relationship between atrial electrophysiology and the efficacy of atrial pacing at the low interatrial septum (IAS) or at the right atrial appendage (RAA) to prevent persistent/permanent AF in patients with SND. The Electrophysiology-Guided Pacing Site Selection (EPASS) Study was a prospective, controlled, randomized study. Atrial refractoriness, basal and incremental conduction times from the RAA to the coronary sinus ostium were measured before implantation, and the difference (ΔCTos) was calculated. Patients with ΔCTos ≥ 50 ms (study group) and those with ΔCTos <50 ms (control group) were randomly assigned to RAA or IAS with algorithms for continuous atrial stimulation "on." The primary end point was time to development of permanent or persistent AF within a 2-year follow-up in the study group, IAS versus RAA. Data were analyzed by intention to treat. One hundred two patients (77 ± 7 years, 44 male) were enrolled, 69 (68%) in the study group and 33 (32%) in the control group. Of these, 97 completed the study, randomly assigned as follows: 29 IAS versus 36 RAA in the study group and 18 IAS versus 14 RAA in the control group. After a mean follow-up of 15 ± 7 (median, 17) months, 11 (16.6%) patients in the study group met the primary end point: 2 IAS versus 9 RAA (log rank=3.93, P=0.047). In patients with SND and intra-atrial conduction delay, low IAS pacing was superior to RAA pacing in preventing progression to persistent or permanent AF. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00239226.

  1. Application of Adaptive Design Methodology in Development of a Long-Acting Glucagon-Like Peptide-1 Analog (Dulaglutide): Statistical Design and Simulations

    PubMed Central

    Skrivanek, Zachary; Berry, Scott; Berry, Don; Chien, Jenny; Geiger, Mary Jane; Anderson, James H.; Gaydos, Brenda

    2012-01-01

    Background Dulaglutide (dula, LY2189265), a long-acting glucagon-like peptide-1 analog, is being developed to treat type 2 diabetes mellitus. Methods To foster the development of dula, we designed a two-stage adaptive, dose-finding, inferentially seamless phase 2/3 study. The Bayesian theoretical framework is used to adaptively randomize patients in stage 1 among 7 dula doses and, at the decision point, either to stop for futility or to select up to 2 dula doses for stage 2. After dose selection, patients continue to be randomized to the selected dula doses or comparator arms. Data from patients assigned the selected doses will be pooled across both stages and analyzed with an analysis of covariance model, using baseline hemoglobin A1c and country as covariates. The operating characteristics of the trial were assessed by extensive simulation studies. Results Simulations demonstrated that the adaptive design would identify the correct doses 88% of the time, compared to as low as 6% for a fixed-dose design (the latter value based on frequentist decision rules analogous to the Bayesian decision rules for adaptive design). Conclusions This article discusses the decision rules used to select the dula dose(s); the mathematical details of the adaptive algorithm—including a description of the clinical utility index used to mathematically quantify the desirability of a dose based on safety and efficacy measurements; and a description of the simulation process and results that quantify the operating characteristics of the design. PMID:23294775
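
    As a generic illustration of Bayesian adaptive randomization (not the trial's actual algorithm or clinical utility index), the sketch below allocates patients across 7 doses with probability proportional to each dose's posterior probability of being best, using a toy binary response and Beta-Binomial updating:

```python
import numpy as np

rng = np.random.default_rng(3)

true_resp = np.array([0.2, 0.3, 0.35, 0.4, 0.5, 0.55, 0.6])  # hypothetical response rates
succ = np.ones(7)   # Beta(1, 1) prior successes for each of the 7 doses
fail = np.ones(7)   # Beta(1, 1) prior failures

for patient in range(300):
    # Posterior probability each dose is best, by Monte Carlo over the Beta posteriors
    sims = rng.beta(succ, fail, size=(1000, 7))
    p_best = np.bincount(sims.argmax(axis=1), minlength=7) / 1000.0
    dose = rng.choice(7, p=p_best)           # adaptively randomize the next patient
    y = rng.random() < true_resp[dose]       # observe a toy binary outcome
    succ[dose] += y
    fail[dose] += 1 - y

print(np.round(p_best, 2))  # allocation now concentrates on the better doses
```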

  2. Analysis of Feature Intervisibility and Cumulative Visibility Using GIS, Bayesian and Spatial Statistics: A Study from the Mandara Mountains, Northern Cameroon

    PubMed Central

    Wright, David K.; MacEachern, Scott; Lee, Jaeyong

    2014-01-01

    The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883

  3. Determining knowledge and behaviour change after nutrition screening among older adults.

    PubMed

    Southgate, Katherine M; Keller, Heather H; Reimer, Holly D

    2010-01-01

    Two education interventions involving personalized messages after nutrition screening in older adults were compared to determine changes in nutrition knowledge and risk behaviour. Of 150 older adults randomly selected from a local seniors' centre, 61 completed baseline screening and a demographic and nutrition knowledge questionnaire and were randomized to one of two groups. Group A received personalized letters plus an educational booklet, and Group B received personalized letters only. All materials were sent through the mail. Forty-four participants completed post-test questionnaires to determine change in knowledge and risk behaviour. Both groups had reduced nutrition risk scores and increased knowledge scores at post-test. After the intervention, a significant difference was observed in knowledge change by treatment group. Group A participants experienced greater gains in knowledge, with a mean gain of 5.43 points, than did those in Group B, who had a mean gain of 1.36 points (p=0.018). Screening and education with print materials have the potential to change risk behaviour and nutrition knowledge in older adults. A specially designed booklet on older adults' nutrition risk factors plus a personalized letter provide an effective education strategy for older adults after screening.

  4. Visual Perception of Touchdown Point During Simulated Landing

    ERIC Educational Resources Information Center

    Palmisano, Stephen; Gillam, Barbara

    2005-01-01

    Experiments examined the accuracy of visual touchdown point perception during oblique descents (1.5°-15°) toward a ground plane consisting of (a) randomly positioned dots, (b) a runway outline, or (c) a grid. Participants judged whether the perceived touchdown point was above or below a probe that appeared at a random position following each…

  5. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  6. Continuous and discontinuous phase transitions in the evolution of a polygenic trait under stabilizing selective pressure

    NASA Astrophysics Data System (ADS)

    Fierro, Annalisa; Cocozza, Sergio; Monticelli, Antonella; Scala, Giovanni; Miele, Gennaro

    2017-06-01

    The presence of phenomena analogous to phase transition in Statistical Mechanics has been suggested in the evolution of a polygenic trait under stabilizing selection, mutation and genetic drift. By using numerical simulations of a model system, we analyze the evolution of a population of N diploid hermaphrodites in a random mating regime. The population evolves under the effect of drift, selective pressure in the form of viability on an additive polygenic trait, and mutation. The analysis allows us to determine a phase diagram in the plane of mutation rate and strength of selection. The involved pattern of phase transitions is characterized by a line of critical points for weak selective pressure (smaller than a threshold), whereas discontinuous phase transitions, characterized by metastable hysteresis, are observed for strong selective pressure. A finite-size scaling analysis suggests the analogy between our system and the mean-field Ising model for selective pressure approaching the threshold from weaker values. In this framework, the mutation rate, which allows the system to explore the accessible microscopic states, is the parameter controlling the transition from large heterozygosity (disordered phase) to small heterozygosity (ordered one).

  7. Evaluation of a multi-point method for determining acoustic impedance

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Parrott, Tony L.

    1988-01-01

    An investigation was conducted to explore potential improvements provided by a Multi-Point Method (MPM) over the Standing Wave Method (SWM) and Two-Microphone Method (TMM) for determining acoustic impedance. A wave propagation model was developed to model the standing wave pattern in an impedance tube. The acoustic impedance of a test specimen was calculated from a best fit of this standing wave pattern to pressure measurements obtained along the impedance tube centerline. Three measurement spacing distributions were examined: uniform, random, and selective. Calculated standing wave patterns match the point pressure measurement distributions with good agreement for a reflection factor magnitude range of 0.004 to 0.999. Comparisons of results using 2, 3, 6, and 18 measurement points showed that the most consistent results are obtained when using at least 6 evenly spaced pressure measurements per half-wavelength. Also, data were acquired with broadband noise added to the discrete frequency noise and impedances were calculated using the MPM and TMM algorithms. The results indicate that the MPM will be superior to the TMM in the presence of significant broadband noise levels associated with mean flow.
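
    The MPM's central step, a best fit of a modelled standing wave pattern to pressure measurements along the tube, can be sketched as a complex least-squares problem. The frequency, reflection factor, and noise level below are illustrative, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(4)

f, c = 1000.0, 343.0                  # tone frequency (Hz), sound speed (m/s)
k = 2 * np.pi * f / c                 # wavenumber
R_true = 0.6 * np.exp(1j * 0.8)       # "unknown" reflection factor at the sample (x = 0)

x = np.linspace(0.05, 0.45, 6)        # 6 measurement points along the tube (m)
# Standing wave: wave travelling toward the sample plus the reflected wave
p = np.exp(1j * k * x) + R_true * np.exp(-1j * k * x)
p += 0.01 * (rng.normal(size=6) + 1j * rng.normal(size=6))   # measurement noise

# Least-squares fit of p = A*exp(jkx) + B*exp(-jkx); then R = B/A
M = np.column_stack([np.exp(1j * k * x), np.exp(-1j * k * x)])
(A, B), *_ = np.linalg.lstsq(M, p, rcond=None)
R = B / A
Z = (1 + R) / (1 - R)                 # normalized surface impedance
print(f"|R| = {abs(R):.3f} (true {abs(R_true):.3f}), Z = {Z:.3f}")
```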

  8. [Clinical observation on auricular point magnetotherapy for treatment of senile low back pain].

    PubMed

    Sun, Gui-Ping

    2007-02-01

    To compare the therapeutic effects of auricular point magnetotherapy and auricular point sticking of Vaccaria seed on senile low back pain. Sixty cases with back pain, aged 60 years or over, were randomly divided into 2 groups, a control group and a test group. The control group was treated with auricular sticking of Vaccaria seeds with no pressing, and the test group with sticking of magnetic beads (66 gauss each) with no pressing. The auricular points Shenmen, Kidney, Bladder, Yaodizhui, Gluteus, Liver and Spleen were selected. Three weeks constituted one course. The effects before, during and after the course were assessed by a questionnaire about back pain. Compared with the control group, back pain in the test group was more effectively improved, including reduced pain and numbness in the back and legs, less impairment of physical strength induced by this disease, and better daily life quality. A follow-up survey at 2-4 weeks showed that the effects persisted. Auricular magnetotherapy can effectively improve senile back pain.

  9. Analgesic effects of treatments for non-specific low back pain: a meta-analysis of placebo-controlled randomized trials.

    PubMed

    Machado, L A C; Kamper, S J; Herbert, R D; Maher, C G; McAuley, J H

    2009-05-01

    Estimates of treatment effects reported in placebo-controlled randomized trials are less subject to bias than those estimates provided by other study designs. The objective of this meta-analysis was to estimate the analgesic effects of treatments for non-specific low back pain reported in placebo-controlled randomized trials. Medline, Embase, Cinahl, PsychInfo and Cochrane Central Register of Controlled Trials databases were searched for eligible trials from earliest records to November 2006. Continuous pain outcomes were converted to a common 0-100 scale and pooled using a random effects model. A total of 76 trials reporting on 34 treatments were included. Fifty percent of the investigated treatments had statistically significant effects, but for most the effects were small or moderate: 47% had point estimates of effects of <10 points on the 100-point scale, 38% had point estimates from 10 to 20 points and 15% had point estimates of >20 points. Treatments reported to have large effects (>20 points) had been investigated only in a single trial. This meta-analysis revealed that the analgesic effects of many treatments for non-specific low back pain are small and that they do not differ in populations with acute or chronic symptoms.
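
    Pooling trial-level effects on a common 0-100 scale with a random-effects model is conventionally done with the DerSimonian-Laird estimator. A minimal sketch with invented trial effects and standard errors, not data from this meta-analysis:

```python
import numpy as np

# Hypothetical trial effects on the common 0-100 pain scale (treatment minus placebo)
effects = np.array([-8.0, -15.0, -5.0, -12.0, -20.0, -3.0])
se = np.array([3.0, 5.0, 2.5, 4.0, 6.0, 3.5])          # standard errors per trial

w_fixed = 1.0 / se**2
mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# DerSimonian-Laird between-trial variance tau^2
Q = np.sum(w_fixed * (effects - mu_fixed) ** 2)        # Cochran's Q
df = len(effects) - 1
C = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled estimate and its standard error
w_re = 1.0 / (se**2 + tau2)
mu_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2 = {tau2:.2f}, pooled effect = {mu_re:.1f} points (SE {se_re:.1f})")
```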

  10. Pharmacological and Phytochemical Appraisal of Selected Medicinal Plants from Jordan with Claimed Antidiabetic Activities

    PubMed Central

    Afifi, Fatma U.; Kasabri, Violet

    2013-01-01

    Plant species have long been regarded as possessing the principal ingredients used in widely disseminated ethnomedical practices. Different surveys showed that medicinal plant species used by the inhabitants of Jordan for the traditional treatment of diabetes are inadequately screened for their therapeutic/preventive potential and phytochemical findings. In this review, traditional herbal medicine pursued indigenously with its methods of preparation and its active constituents are listed. Studies of random screening for selective antidiabetic bioactivity and plausible mechanisms of action of local species, domesticated greens, or wild plants are briefly discussed. Recommended future directions, including the design and conduct of comprehensive trials, are pointed out to validate the usefulness of these active plants or bioactive secondary metabolites either alone or in combination with existing conventional therapies.

  11. Impact of a maternal health voucher scheme on institutional delivery among low income women in Pakistan.

    PubMed

    Agha, Sohail

    2011-05-03

    Only 39% of deliveries in Pakistan are attended by skilled birth attendants, while Pakistan's target for skilled birth attendance by 2015 is > 90%. A 12-month maternal health voucher intervention was implemented in Dera Ghazi Khan City, located in Southern Punjab, Pakistan in 2009. A pre-test/post-test non-experimental study was conducted to assess the impact of the intervention. Household interviews were conducted with randomly selected women who delivered in 2008 (the year prior to the voucher intervention), and with randomly selected women who delivered in 2009. A strong outreach model was used and voucher booklets valued at $50, containing redeemable coupons for three antenatal care (ANC) visits, a postnatal care (PNC) visit and institutional delivery, were sold for $1.25 to low-income women targeted by project workers. Regression analysis was conducted to determine the impact of the voucher scheme on ANC, PNC, and institutional delivery. Marginal effects estimated from logistic regression analyses were used to assess the magnitude of the impact of the intervention. The women targeted by voucher outreach workers were poorer, less educated, and at higher parity. After adjusting for these differences, women who delivered in 2009 and were sold voucher booklets were significantly more likely than women who delivered in 2008 to make at least three ANC visits, deliver in a health facility, and make a postnatal visit. Purchase of a voucher booklet was associated with a 22 percentage point increase in ANC use, a 22 percentage point increase in institutional delivery, and a 35 percentage point increase in PNC use. A voucher intervention implemented for 12 months was associated with a substantial increase in institutional delivery. A substantial scale-up of maternal health vouchers that focus on institutional delivery is likely to bring Pakistan closer to achieving its 2015 target for institutional delivery.

  12. Near-optimal alternative generation using modified hit-and-run sampling for non-linear, non-convex problems

    NASA Astrophysics Data System (ADS)

    Rosenberg, D. E.; Alafifi, A.

    2016-12-01

    Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as the region comprising the original problem constraints plus a new constraint that allowed performance within a specified tolerance of the optimal objective function value. MGA identified a few maximally-different alternatives from the near-optimal region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems or select portions for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. First, start at a feasible hit point within the near-optimal region, then run a random distance in a random direction to a new hit point. Next, repeat until generating the desired number of alternatives. The key step at each iterate is to run a random distance along the line in the specified direction to a new hit point. If linear equality constraints exist, we construct an orthogonal basis and use a null space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints. This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms because search at each iteration is confined to the hit line, the algorithm can move in one step to any point in the near-optimal region, and each iterate generates a new, feasible alternative. We use the method to generate alternatives that span the near-optimal regions of simple and more complicated water management problems and may be preferred to optimal solutions. We also discuss extensions to handle non-linear equality constraints.
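
    A minimal sketch of the hit-and-run step described above: draw a random direction, bound the line by the linear (convex) constraints, then use a shrinking, slice-style search along the line to satisfy a non-convex near-optimal constraint. The objective, threshold, and bounds are illustrative stand-ins:

```python
import numpy as np

rng = np.random.default_rng(5)

def f(x):  # illustrative non-convex objective (a wiggly bowl)
    return (x[0] - 0.5) ** 2 + (x[1] - 0.5) ** 2 + 0.1 * np.sin(20 * x[0])

thresh = 0.15                     # near-optimal: f(x) <= optimal value + tolerance
lo, hi = np.zeros(2), np.ones(2)  # box (linear inequality) constraints

def feasible(x):
    return np.all(x >= lo) and np.all(x <= hi) and f(x) <= thresh

x = np.array([0.5, 0.5])          # feasible starting hit point
samples = []
for _ in range(2000):
    d = rng.normal(size=2)
    d /= np.linalg.norm(d)                       # random direction
    # Feasible t-interval from the box constraints (convex bounds on the line)
    t_lo, t_hi = -np.inf, np.inf
    for j in range(2):
        if d[j] != 0:
            a, b = (lo[j] - x[j]) / d[j], (hi[j] - x[j]) / d[j]
            t_lo, t_hi = max(t_lo, min(a, b)), min(t_hi, max(a, b))
    # Slice-style shrinking search to satisfy the non-convex constraint too
    while True:
        t = rng.uniform(t_lo, t_hi)
        if feasible(x + t * d):
            x = x + t * d                        # new feasible hit point
            break
        if t < 0: t_lo = t                       # shrink bracket toward current point
        else:     t_hi = t
    samples.append(x.copy())

print(len(samples), samples[-1].round(3))        # every iterate is a feasible alternative
```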

  13. Application of random effects to the study of resource selection by animals

    USGS Publications Warehouse

    Gillies, C.S.; Hebblewhite, M.; Nielsen, S.E.; Krawchuk, M.A.; Aldridge, Cameron L.; Frair, J.L.; Saher, D.J.; Stevens, C.E.; Jerde, C.L.

    2006-01-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence.2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability.3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed.4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects.5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection.6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.

  14. Application of random effects to the study of resource selection by animals.

    PubMed

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.
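
    The pooling problem reviewed in these two records can be illustrated with a small simulation: when individuals differ in selection strength, a single pooled logistic regression hides the variation that a random coefficient would capture. A self-contained numpy/scikit-learn sketch with invented parameters (a full random-effects fit would use a mixed-model package):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)

# Simulate 20 animals whose true selection coefficient for one covariate varies
n_animals, n_obs = 20, 200
beta = rng.normal(1.0, 1.5, n_animals)          # individual-level selection strength
X, y, ids = [], [], []
for i in range(n_animals):
    x = rng.normal(size=n_obs)                  # covariate at used/available points
    p = 1.0 / (1.0 + np.exp(-(-0.5 + beta[i] * x)))
    X.append(x); y.append(rng.random(n_obs) < p); ids.append(np.full(n_obs, i))
X = np.concatenate(X)[:, None]
y = np.concatenate(y)
ids = np.concatenate(ids)

# Naive pooling ignores individual variation in selection
pooled = LogisticRegression(C=1e6).fit(X, y).coef_[0, 0]

# Per-animal fits: their mean and spread are what random coefficients capture
slopes = [LogisticRegression(C=1e6).fit(X[ids == i], y[ids == i]).coef_[0, 0]
          for i in range(n_animals)]
print(f"pooled slope {pooled:.2f}; per-animal mean {np.mean(slopes):.2f} "
      f"(SD {np.std(slopes):.2f})")
```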

  15. Automated Coarse Registration of Point Clouds in 3d Urban Scenes Using Voxel Based Plane Constraint

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Boerner, R.; Yao, W.; Hoegner, L.; Stilla, U.

    2017-09-01

    For obtaining a full coverage of 3D scans in a large-scale urban area, the registration between point clouds acquired via terrestrial laser scanning (TLS) is normally mandatory. However, due to the complex urban environment, the automatic registration of different scans is still a challenging problem. In this work, we propose an automatic marker-free method for fast and coarse registration between point clouds using the geometric constraints of planar patches under a voxel structure. Our proposed method consists of four major steps: the voxelization of the point cloud, the approximation of planar patches, the matching of corresponding patches, and the estimation of transformation parameters. In the voxelization step, the point cloud of each scan is organized with a 3D voxel structure, by which the entire point cloud is partitioned into small individual patches. In the following step, we represent the points of each voxel with the approximated plane function, and select those patches resembling planar surfaces. Afterwards, for matching the corresponding patches, a RANSAC-based strategy is applied. Among all the planar patches of a scan, we randomly select a set of three planar surfaces in order to build a coordinate frame via their normal vectors and their intersection point. The transformation parameters between scans are calculated from these two coordinate frames. The set whose transformation parameters yield the largest number of coplanar patches is identified as the optimal candidate for estimating the correct transformation parameters. The experimental results using TLS datasets of different scenes reveal that our proposed method can be both effective and efficient for the coarse registration task. Especially, for the fast orientation between scans, our proposed method can achieve a registration error of less than about 2 degrees on the testing datasets and is much more efficient than the classical baseline methods.
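
    The transformation-estimation step, building a coordinate frame in each scan from three matched planar patches (their normals and common intersection point) and recovering the rotation and translation between the frames, can be sketched as follows. The plane triples and ground-truth motion are synthetic:

```python
import numpy as np

def frame_from_planes(normals, ds):
    """Orthonormal frame + origin from three planes n_i . x = d_i.

    normals : (3, 3) unit normals of three roughly orthogonal planes (rows)
    ds      : (3,) plane offsets; the three planes meet in one point
    """
    p = np.linalg.solve(np.asarray(normals), np.asarray(ds))  # intersection point
    e1 = normals[0] / np.linalg.norm(normals[0])
    e2 = np.cross(normals[0], normals[1])
    e2 /= np.linalg.norm(e2)
    e3 = np.cross(e1, e2)                                     # right-handed frame
    return np.column_stack([e1, e2, e3]), p

# Hypothetical matched plane triple seen in scan A
nA = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
dA = np.array([0.0, 0.0, 0.0])

theta = np.deg2rad(30)                                        # ground-truth motion
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
nB = nA @ R_true.T                                            # same planes seen from scan B
dB = dA + nB @ t_true

FA, pA = frame_from_planes(nA, dA)
FB, pB = frame_from_planes(nB, dB)
R = FB @ FA.T                                                 # rotation scan A -> scan B
t = pB - R @ pA
print(np.allclose(R, R_true), np.allclose(t, t_true))         # True True
```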

  16. Accuracy improvement of laser line scanning for feature measurements on CMM

    NASA Astrophysics Data System (ADS)

    Bešić, Igor; Van Gestel, Nick; Kruth, Jean-Pierre; Bleys, Philip; Hodolič, Janko

    2011-11-01

    Because of its high speed and high detail output, laser line scanning is increasingly included in coordinate metrology applications where its performance can satisfy specified tolerances. Increasing its accuracy will open the possibility of using it in other areas where contact methods are still dominant. Multi-sensor systems allow selecting discrete probing or scanning methods to measure part elements. The decision is often based on the principle that tightly toleranced elements should be measured by contact methods, while more loosely toleranced elements can be laser scanned. This paper aims to introduce a method for improving the output of a CMM-mounted laser line scanner for metrology applications. This improvement is achieved by filtering the scanner's random error and by combining it with the widely used, reliable, but slow touch-trigger probing. The filtered point cloud is used to estimate the form deviation of the inspected element, while a few tactilely acquired points are used to compensate for errors in the position of the point cloud.
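
    A toy illustration of the combination described here: a least-squares plane fit filters the scanner's random error and yields the form deviation, while a few accurate tactile points remove the residual position bias. All magnitudes below are invented, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Dense laser-scanned points on a nominally flat face: high detail, but with an
# assumed systematic position bias along z plus random scanner noise.
n = 5000
xy = rng.uniform(0, 50, size=(n, 2))                     # mm
bias, noise_sd = 0.08, 0.03                              # assumed error magnitudes (mm)
z_scan = 0.002 * xy[:, 0] + bias + rng.normal(0, noise_sd, n)   # true form: slight tilt

# Filter the random error: least-squares plane fit to the cloud (form estimate)
A = np.column_stack([xy, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, z_scan, rcond=None)
form_dev = z_scan - A @ coef                             # residual form deviation

# A few slow but accurate touch-trigger points fix the position of the plane
xy_t = np.array([[5.0, 5.0], [45.0, 5.0], [25.0, 45.0]])
z_t = 0.002 * xy_t[:, 0]                                 # tactile points: no bias
offset = np.mean(z_t - np.column_stack([xy_t, np.ones(3)]) @ coef)
coef[2] += offset                                        # shift scan plane onto tactile datum

print(f"estimated bias removed: {-offset:.3f} mm (true scanner bias {bias} mm)")
```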

  17. ROLE OF TIMING IN ASSESSMENT OF NERVE REGENERATION

    PubMed Central

    BRENNER, MICHAEL J.; MORADZADEH, ARASH; MYCKATYN, TERENCE M.; TUNG, THOMAS H. H.; MENDEZ, ALLEN B.; HUNTER, DANIEL A.; MACKINNON, SUSAN E.

    2014-01-01

    Small animal models are indispensable for research on nerve injury and reconstruction, but their superlative regenerative potential may confound experimental interpretation. This study investigated time-dependent neuroregenerative phenomena in rodents. Forty-six Lewis rats were randomized to three nerve allograft groups treated with 2 mg/(kg day) tacrolimus; 5 mg/(kg day) Cyclosporine A; or placebo injection. Nerves were subjected to histomorphometric and walking track analysis at serial time points. Tacrolimus increased fiber density, percent neural tissue, and nerve fiber count and accelerated functional recovery at 40 days, but these differences were undetectable by 70 days. Serial walking track analysis showed a similar pattern of recovery. A ‘blow-through’ effect is observed in rodents whereby an advancing nerve front overcomes an experimental defect given sufficient time, rendering experimental groups indistinguishable at late time points. Selection of validated time points and corroboration in higher animal models are essential prerequisites for the clinical application of basic research on nerve regeneration. PMID:18381659

  18. Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation

    NASA Astrophysics Data System (ADS)

    Li, C.

    2012-07-01

    POS, integrating GPS/INS (inertial navigation systems), allows rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (mobile mapping systems). However, INS not only has systematic error but is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades, where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (random sample consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of the two vanishing points (VX, VY); a way to set initial weights for the adjustment solution of single-image vanishing points is presented, and the vanishing points and their error distributions are solved by an iterative method with variable weights, the cofactor matrix, and error ellipse theory. Thirdly, under the condition of known error ellipses of the two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of the three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.
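
    The Monte Carlo evaluation of VZ's error distribution can be sketched as follows, assuming a calibrated camera (known principal point and focal length) and three mutually orthogonal scene directions, so that VZ follows from VX and VY. The coordinates and error ellipses below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)

f = 1500.0                            # assumed focal length (px), principal point at origin
vx = np.array([2500.0, 40.0])         # measured vanishing points VX, VY (px)
vy = np.array([-60.0, 2200.0])
cov_vx = np.array([[400.0, 50.0], [50.0, 900.0]])    # illustrative error ellipses
cov_vy = np.array([[900.0, -30.0], [-30.0, 400.0]])

def third_vp(vx, vy, f):
    # 3D directions of the two vanishing points in camera coordinates
    d1 = np.append(vx, f); d1 /= np.linalg.norm(d1)
    d2 = np.append(vy, f); d2 /= np.linalg.norm(d2)
    d3 = np.cross(d1, d2)             # direction orthogonal to both
    return f * d3[:2] / d3[2]         # project back to the image plane

samples = np.array([
    third_vp(rng.multivariate_normal(vx, cov_vx),
             rng.multivariate_normal(vy, cov_vy), f)
    for _ in range(10000)
])
print("VZ mean:", samples.mean(axis=0).round(1))
print("VZ covariance:\n", np.round(np.cov(samples.T), 1))
```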

  19. Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension

    NASA Astrophysics Data System (ADS)

    Yan, Z.; Luan, X.

    2017-12-01

    Introduction: Empirical mode decomposition (EMD) is a noise suppression algorithm based on wave field separation, exploiting the scale differences between effective signal and noise. However, since the complexity of the real seismic wave field results in serious mode aliasing, denoising with this method alone is neither ideal nor effective. Based on the multi-scale decomposition characteristics of the EMD algorithm, combined with Hausdorff dimension constraints, we propose a new method for seismic random noise attenuation. First of all, we apply the EMD algorithm to adaptively decompose the seismic data and obtain a series of intrinsic mode functions (IMFs) with different scales. Based on the difference in Hausdorff dimension between effective signal and random noise, we identify the IMF components mixed with random noise. Then we use a threshold correlation filtering process to separate the valid signal and random noise effectively. Compared with the traditional EMD method, the results show that the new method of seismic random noise attenuation has a better suppression effect.

    The implementation process: The EMD algorithm is used to decompose seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first category is the effective wave composition at the larger scales; the second is the noise part at the smaller scales; the third is the IMF components containing random noise. Then, the third kind of IMF component is processed by the Hausdorff dimension algorithm, with an appropriate time window size, initial step, and increment selected to calculate the instantaneous Hausdorff dimension of each component. The dimension of the random noise is between 1.0 and 1.05, while the dimension of the effective wave is between 1.05 and 2.0. On this basis, according to the dimension difference between random noise and effective signal, we extract, for each IMF component, the sample points whose fractal dimension is less than or equal to 1.05 in order to separate the residual noise. Reconstructing the signal from the dimension-filtered IMF components together with the effective-wave IMF components from the first selection yields the denoised result.
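
    A sketch of the dimension-thresholding idea, assuming the PyEMD package for the decomposition and using a Higuchi fractal dimension estimator as a stand-in for the paper's instantaneous Hausdorff dimension. Note the conventions differ: under Higuchi, noisy components score high (near 2), so this sketch keeps low-dimension IMFs, whereas the paper's Hausdorff convention assigns noise the lower range (1.0-1.05):

```python
import numpy as np
from PyEMD import EMD   # assumes the PyEMD package ("pip install EMD-signal")

rng = np.random.default_rng(9)

t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
signal = clean + 0.4 * rng.normal(size=t.size)       # synthetic noisy trace

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D series (white noise scores near 2)."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            nd = len(idx) - 1                        # number of increments
            lengths.append(np.abs(np.diff(x[idx])).sum() * (n - 1) / (nd * k * k))
        lk.append(np.mean(lengths))
    # Slope of log L(k) against log(1/k) estimates the dimension
    return np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)[0]

imfs = EMD().emd(signal)                             # adaptive decomposition into IMFs
fds = np.array([higuchi_fd(imf) for imf in imfs])
keep = fds <= 1.35                                   # illustrative threshold (see note above)
denoised = imfs[keep].sum(axis=0)                    # reconstruct from retained IMFs
print(np.round(fds, 2), "-> kept", int(keep.sum()), "of", len(imfs), "IMFs")
```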

  20. A randomized, Phase IIb study investigating oliceridine (TRV130), a novel µ-receptor G-protein pathway selective (μ-GPS) modulator, for the management of moderate to severe acute pain following abdominoplasty

    PubMed Central

    Singla, Neil; Minkowitz, Harold S; Soergel, David G; Burt, David A; Subach, Ruth Ann; Salamea, Monica Y; Fossler, Michael J; Skobieranda, Franck

    2017-01-01

    Background Oliceridine (TRV130), a novel μ-receptor G-protein pathway selective (μ-GPS) modulator, was designed to improve the therapeutic window of conventional opioids by activating G-protein signaling while causing low β-arrestin recruitment to the μ receptor. This randomized, double-blind, patient-controlled analgesia Phase IIb study was conducted to investigate the efficacy, safety, and tolerability of oliceridine compared with morphine and placebo in patients with moderate to severe pain following abdominoplasty (NCT02335294; oliceridine is an investigational agent not yet approved by the US Food and Drug Administration). Methods Patients were randomized to receive postoperative regimens of intravenous oliceridine (loading/patient-controlled demand doses [mg/mg]: 1.5/0.10 [regimen A]; 1.5/0.35 [regimen B]), morphine (4.0/1.0), or placebo with treatment initiated within 4 hours of surgery and continued as needed for 24 hours. Results Two hundred patients were treated (n=39, n=39, n=83, and n=39 in the oliceridine regimen A, oliceridine regimen B, morphine, and placebo groups, respectively). Patients were predominantly female (n=198 [99%]) and had a mean age of 38.2 years, weight of 71.2 kg, and baseline pain score of 7.7 (on 11-point numeric pain rating scale). Patients receiving the oliceridine regimens had reductions in average pain scores (model-based change in time-weighted average versus placebo over 24 hours) of 2.3 and 2.1 points, respectively (P=0.0001 and P=0.0005 versus placebo); patients receiving morphine had a similar reduction (2.1 points; P<0.0001 versus placebo). A lower prevalence of adverse events (AEs) related to nausea, vomiting, and respiratory function was observed with the oliceridine regimens than with morphine (P<0.05). Other AEs with oliceridine were generally dose-related and similar in nature to those observed with conventional opioids; no serious AEs were reported with oliceridine. Conclusion These results suggest that oliceridine may provide effective, rapid analgesia in patients with moderate to severe postoperative pain, with an acceptable safety/tolerability profile and potentially wider therapeutic window than morphine. PMID:29062240

  1. Effect of optical digitizer selection on the application accuracy of a surgical localization system-a quantitative comparison between the OPTOTRAK and flashpoint tracking systems

    NASA Technical Reports Server (NTRS)

    Li, Q.; Zamorano, L.; Jiang, Z.; Gong, J. X.; Pandya, A.; Perez, R.; Diaz, F.

    1999-01-01

    Application accuracy is a crucial factor for stereotactic surgical localization systems, in which space digitization camera systems are one of the most critical components. In this study we compared the effect of the OPTOTRAK 3020 space digitization system and the FlashPoint Model 3000 and 5000 3D digitizer systems on the application accuracy for interactive localization of intracranial lesions. A phantom was mounted with several implantable frameless markers which were randomly distributed on its surface. The target point was digitized and the coordinates were recorded and compared with reference points. The differences from the reference points represented the deviation from the "true point." The root mean square (RMS) was calculated to show the differences, and a paired t-test was used to analyze the results. The results with the phantom showed that, for 1-mm sections of CT scans, the RMS was 0.76 +/- 0.54 mm for the OPTOTRAK system, 1.23 +/- 0.53 mm for the FlashPoint Model 3000 3D digitizer system, and 1.00 +/- 0.42 mm for the FlashPoint Model 5000 system. These preliminary results showed that there is no significant difference between the three tracking systems, and, from the quality point of view, they can all be used for image-guided surgery procedures. Copyright 1999 Wiley-Liss, Inc.

  2. Effect of optical digitizer selection on the application accuracy of a surgical localization system-a quantitative comparison between the OPTOTRAK and flashpoint tracking systems.

    PubMed

    Li, Q; Zamorano, L; Jiang, Z; Gong, J X; Pandya, A; Perez, R; Diaz, F

    1999-01-01

    Application accuracy is a crucial factor for stereotactic surgical localization systems, in which space digitization camera systems are one of the most critical components. In this study we compared the effect of the OPTOTRAK 3020 space digitization system and the FlashPoint Model 3000 and 5000 3D digitizer systems on the application accuracy for interactive localization of intracranial lesions. A phantom was mounted with several implantable frameless markers which were randomly distributed on its surface. The target point was digitized and the coordinates were recorded and compared with reference points. The differences from the reference points represented the deviation from the "true point." The root mean square (RMS) was calculated to show the differences, and a paired t-test was used to analyze the results. The results with the phantom showed that, for 1-mm sections of CT scans, the RMS was 0.76 +/- 0.54 mm for the OPTOTRAK system, 1.23 +/- 0.53 mm for the FlashPoint Model 3000 3D digitizer system, and 1.00 +/- 0.42 mm for the FlashPoint Model 5000 system. These preliminary results showed that there is no significant difference between the three tracking systems, and, from the quality point of view, they can all be used for image-guided surgery procedures. Copyright 1999 Wiley-Liss, Inc.

  3. Improving Adherence to Smoking Cessation Treatment: Smoking Outcomes in a Web-based Randomized Trial.

    PubMed

    Graham, Amanda L; Papandonatos, George D; Cha, Sarah; Erar, Bahar; Amato, Michael S

    2018-03-15

    Partial adherence in Internet smoking cessation interventions presents treatment and evaluation challenges. Increasing adherence may improve outcomes. To present smoking outcomes from an Internet randomized trial of two strategies to encourage adherence to tobacco dependence treatment components: (i) a social network (SN) strategy to integrate smokers into an online community and (ii) free nicotine replacement therapy (NRT). In addition to intent-to-treat analyses, we used novel statistical methods to distinguish the impact of treatment assignment from treatment utilization. A total of 5,290 current smokers on a cessation website (WEB) were randomized to WEB, WEB + SN, WEB + NRT, or WEB + SN + NRT. The main outcome was 30-day point prevalence abstinence at 3 and 9 months post-randomization. Adherence measures included self-reported medication use (meds) and website metrics of skills training (sk) and community use (comm). Inverse Probability of Retention Weighting and Inverse Probability of Treatment Weighting jointly addressed dropout and treatment selection. Propensity weights were used to calculate Average Treatment effects on the Treated. Treatment assignment analyses showed no effects on abstinence for either adherence strategy. Abstinence rates were 25.7%-32.2% among participants who used all three treatment components (sk+comm+meds). Treatment utilization analyses revealed that among such participants, sk+comm+meds yielded large percentage-point increases in 3-month abstinence rates over sk alone across arms: WEB = 20.6 (95% CI = 10.8, 30.4), WEB + SN = 19.2 (95% CI = 11.1, 27.3), WEB + NRT = 13.1 (95% CI = 4.1, 22.0), and WEB + SN + NRT = 20.0 (95% CI = 12.2, 27.7). Novel propensity weighting approaches can serve as a model for establishing efficacy of Internet interventions and yield important insights about mechanisms. NCT01544153.

  4. Randomized comparison of operator radiation exposure comparing transradial and transfemoral approach for percutaneous coronary procedures: rationale and design of the minimizing adverse haemorrhagic events by TRansradial access site and systemic implementation of angioX - RAdiation Dose study (RAD-MATRIX).

    PubMed

    Sciahbasi, Alessandro; Calabrò, Paolo; Sarandrea, Alessandro; Rigattieri, Stefano; Tomassini, Francesco; Sardella, Gennaro; Zavalloni, Dennis; Cortese, Bernardo; Limbruno, Ugo; Tebaldi, Matteo; Gagnor, Andrea; Rubartelli, Paolo; Zingarelli, Antonio; Valgimigli, Marco

    2014-06-01

    Radiation absorbed by interventional cardiologists is an important but frequently under-evaluated issue. Our aim is to compare the radiation dose absorbed by interventional cardiologists during percutaneous coronary procedures for acute coronary syndromes performed via transradial versus transfemoral access. The randomized multicentre MATRIX (Minimizing Adverse Haemorrhagic Events by TRansradial Access Site and Systemic Implementation of angioX) trial has been designed to compare the clinical outcome of patients with acute coronary syndromes treated invasively according to the access site (transfemoral vs. transradial) and the anticoagulant therapy (bivalirudin vs. heparin). Selected experienced interventional cardiologists involved in this study have been equipped with dedicated thermoluminescent dosimeters to evaluate the radiation dose absorbed during transfemoral, right transradial, or left transradial access. For each access site we evaluate the radiation dose absorbed at wrist, thorax, and eye level. Each operator is therefore equipped with three sets (transfemoral, right transradial, or left transradial access) of three different dosimeters (wrist, thorax, and eye). The primary end-point of the study is the procedural radiation dose absorbed by operators at the thorax. An important secondary end-point is the procedural radiation dose absorbed by operators, comparing the right and left radial approaches. Patient randomization is performed according to the MATRIX protocol for the femoral or radial approach. A further randomization for the radial approach is performed to compare right and left transradial access. The RAD-MATRIX study should help clarify the radiation issue for interventional cardiologists, comparing transradial and transfemoral access in the setting of acute coronary syndromes. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. A penalized quantitative structure-property relationship study on melting point of energetic carbocyclic nitroaromatic compounds using adaptive bridge penalty.

    PubMed

    Al-Fakih, A M; Algamal, Z Y; Lee, M H; Aziz, M

    2018-05-01

    A penalized quantitative structure-property relationship (QSPR) model with adaptive bridge penalty for predicting the melting points of 92 energetic carbocyclic nitroaromatic compounds is proposed. To ensure consistent descriptor selection by the proposed penalized adaptive bridge (PBridge), we propose a ridge estimator ([Formula: see text]) as the initial weight in the adaptive bridge penalty. The Bayesian information criterion was applied to ensure the accurate selection of the tuning parameter ([Formula: see text]). The PBridge-based model was internally and externally validated based on [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], the Y-randomization test, [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text] and the applicability domain. The validation results indicate that the model is robust and not due to chance correlation. In both descriptor selection and prediction performance on the training dataset, PBridge outperforms the other methods used. PBridge shows the highest [Formula: see text] of 0.959, [Formula: see text] of 0.953, [Formula: see text] of 0.949 and [Formula: see text] of 0.959, and the lowest [Formula: see text] and [Formula: see text]. For the test dataset, PBridge shows a higher [Formula: see text] of 0.945 and [Formula: see text] of 0.948, and a lower [Formula: see text] and [Formula: see text], indicating its better prediction performance. The results clearly reveal that the proposed PBridge is useful for constructing reliable and robust QSPRs for predicting melting points prior to synthesizing new organic compounds.

  6. Genome-wide association study for backfat thickness in Canchim beef cattle using Random Forest approach

    PubMed Central

    2013-01-01

    Background Meat quality involves many traits, such as marbling, tenderness, juiciness, and backfat thickness, all of which require attention from livestock producers. Backfat thickness improvement by means of traditional selection techniques in Canchim beef cattle has been challenging due to its low heritability and because it is measured late in an animal’s life. Therefore, the implementation of new methodologies for identification of single nucleotide polymorphisms (SNPs) linked to backfat thickness is an important strategy for genetic improvement of carcass and meat quality. Results The set of SNPs identified by the random forest approach explained as much as 50% of the deregressed estimated breeding value (dEBV) variance associated with backfat thickness, and a small set of 5 SNPs were able to explain 34% of the dEBV for backfat thickness. Several quantitative trait loci (QTL) for fat-related traits were found in the surrounding areas of the SNPs, as well as many genes with roles in lipid metabolism. Conclusions These results provided a better understanding of the backfat deposition and regulation pathways, and can be considered a starting point for future implementation of a genomic selection program for backfat thickness in Canchim beef cattle. PMID:23738659
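
    As a rough illustration of SNP screening with a random forest (not the authors' exact pipeline), the sketch below ranks simulated 0/1/2-coded genotypes by impurity-based importance against a dEBV response using scikit-learn; all data and parameter values are synthetic.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # synthetic genotypes coded 0/1/2 (animals x SNPs) and a dEBV response
    # driven by the first five SNPs plus noise; illustrative only
    genotypes = rng.integers(0, 3, size=(500, 2000)).astype(float)
    debv = genotypes[:, :5] @ rng.normal(size=5) + rng.normal(size=500)

    forest = RandomForestRegressor(n_estimators=500, random_state=0, n_jobs=-1)
    forest.fit(genotypes, debv)

    # rank SNPs by importance; a small top panel mirrors the paper's finding
    # that few SNPs can explain a large share of the dEBV variance
    top_snps = np.argsort(forest.feature_importances_)[::-1][:5]
    print("candidate SNP columns:", top_snps)
    ```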

  7. Nest-site selection and nest success of an Arctic-breeding passerine, Smith's Longspur, in a changing climate

    USGS Publications Warehouse

    McFarland, Heather R.; Kendall, Steve J.; Powell, Abby

    2017-01-01

    Despite changes in shrub cover and weather patterns associated with climate change in the Arctic, little is known about the breeding requirements of most passerines tied to northern regions. We investigated the nesting biology and nest habitat characteristics of Smith's Longspurs (Calcarius pictus) in 2 study areas in the Brooks Range of Alaska, USA. First, we examined variation in nesting phenology in relation to local temperatures. We then characterized nesting habitat and analyzed nest-site selection for a subset of nests (n = 86) in comparison with paired random points. Finally, we estimated the daily survival rate of 257 nests found in 2007–2013 with respect to both habitat characteristics and weather variables. Nest initiation was delayed in years with snow events, heavy rain, and freezing temperatures early in the breeding season. Nests were typically found in open, low-shrub tundra, and never among tall shrubs (mean shrub height at nests = 26.8 ± 6.7 cm). We observed weak nest-site selection patterns. Considering the similarity between nest sites and paired random points, coupled with the unique social mating system of Smith's Longspurs, we suggest that habitat selection may occur at the neighborhood scale and not at the nest-site scale. The best approximating model explaining nest survival suggested a positive relationship with the numbers of days above 21°C that an individual nest experienced; there was little support for models containing habitat variables. The daily nest survival rate was high (0.972–0.982) compared with that of most passerines in forested or grassland habitats, but similar to that of passerines nesting on tundra. Considering their high nesting success and ability to delay nest initiation during inclement weather, Smith's Longspurs may be resilient to predicted changes in weather regimes on the breeding grounds. Thus, the greatest threat to breeding Smith's Longspurs associated with climate change may be the loss of low-shrub habitat types, which could significantly change the characteristics of breeding areas.

  8. Risk of predation and weather events affect nest site selection by sympatric Pacific (Gavia pacifica) and Yellow-billed (Gavia adamsii) loons in Arctic habitats

    USGS Publications Warehouse

    Haynes, Trevor B.; Schmutz, Joel A.; Lindberg, Mark S.; Rosenberger, Amanda E.

    2014-01-01

    Pacific (Gavia pacifica) and Yellow-billed (G. adamsii) loons nest sympatrically in Arctic regions. These related species likely face similar constraints and requirements for nesting success; therefore, use of similar habitats and direct competition for nesting habitat is likely. Both of these loon species must select a breeding lake that provides suitable habitat for nesting and raising chicks; however, characteristics of nest site selection by either species on interior Arctic lakes remain poorly understood. Here, logistic regression was used to compare structural and habitat characteristics of all loon nest locations with random points from lakes on the interior Arctic Coastal Plain, Alaska. Results suggest that both loon species select nest sites to avoid predation and exposure to waves and shifting ice. Loon nest sites were more likely to be on islands and peninsulas (odds ratio = 16.13, 95% CI = 4.64–56.16) than mainland shoreline, which may help loons avoid terrestrial predators. Further, nest sites had a higher degree of visibility (mean degrees of visibility to 100 and 200 m) of approaching predators than random points (odds ratio = 2.57, 95% CI = 1.22–5.39). Nests were sheltered from exposure, having lower odds of being exposed to prevailing winds (odds ratio = 0.34, 95% CI = 0.13–0.92) and lower odds of having high fetch values (odds ratio = 0.46, 95% CI = 0.22–0.96). Differences between Pacific and Yellow-billed loon nesting sites were subtle, suggesting that both species have similar general nest site requirements. However, Yellow-billed Loons nested at slightly higher elevations and were more likely to nest on peninsulas than Pacific Loons. Pacific Loons constructed built-up nests from mud and vegetation, potentially in response to limited access to suitable shoreline due to other territorial loons. Results suggest that land managers wishing to protect habitats for these species should focus on lakes with islands as well as shorelines sheltered from exposure to prevailing wind and ice patterns.
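
    For readers who want the mechanics behind odds ratios like those quoted above, the sketch below fits a nest-versus-random-point logistic regression on synthetic data with statsmodels and exponentiates the coefficients; the predictors and effect sizes are invented for illustration.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 400
    # synthetic predictors: island/peninsula indicator and a visibility score
    island = rng.binomial(1, 0.3, n).astype(float)
    visibility = rng.normal(0.0, 1.0, n)
    logit = -0.5 + 2.0 * island + 0.9 * visibility
    is_nest = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = nest, 0 = random point

    X = sm.add_constant(np.column_stack([island, visibility]))
    fit = sm.Logit(is_nest, X).fit(disp=0)
    odds_ratios = np.exp(fit.params)   # e.g. OR for 'island' = exp(beta_island)
    or_ci = np.exp(fit.conf_int())     # 95% CIs on the odds-ratio scale
    print(odds_ratios, or_ci)
    ```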

  9. Nest-site selection and hatching success of waterbirds in coastal Virginia: some results of habitat manipulation

    USGS Publications Warehouse

    Rounds, R.A.; Erwin, R.M.; Portera, J.H.

    2004-01-01

    Rising sea levels in the mid-Atlantic region pose a long-term threat to marshes and their avian inhabitants. The Gull-billed Tern (Sterna nilotica), Common Tern (S. hirundo), Black Skimmer (Rynchops niger), and American Oystercatcher (Haematopus palliatus), species of concern in Virginia, nest on low shelly perimeters of salt marsh islands on the Eastern Shore of Virginia. Marsh shellpiles are free of mammalian predators, but subject to frequent floods that reduce reproductive success. In an attempt to examine nest-site selection, enhance habitat, and improve hatching success, small (2 × 2 m) plots on five island shellpiles were experimentally elevated, and nest-site selection and hatching success were monitored from 1 May to 1 August, 2002. In addition, location, elevation, and nesting performance of all other nests in the colonies were also monitored. No species selected the elevated experimental plots preferentially over adjacent control plots at any of the sites. When all nests were considered, Common Tern nests were located significantly lower than were random point elevations at two sites, as they tended to concentrate on low-lying wrack. At two other sites, however, Common Tern nests were significantly higher than were random points. Gull-billed Terns and American Oystercatchers showed a weak preference for higher elevations on bare shell at most sites. Hatching success was not improved on elevated plots, despite the protection they provided from flooding. Because of a 7 June flood, when 47% of all nests flooded, hatching success for all species was low. Nest elevation had the strongest impact on a nest's probability of hatching, followed by nest-initiation date. Predation rates were high at small colonies, and Ruddy Turnstones (Arenaria interpres) depredated 90% of early Gull-billed Tern nests at one shellpile. The importance of nest elevation and flooding on hatching success demonstrates the potential for management of certain waterbird nesting sites. Facing threats from predators on barrier islands and rising sea levels especially in the mid-Atlantic region, several species of nesting waterbirds may benefit dramatically with modest manipulation of even small habitat patches on isolated marsh islands.

  10. Urban tree cover change in Detroit and Atlanta, USA, 1951-2010

    Treesearch

    Krista Merry; Jacek Siry; Pete Bettinger; J.M. Bowker

    2014-01-01

    We assessed tree cover using random points and polygons distributed within the administrative boundaries of Detroit, MI and Atlanta, GA. Two approaches were tested: a point-based approach using 1,000 randomly located sample points, and a polygon-based approach using 250 circular areas, each 200 m in radius (12.56 ha). In the case of Atlanta, both approaches arrived at similar...

  11. A Cautious Note on Auxiliary Variables That Can Increase Bias in Missing Data Problems.

    PubMed

    Thoemmes, Felix; Rose, Norman

    2014-01-01

    The treatment of missing data in the social sciences has changed tremendously during the last decade. Modern missing data techniques such as multiple imputation and full-information maximum likelihood are used much more frequently. These methods assume that data are missing at random. One very common approach to increase the likelihood that the missing-at-random assumption holds is to include many covariates as so-called auxiliary variables. These variables are either included based on data considerations or in an inclusive fashion; that is, taking all available auxiliary variables. In this article, we point out that there are some instances in which auxiliary variables exhibit the surprising property of increasing bias in missing data problems. In a series of focused simulation studies, we highlight some situations in which this type of biasing behavior can occur. We briefly discuss possible ways to avoid selecting bias-inducing covariates as auxiliary variables.

  12. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    PubMed

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
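
    The sampling geometry that RandomSpot implements (an equidistant grid with a random start) is straightforward to reproduce. The sketch below is a generic NumPy version for a rectangular region of interest, not RandomSpot's actual code or API.

    ```python
    import numpy as np

    def srs_points(x0, y0, x1, y1, spacing, rng=None):
        # Systematic random sampling: a grid of equidistant points with a
        # single uniform-random offset, clipped to a rectangular ROI.
        rng = rng or np.random.default_rng()
        off_x = rng.uniform(0, spacing)
        off_y = rng.uniform(0, spacing)
        xs = np.arange(x0 + off_x, x1, spacing)
        ys = np.arange(y0 + off_y, y1, spacing)
        return [(x, y) for y in ys for x in xs]

    # e.g. sample points every 150 units across a 1000 x 800 region
    pts = srs_points(0, 0, 1000, 800, spacing=150)
    ```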

  13. Meta-analysis: exercise therapy for nonspecific low back pain.

    PubMed

    Hayden, Jill A; van Tulder, Maurits W; Malmivaara, Antti V; Koes, Bart W

    2005-05-03

    Exercise therapy is widely used as an intervention in low back pain. To evaluate the effectiveness of exercise therapy in adult nonspecific acute, subacute, and chronic low back pain versus no treatment and other conservative treatments. MEDLINE, EMBASE, PsychInfo, CINAHL, and Cochrane Library databases to October 2004; citation searches and bibliographic reviews of previous systematic reviews. Randomized, controlled trials evaluating exercise therapy for adult nonspecific low back pain and measuring pain, function, return to work or absenteeism, and global improvement outcomes. Two reviewers independently selected studies and extracted data on study characteristics, quality, and outcomes at short-, intermediate-, and long-term follow-up. 61 randomized, controlled trials (6390 participants) met inclusion criteria: acute (11 trials), subacute (6 trials), and chronic (43 trials) low back pain (1 trial was unclear). Evidence suggests that exercise therapy is effective in chronic back pain relative to comparisons at all follow-up periods. Pooled mean improvement (on a 100-point scale) was 7.3 points (95% CI, 3.7 to 10.9 points) for pain and 2.5 points (CI, 1.0 to 3.9 points) for function at earliest follow-up. In studies investigating patients (people seeking care for back pain), mean improvement was 13.3 points (CI, 5.5 to 21.1 points) for pain and 6.9 points (CI, 2.2 to 11.7 points) for function, compared with studies where some participants had been recruited from a general population (for example, with advertisements). Some evidence suggests effectiveness of a graded-activity exercise program in subacute low back pain in occupational settings, although the evidence for other types of exercise therapy in other populations is inconsistent. In acute low back pain, exercise therapy and other programs were equally effective (pain, 0.03 point [CI, -1.3 to 1.4 points]). Limitations of the literature include low-quality studies, heterogeneous outcome measures, inconsistent and poor reporting, and the possibility of publication bias. Exercise therapy seems to be slightly effective at decreasing pain and improving function in adults with chronic low back pain, particularly in health care populations. In subacute low back pain populations, some evidence suggests that a graded-activity program improves absenteeism outcomes, although evidence for other types of exercise is unclear. In acute low back pain populations, exercise therapy is as effective as either no treatment or other conservative treatments.

  14. Analysis of a Spatial Point Pattern: Examining the Damage to Pavement and Pipes in Santa Clara Valley Resulting from the Loma Prieta Earthquake

    USGS Publications Warehouse

    Phelps, G.A.

    2008-01-01

    This report describes some simple spatial statistical methods to explore the relationships of scattered points to geologic or other features, represented by points, lines, or areas. It also describes statistical methods to search for linear trends and clustered patterns within the scattered point data. Scattered points are often contained within irregularly shaped study areas, necessitating the use of methods largely unexplored in the point pattern literature. The methods take advantage of the power of modern GIS toolkits to numerically approximate the null hypothesis of randomly located data within an irregular study area. Observed distributions can then be compared with the null distribution of a set of randomly located points. The methods are non-parametric and are applicable to irregularly shaped study areas. Patterns within the point data are examined by comparing the distribution of the orientation of the set of vectors defined by each pair of points within the data with the equivalent distribution for a random set of points within the study area. A simple model is proposed to describe linear or clustered structure within scattered data. A scattered data set of damage to pavement and pipes, recorded after the 1989 Loma Prieta earthquake, is used as an example to demonstrate the analytical techniques. The damage is found to be preferentially located nearer a set of mapped lineaments than randomly scattered damage, suggesting range-front faulting along the base of the Santa Cruz Mountains is related to both the earthquake damage and the mapped lineaments. The damage also exhibits two non-random patterns: a single cluster of damage centered in the town of Los Gatos, California, and a linear alignment of damage along the range front of the Santa Cruz Mountains, California. The linear alignment of damage is strongest between 45° and 50° northwest. This agrees well with the mean trend of the mapped lineaments, measured as 49° northwest.
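
    A minimal version of the report's null-hypothesis machinery can be built with NumPy and Shapely (an assumed dependency here): uniform points are rejection-sampled inside an irregular polygon, and the mean distance to a mapped feature is tabulated to form the null distribution against which the observed points are compared. The geometry and sample sizes below are invented.

    ```python
    import numpy as np
    from shapely.geometry import Point, Polygon, LineString  # assumed dependency

    def random_points_in(poly, n, rng):
        # rejection-sample n uniform points inside an irregular study area
        minx, miny, maxx, maxy = poly.bounds
        pts = []
        while len(pts) < n:
            p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
            if poly.contains(p):
                pts.append(p)
        return pts

    def mc_null_distances(poly, feature, n, trials, rng):
        # null distribution of the mean point-to-feature distance under
        # complete spatial randomness within the study area
        return np.array([np.mean([p.distance(feature)
                                  for p in random_points_in(poly, n, rng)])
                         for _ in range(trials)])

    rng = np.random.default_rng(1)
    study_area = Polygon([(0, 0), (10, 0), (12, 6), (4, 9), (-1, 5)])
    lineament = LineString([(0, 2), (11, 5)])
    null = mc_null_distances(study_area, lineament, n=50, trials=999, rng=rng)
    # compare the observed mean damage-to-lineament distance against `null`
    ```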

  15. Modeling and Simulation of a Novel Relay Node Based Secure Routing Protocol Using Multiple Mobile Sink for Wireless Sensor Networks.

    PubMed

    Perumal, Madhumathy; Dhandapani, Sivakumar

    2015-01-01

    Data gathering and optimal path selection for wireless sensor networks (WSN) using existing protocols result in collisions, and increased collisions raise the probability of packet drop. There is therefore a need to eliminate collisions during data aggregation, and to increase efficiency while maintaining security. This paper proposes a reliable, energy-efficient, and secure WSN routing protocol with minimum delay, named the relay node based secure routing protocol for multiple mobile sink (RSRPMS). This protocol finds the rendezvous point for optimal transmission of data using a "splitting tree" technique in a tree-shaped network topology, and the "Biased Random Walk" model is then used to determine all subsequent positions of a sink. In case of an event, the sink gathers the data from all sources when they are in the sensing range of the rendezvous point; otherwise, a relay node is selected from its neighbors to transfer packets from the rendezvous point to the sink. Symmetric key cryptography is used for secure transmission. The proposed RSRPMS protocol is evaluated through simulation, and the results are compared with the Intelligent Agent-Based Routing (IAR) protocol, showing an increase in network lifetime compared with other routing protocols.
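
    The "Biased Random Walk" used to model successive sink positions can be illustrated with a toy two-dimensional version. This sketch captures only the mobility model named in the abstract, not the RSRPMS protocol itself, and the bias parameter is invented.

    ```python
    import numpy as np

    def biased_random_walk(start, goal, steps, bias=0.7, seed=0):
        # with probability `bias` the sink steps toward the goal (e.g. the
        # current rendezvous point); otherwise it steps in a uniformly
        # random direction
        rng = np.random.default_rng(seed)
        pos = np.asarray(start, dtype=float)
        goal = np.asarray(goal, dtype=float)
        path = [pos.copy()]
        for _ in range(steps):
            if rng.random() < bias:
                step = goal - pos
                step = step / (np.linalg.norm(step) + 1e-9)
            else:
                ang = rng.uniform(0.0, 2.0 * np.pi)
                step = np.array([np.cos(ang), np.sin(ang)])
            pos += step
            path.append(pos.copy())
        return np.array(path)

    trajectory = biased_random_walk(start=(0, 0), goal=(20, 15), steps=50)
    ```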

  16. Factors Associated With Time to Site Activation, Randomization, and Enrollment Performance in a Stroke Prevention Trial.

    PubMed

    Demaerschalk, Bart M; Brown, Robert D; Roubin, Gary S; Howard, Virginia J; Cesko, Eldina; Barrett, Kevin M; Longbottom, Mary E; Voeks, Jenifer H; Chaturvedi, Seemant; Brott, Thomas G; Lal, Brajesh K; Meschia, James F; Howard, George

    2017-09-01

    Multicenter clinical trials attempt to select sites that can move rapidly to randomization and enroll sufficient numbers of patients. However, there are few assessments of the success of site selection. In the CREST-2 (Carotid Revascularization and Medical Management for Asymptomatic Carotid Stenosis Trials), we assess factors associated with the time between site selection and authorization to randomize, the time between authorization to randomize and the first randomization, and the average number of randomizations per site per month. Potential factors included characteristics of the site, specialty of the principal investigator, and site type. For 147 sites, the median time between site selection to authorization to randomize was 9.9 months (interquartile range, 7.7, 12.4), and factors associated with early site activation were not identified. The median time between authorization to randomize and a randomization was 4.6 months (interquartile range, 2.6, 10.5). Sites with authorization to randomize in only the carotid endarterectomy study were slower to randomize, and other factors examined were not significantly associated with time-to-randomization. The recruitment rate was 0.26 (95% confidence interval, 0.23-0.28) patients per site per month. By univariate analysis, factors associated with faster recruitment were authorization to randomize in both trials, principal investigator specialties of interventional radiology and cardiology, pre-trial reported performance >50 carotid angioplasty and stenting procedures per year, status in the top half of recruitment in the CREST trial, and classification as a private health facility. Participation in StrokeNet was associated with slower recruitment as compared with the non-StrokeNet sites. Overall, selection of sites with high enrollment rates will likely require customization to align the sites selected to the factor under study in the trial. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02089217. © 2017 American Heart Association, Inc.

  17. Impact of heterogeneous activity and community structure on the evolutionary success of cooperators in social networks

    NASA Astrophysics Data System (ADS)

    Wu, Zhi-Xi; Rong, Zhihai; Yang, Han-Xin

    2015-01-01

    Recent empirical studies suggest that heavy-tailed distributions of human activities are universal in real social dynamics [L. Muchnik, S. Pei, L. C. Parra, S. D. S. Reis, J. S. Andrade Jr., S. Havlin, and H. A. Makse, Sci. Rep. 3, 1783 (2013), 10.1038/srep01783]. On the other hand, community structure is ubiquitous in biological and social networks [M. E. J. Newman, Nat. Phys. 8, 25 (2012), 10.1038/nphys2162]. Motivated by these facts, we here consider the evolutionary prisoner's dilemma game taking place on top of a real social network to investigate how the community structure and the heterogeneity in activity of individuals affect the evolution of cooperation. In particular, we account for a variation of the birth-death process (which can also be regarded as a proportional imitation rule from a social point of view) for the strategy updating under both weak and strong selection (meaning the payoffs harvested from games contribute either slightly or heavily to the individuals' performance). By implementing comparative studies, where the players are selected either randomly or in terms of their actual activities to play games with their immediate neighbors, we find that heterogeneous activity benefits the emergence of collective cooperation in a harsh environment (the action for cooperation is costly) under strong selection, whereas it impairs the formation of altruism under weak selection. Moreover, we find that the abundance of communities in the social network can evidently foster the formation of cooperation under strong selection, in contrast to the games evolving on randomized counterparts. Our results therefore help us better understand the evolution of cooperation in real social systems.

  18. Survival of the most transferable at the top of Jacob's ladder: Defining and testing the ωB97M(2) double hybrid density functional

    NASA Astrophysics Data System (ADS)

    Mardirossian, Narbe; Head-Gordon, Martin

    2018-06-01

    A meta-generalized gradient approximation, range-separated double hybrid (DH) density functional with VV10 non-local correlation is presented. The final 14-parameter functional form is determined by screening trillions of candidate fits through a combination of best subset selection, forward stepwise selection, and random sample consensus (RANSAC) outlier detection. The MGCDB84 database of 4986 data points is employed in this work, containing a training set of 870 data points, a validation set of 2964 data points, and a test set of 1152 data points. Following an xDH approach, orbitals from the ωB97M-V density functional are used to compute the second-order perturbation theory correction. The resulting functional, ωB97M(2), is benchmarked against a variety of leading double hybrid density functionals, including B2PLYP-D3(BJ), B2GPPLYP-D3(BJ), ωB97X-2(TQZ), XYG3, PTPSS-D3(0), XYGJ-OS, DSD-PBEP86-D3(BJ), and DSD-PBEPBE-D3(BJ). Encouragingly, the overall performance of ωB97M(2) on nearly 5000 data points clearly surpasses that of all of the tested density functionals. As a Rung 5 density functional, ωB97M(2) completes our family of combinatorially optimized functionals, complementing B97M-V on Rung 3, and ωB97X-V and ωB97M-V on Rung 4. The results suggest that ωB97M(2) has the potential to serve as a powerful predictive tool for accurate and efficient electronic structure calculations of main-group chemistry.

  19. Vendor compliance with Ontario's tobacco point of sale legislation.

    PubMed

    Dubray, Jolene M; Schwartz, Robert M; Garcia, John M; Bondy, Susan J; Victor, J Charles

    2009-01-01

    On May 31, 2006, Ontario joined a small group of international jurisdictions to implement legislative restrictions on tobacco point of sale promotions. This study compares the presence of point of sale promotions in the retail tobacco environment from three surveys: one prior to and two following implementation of the legislation. Approximately 1,575 tobacco vendors were randomly selected for each survey. Each regionally-stratified sample included equal numbers of tobacco vendors categorized into four trade classes: chain convenience, independent convenience and discount, gas stations, and grocery. Data regarding the six restricted point of sale promotions were collected using standardized protocols and inspection forms. Weighted estimates and 95% confidence intervals were produced at the provincial, regional and vendor trade class level using the bootstrap method for estimating variance. At baseline, the proportion of tobacco vendors who did not engage in each of the six restricted point of sale promotions ranged from 41% to 88%. Within four months following implementation of the legislation, compliance with each of the six restricted point of sale promotions exceeded 95%. Similar levels of compliance were observed one year later. Grocery stores had the fewest point of sale promotions displayed at baseline. Compliance rates did not differ across vendor trade classes at either follow-up survey. Point of sale promotions did not differ across regions in any of the three surveys. Within a short period of time, a high level of compliance with six restricted point of sale promotions was achieved.
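
    For the bootstrap variance estimation named above, a percentile bootstrap for a weighted compliance proportion looks roughly like the following; the inputs are synthetic and stratum handling is omitted for brevity.

    ```python
    import numpy as np

    def bootstrap_ci(compliant, weights, reps=2000, alpha=0.05, seed=0):
        # percentile bootstrap confidence interval for a weighted proportion
        rng = np.random.default_rng(seed)
        n = len(compliant)
        stats = np.empty(reps)
        for b in range(reps):
            idx = rng.integers(0, n, n)  # resample vendors with replacement
            w = weights[idx]
            stats[b] = np.sum(w * compliant[idx]) / np.sum(w)
        return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

    rng = np.random.default_rng(1)
    compliant = rng.binomial(1, 0.95, 1575).astype(float)  # compliance flags (synthetic)
    weights = rng.uniform(0.5, 2.0, 1575)                  # survey weights (synthetic)
    print(bootstrap_ci(compliant, weights))
    ```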

  20. Evaluation of some random effects methodology applicable to bird ringing data

    USGS Publications Warehouse

    Burnham, K.P.; White, Gary C.

    2002-01-01

    Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S_1, ..., S_k; random effects can then be a useful model: S_i = E(S) + ε_i. Here, the temporal variation in survival probability is treated as random with variance E(ε_i²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional sampling component var(Ŝ|S). Furthermore, the random effects model leads to shrinkage estimates, S̃_i, as improved (in mean square error) estimators of S_i compared to the MLE, Ŝ_i, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃_i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about S_i based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: S_i ≡ S (no effects), S_i = E(S) + ε_i (random effects), and S_1, ..., S_k (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed-effects MLEs for the S_i.
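
    In compact notation, the model and the shrinkage estimator take the following form; the shrinkage factor is written here in the usual variance-components form, and the exact estimator implemented in program MARK may differ in detail.

    ```latex
    % Random-effects model for annual survival and a variance-components
    % shrinkage estimator (standard form; program MARK's estimator may differ).
    \begin{align}
      S_i &= E(S) + \epsilon_i, \qquad E(\epsilon_i) = 0, \quad
             \operatorname{var}(\epsilon_i) = \sigma^2, \\
      \tilde{S}_i &= \hat{E}(S)
        + \frac{\hat{\sigma}^2}{\hat{\sigma}^2
            + \widehat{\operatorname{var}}\!\left(\hat{S}_i \mid S_i\right)}
          \left( \hat{S}_i - \hat{E}(S) \right).
    \end{align}
    ```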

  1. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power than an analysis of a similarly-structured RCT, and that the difference in statistical power increased as the strength of the treatment-selection process increased.
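
    A bare-bones version of the IPTW weight construction, the first stage of such an analysis, is sketched below with scikit-learn; the data are synthetic, and in a real study the weights would then enter a weighted time-to-event model (for example, a weighted Cox regression).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 4))               # measured confounders
    logit = X @ np.array([0.5, -0.3, 0.2, 0.4])  # treatment-selection process
    treated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # propensity score: probability of treatment given confounders
    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    # ATE-type IPTW weights: 1/ps for treated, 1/(1-ps) for controls
    weights = np.where(treated == 1, 1 / ps, 1 / (1 - ps))
    ```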

  2. Phase IIb, Randomized, Double-Blind, Placebo-Controlled Study of Naldemedine for the Treatment of Opioid-Induced Constipation in Patients With Cancer.

    PubMed

    Katakami, Nobuyuki; Oda, Koji; Tauchi, Katsunori; Nakata, Ken; Shinozaki, Katsunori; Yokota, Takaaki; Suzuki, Yura; Narabayashi, Masaru; Boku, Narikazu

    2017-06-10

    Purpose This randomized, double-blind, multicenter study aimed to determine the dose of naldemedine, a peripherally-acting μ-opioid receptor antagonist, for future trials by comparing the efficacy and safety of three doses of naldemedine versus placebo in patients with cancer and opioid-induced constipation. Methods Patients ≥ 18 years old with cancer, an Eastern Cooperative Oncology Group performance status ≤ 2, who had been receiving a stable regimen of opioid analgesics for ≥ 2 weeks, had at least one constipation symptom despite laxative use, and no more than five spontaneous bowel movements (SBMs) during the past 14 days, were randomly assigned (1:1:1:1) to oral, once-daily naldemedine 0.1, 0.2, or 0.4 mg, or placebo, for 14 days. The primary end point was change in SBM frequency per week from baseline during the treatment period. Secondary end points included SBM responder rates, change from baseline in the frequency of SBM without straining, and complete SBM. Safety was also assessed. Results Of 227 patients who were randomly assigned, 225 were assessed for efficacy (naldemedine 0.1 mg, n = 55; 0.2 mg, n = 58; 0.4 mg, n = 56; placebo, n = 56) and 226 for safety. Change in SBM frequency (primary end point) was higher with all naldemedine doses versus placebo (P < .05 for all comparisons), as were SBM responder rates and change in complete SBM frequency. Change in SBM frequency without straining was significantly improved with naldemedine 0.2 and 0.4 (but not 0.1) mg versus placebo (at least P < .05). Treatment-emergent adverse events were more common with naldemedine (0.1 mg: 66.1%; 0.2 mg: 67.2%; 0.4 mg: 78.6%) than placebo (51.8%); the most common treatment-emergent adverse event was diarrhea. Conclusion Fourteen-day treatment with naldemedine significantly improved opioid-induced constipation in patients with cancer and was generally well tolerated. Naldemedine 0.2 mg was selected for phase III studies.

  3. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  4. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    ERIC Educational Resources Information Center

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance (r_n) between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
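
    The quantity r_n is easy to estimate numerically. The Monte Carlo sketch below, written for this summary rather than taken from the article, averages the distance from a reference point to its nth neighbour among uniform points in the unit D-cube, ignoring edge effects for simplicity.

    ```python
    import numpy as np

    def mean_nth_neighbour(n_points=2000, D=2, n=1, trials=200, seed=0):
        # Monte Carlo estimate of the mean distance from a reference point
        # (the first point) to its nth neighbour, for uniform points in the
        # unit D-cube (edge effects ignored for simplicity)
        rng = np.random.default_rng(seed)
        dists = []
        for _ in range(trials):
            pts = rng.random((n_points, D))
            d = np.linalg.norm(pts[1:] - pts[0], axis=1)
            dists.append(np.sort(d)[n - 1])
        return np.mean(dists)

    # with density rho = n_points per unit volume, r_n scales as rho**(-1/D)
    print(mean_nth_neighbour(n=1), mean_nth_neighbour(n=5))
    ```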

  5. Clustering of galaxies near damped Lyman-alpha systems with (z) = 2.6

    NASA Technical Reports Server (NTRS)

    Wolfe, A. M.

    1993-01-01

    The galaxy two-point correlation function, xi, at (z) = 2.6 is determined by comparing the number of Ly-alpha-emitting galaxies in narrowband CCD fields selected for the presence of damped Ly-alpha absorption to their number in randomly selected control fields. Comparisons between the presented determination of (xi), a density-weighted volume average of xi, and model predictions for (xi) at large redshifts show that models in which the clustering pattern is fixed in proper coordinates are highly unlikely, while better agreement is obtained if the clustering pattern is fixed in comoving coordinates. Therefore, clustering of Ly-alpha-emitting galaxies around damped Ly-alpha systems at large redshifts is strong. It is concluded that the faint blue galaxies are drawn from a parent population different from normal galaxies, the presumed offspring of damped Ly-alpha systems.

  6. Tumor progression: chance and necessity in Darwinian and Lamarckian somatic (mutationless) evolution.

    PubMed

    Huang, Sui

    2012-09-01

    Current investigation of cancer progression towards increasing malignancy focuses on the molecular pathways that produce the various cancerous traits of cells. Their acquisition is explained by the somatic mutation theory: tumor progression is the result of a neo-Darwinian evolution in the tissue. Herein cells are the units of selection. Random genetic mutations permanently affecting these pathways create malignant cell phenotypes that are selected for in the disturbed tissue. However, could it be that the capacity of the genome and its gene regulatory network to generate the vast diversity of cell types during development, i.e., to produce inheritable phenotypic changes without mutations, is harnessed by tumorigenesis to propel a directional change towards malignancy? Here we take an encompassing perspective, transcending the orthodoxy of molecular carcinogenesis and review mechanisms of somatic evolution beyond the Neo-Darwinian scheme. We discuss the central concept of "cancer attractors" - the hidden stable states of gene regulatory networks normally not occupied by cells. Noise-induced transitions into such attractors provide a source for randomness (chance) and regulatory constraints (necessity) in the acquisition of novel expression profiles that can be inherited across cell divisions, and hence, can be selected for. But attractors can also be reached in response to environmental signals - thus offering the possibility for inheriting acquired traits that can also be selected for. Therefore, we face the possibility of non-genetic (mutation-independent) equivalents to both Darwinian and Lamarckian evolution which may jointly explain the arrow of change pointing toward increasing malignancy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. 3D statistical shape models incorporating 3D random forest regression voting for robust CT liver segmentation

    NASA Astrophysics Data System (ADS)

    Norajitra, Tobias; Meinzer, Hans-Peter; Maier-Hein, Klaus H.

    2015-03-01

    During image segmentation, 3D Statistical Shape Models (SSM) usually conduct a limited search for target landmarks within one-dimensional search profiles perpendicular to the model surface. In addition, landmark appearance is modeled only locally based on linear profiles and weak learners, altogether leading to segmentation errors from landmark ambiguities and limited search coverage. We present a new method for 3D SSM segmentation based on 3D Random Forest Regression Voting. For each surface landmark, a Random Regression Forest is trained that learns a 3D spatial displacement function between the according reference landmark and a set of surrounding sample points, based on an infinite set of non-local randomized 3D Haar-like features. Landmark search is then conducted omni-directionally within 3D search spaces, where voxelwise forest predictions on landmark position contribute to a common voting map which reflects the overall position estimate. Segmentation experiments were conducted on a set of 45 CT volumes of the human liver, of which 40 images were randomly chosen for training and 5 for testing. Without parameter optimization, using a simple candidate selection and a single resolution approach, excellent results were achieved, while faster convergence and better concavity segmentation were observed, altogether underlining the potential of our approach in terms of increased robustness from distinct landmark detection and from better search coverage.

  8. Pareto genealogies arising from a Poisson branching evolution model with selection.

    PubMed

    Huillet, Thierry E

    2014-02-01

    We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects (β < α). Depending on the range of α we derive the large N limit coalescents structure, leading either to a discrete-time Poisson-Dirichlet (α, -β) Ξ-coalescent (α ∈ [0, 1)), or to a family of continuous-time Beta (2 - α, α - β) Λ-coalescents (α ∈ [1, 2)), or to the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward in time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson Point Process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.

  9. Influence of stochastic geometric imperfections on the load-carrying behaviour of thin-walled structures using constrained random fields

    NASA Astrophysics Data System (ADS)

    Lauterbach, S.; Fina, M.; Wagner, W.

    2018-04-01

    Since structural engineering requires highly developed and optimized structures, the thickness dependency is one of the most controversially debated topics. This paper deals with stability analysis of lightweight thin structures combined with arbitrary geometrical imperfections. Generally known design guidelines only consider imperfections for simple shapes and loading, whereas for complex structures the lower-bound design philosophy still holds. Herein, uncertainties are considered with an empirical knockdown factor representing a lower bound of existing measurements. To fully understand and predict expected bearable loads, numerical investigations are essential, including geometrical imperfections. These are implemented into a stand-alone program code with a stochastic approach to compute random fields as geometric imperfections that are applied to nodes of the finite element mesh of selected structural examples. The stochastic approach uses the Karhunen-Loève expansion for the random field discretization. For this approach, the so-called correlation length l_c controls the random field in a powerful way. This parameter has a major influence on the buckling shape, and also on the stability load. First, the impact of the correlation length is studied for simple structures. Second, since most structures for engineering devices are more complex and combined structures, these are intensively discussed with the focus on constrained random fields for e.g. flange-web-intersections. Specific constraints for those random fields are pointed out with regard to the finite element model. Further, geometrical imperfections vanish where the structure is supported.
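
    The role of the correlation length l_c can be made concrete with a discrete Karhunen-Loève expansion of a one-dimensional Gaussian random field; the exponential covariance, grid, and parameter values below are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    # discrete Karhunen-Loeve expansion of a 1-D Gaussian random field with
    # exponential covariance; l_c is the correlation length discussed above
    n, l_c, sigma = 200, 0.2, 1.0
    x = np.linspace(0.0, 1.0, n)
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / l_c)

    lam, phi = np.linalg.eigh(C)            # eigenpairs of the covariance matrix
    order = np.argsort(lam)[::-1]
    lam, phi = np.clip(lam[order], 0, None), phi[:, order]

    m = 20                                   # truncation: keep m dominant modes
    xi = np.random.default_rng(0).normal(size=m)
    field = phi[:, :m] @ (np.sqrt(lam[:m]) * xi)  # one imperfection realization
    # a smaller l_c yields rougher realizations, changing buckling shape and load
    ```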

  10. Improving Critical Thinking Using a Web-Based Tutorial Environment.

    PubMed

    Wiesner, Stephen M; Walker, J D; Creeger, Craig R

    2017-01-01

    With a broad range of subject matter, students often struggle to recognize relationships between content in different subject areas. A scenario-based learning environment (SaBLE) has been developed to enhance clinical reasoning and critical thinking among undergraduate students in a medical laboratory science program and to help them integrate their new knowledge. SaBLE incorporates aspects of both cognitive theory and instructional design, including reduction of extraneous cognitive load, goal-based learning, feedback timing, and game theory. SaBLE is a website application that runs in most browsers and devices and is used to deliver randomly selected scenarios that challenge user thinking in almost any scenario-based instruction. User progress is recorded to allow comprehensive data analysis of changes in user performance. Participation is incentivized using a point system and digital badges or awards. SaBLE was deployed in one course, with a total exposure for the treatment group of approximately 9 weeks. When the performance of SaBLE participants was assessed, controlling for grade point average as a possible confounding variable, there was a statistically significant correlation between the number of SaBLE levels completed and performance on selected critical-thinking exam questions addressing unrelated content.

  11. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  12. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  13. Determining the Number of Clusters in a Data Set Without Graphical Interpretation

    NASA Technical Reports Server (NTRS)

    Aguirre, Nathan S.; Davies, Misty D.

    2011-01-01

    Cluster analysis is a data mining technique meant to simplify the process of classifying data points. The basic clustering process requires as input the data points and the number of clusters wanted. The clustering algorithm then picks C starting points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
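
    The procedure described above is essentially Lloyd's k-means algorithm; a minimal NumPy version follows (function and parameter names are ours).

    ```python
    import numpy as np

    def kmeans(data, c, iters=100, tol=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        # start from C randomly chosen data points
        centres = data[rng.choice(len(data), c, replace=False)]
        for _ in range(iters):
            # assign each point to the nearest centre (Euclidean distance)
            d = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # adjust each centre to the mean of its cluster's members
            new = np.array([data[labels == j].mean(axis=0) if np.any(labels == j)
                            else centres[j] for j in range(c)])
            if np.linalg.norm(new - centres) < tol:  # converged within tolerance
                return labels, new
            centres = new
        return labels, centres
    ```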

  14. Development of Curie point switching for thin film, random access, memory device

    NASA Technical Reports Server (NTRS)

    Lewicki, G. W.; Tchernev, D. I.

    1967-01-01

    Manganese bismuthide films are used in the development of a random access memory device of high packing density and nondestructive readout capability. Memory entry is by Curie point switching using a laser beam. Readout is accomplished by microoptical or micromagnetic scanning.

  15. A random wave model for the Aharonov-Bohm effect

    NASA Astrophysics Data System (ADS)

    Houston, Alexander J. H.; Gradhand, Martin; Dennis, Mark R.

    2017-05-01

    We study an ensemble of random waves subject to the Aharonov-Bohm effect. The introduction of a point with a magnetic flux of arbitrary strength into a random wave ensemble gives a family of wavefunctions whose distribution of vortices (complex zeros) is responsible for the topological phase associated with the Aharonov-Bohm effect. Analytical expressions are found for the vortex number and topological charge densities as functions of distance from the flux point. Comparison is made with the distribution of vortices in the isotropic random wave model. The results indicate that as the flux approaches half-integer values, a vortex with the same sign as the fractional part of the flux is attracted to the flux point, merging with it in the limit of half-integer flux. We construct a statistical model of the neighbourhood of the flux point to study how this vortex-flux merger occurs in more detail. Other features of the Aharonov-Bohm vortex distribution are also explored.

  16. Waveform analysis-guided treatment versus a standard shock-first protocol for the treatment of out-of-hospital cardiac arrest presenting in ventricular fibrillation: results of an international randomized, controlled trial.

    PubMed

    Freese, John P; Jorgenson, Dawn B; Liu, Ping-Yu; Innes, Jennifer; Matallana, Luis; Nammi, Krishnakant; Donohoe, Rachael T; Whitbread, Mark; Silverman, Robert A; Prezant, David J

    2013-08-27

    Ventricular fibrillation (VF) waveform properties have been shown to predict defibrillation success and outcomes among patients treated with immediate defibrillation. We postulated that a waveform analysis algorithm could be used to identify VF unlikely to respond to immediate defibrillation, allowing selective initial treatment with cardiopulmonary resuscitation in an effort to improve overall survival. In a multicenter, double-blind, randomized study, out-of-hospital cardiac arrest patients in 2 urban emergency medical services systems were treated with automated external defibrillators using either a VF waveform analysis algorithm or the standard shock-first protocol. The VF waveform analysis used a predefined threshold value below which return of spontaneous circulation (ROSC) was unlikely with immediate defibrillation, allowing selective treatment with a 2-minute interval of cardiopulmonary resuscitation before initial defibrillation. The primary end point was survival to hospital discharge. Secondary end points included ROSC, sustained ROSC, and survival to hospital admission. Of 6738 patients enrolled, 987 patients with VF of primary cardiac origin were included in the primary analysis. No immediate or long-term survival benefit was noted for either treatment algorithm (ROSC, 42.5% versus 41.2%, P=0.70; sustained ROSC, 32.4% versus 33.4%, P=0.79; survival to admission, 34.1% versus 36.4%, P=0.46; survival to hospital discharge, 15.6% versus 17.2%, P=0.55, respectively). Use of a waveform analysis algorithm to guide the initial treatment of out-of-hospital cardiac arrest patients presenting in VF did not improve overall survival compared with a standard shock-first protocol. Further study is recommended to examine the role of waveform analysis for the guided management of VF.

  17. Thermodynamical interpretation of an adaptive walk on a Mt. Fuji-type fitness landscape: Einstein relation-like formula holds in a stochastic evolution.

    PubMed

    Aita, Takuyo; Husimi, Yuzuru

    2003-11-21

    We have theoretically studied the statistical properties of adaptive walks (or hill-climbing) on a Mt. Fuji-type fitness landscape in the multi-dimensional sequence space through mathematical analysis and computer simulation. The adaptive walk is characterized by the "mutation distance" d, the step-width of the walker, and the "population size" N, the number of randomly generated d-fold point mutants to be screened. In addition to the fitness W, we introduced the following quantities analogous to thermodynamical concepts: the "free fitness" G(W) ≡ W + T·S(W), where T is the "evolutionary temperature", T ∝ √d/ln N, and S(W) is the entropy as a function of W; and the "evolutionary force" X ≡ d(G(W)/T)/dW, which is caused by the mutation and selection pressure. It is known that a single adaptive walker rapidly climbs the fitness landscape up to the stationary state where a "mutation-selection-random drift balance" is kept. In our interpretation, the walker tends to the maximal free-fitness state, driven by the evolutionary force X. Our major findings are as follows: First, near the stationary point W*, the "climbing rate" J, the expected fitness change per generation, is described by J ≈ L·X with L ≈ V/2, where V is the variance of the fitness distribution on a local landscape. This simple relationship is analogous to the well-known Einstein relation in Brownian motion. Second, the "biological information gain" ΔG/T through the adaptive walk can be described by combining the Shannon information gain ΔS and the "fitness information gain" ΔW/T.
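
    A minimal simulation of such an adaptive walk, under stated assumptions (an additive Mt. Fuji landscape defined as minus the Hamming distance to an arbitrary optimum; our sketch, not the authors' code), looks as follows:

    ```python
    # Adaptive walk sketch: at each generation, screen N random d-fold point
    # mutants of the current sequence and move to the fittest one found.
    import numpy as np

    rng = np.random.default_rng(1)
    L, ALPHABET = 50, 4                 # sequence length, alphabet size
    optimum = rng.integers(ALPHABET, size=L)
    fitness = lambda s: -np.count_nonzero(s != optimum)  # Mt. Fuji landscape

    def step(seq, d, N):
        """Screen N random d-fold mutants; return the best of them and seq."""
        best, best_f = seq, fitness(seq)
        for _ in range(N):
            mut = seq.copy()
            sites = rng.choice(L, size=d, replace=False)
            # shift each chosen site to a different letter
            mut[sites] = (mut[sites] + rng.integers(1, ALPHABET, size=d)) % ALPHABET
            f = fitness(mut)
            if f > best_f:
                best, best_f = mut, f
        return best

    seq = rng.integers(ALPHABET, size=L)
    for gen in range(60):               # the walk climbs rapidly, then stalls
        seq = step(seq, d=3, N=30)      # at the mutation-selection-drift balance
    print("final fitness:", fitness(seq))
    ```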

  18. Estimating the Cost of Preeclampsia in the Healthcare System: Cross-Sectional Study Using Data From SCOPE Study (Screening for Pregnancy End Points).

    PubMed

    Fox, Aimée; McHugh, Sheena; Browne, John; Kenny, Louise C; Fitzgerald, Anthony; Khashan, Ali S; Dempsey, Eugene; Fahy, Ciara; O'Neill, Ciaran; Kearney, Patricia M

    2017-12-01

    To estimate the cost of preeclampsia from the national health payer's perspective using secondary data from the SCOPE study (Screening for Pregnancy End Points). SCOPE is an international observational prospective study of healthy nulliparous women with singleton pregnancies. Using data from the Irish cohort recruited between November 2008 and February 2011, all women with preeclampsia and a 10% random sample of women without preeclampsia were selected. Additional health service use data were extracted from the consenting participants' medical records for maternity services which were not included in SCOPE. Unit costs were based on estimates from 3 existing Irish studies. Costs were extrapolated to a national level using a prevalence rate of 5% to 7% among nulliparous pregnancies. Within the cohort of 1774 women, 68 developed preeclampsia (3.8%) and 171 women were randomly selected as controls. Women with preeclampsia used higher levels of maternity services. The average cost of a pregnancy complicated by preeclampsia was €5243 per case compared with €2452 per case for an uncomplicated pregnancy. The national cost of preeclampsia is between €6.5 and €9.1 million per annum based on the 5% to 7% prevalence rate. Postpartum care was the largest contributor to these costs (€4.9-€6.9 million), followed by antepartum care (€0.9-€1.3 million) and peripartum care (€0.6-€0.7 million). Women with preeclampsia generate significantly higher maternity costs than women without preeclampsia. These cost estimates will allow policy-makers to efficiently allocate resources for this pregnancy-specific condition. Moreover, these estimates are useful for future research assessing the cost-effectiveness of preeclampsia screening and treatment. © 2017 American Heart Association, Inc.

  19. Resource-enhancing group intervention against depression at workplace: who benefits? A randomised controlled study with a 7-month follow-up.

    PubMed

    Ahola, Kirsi; Vuori, Jukka; Toppinen-Tanner, Salla; Mutanen, Pertti; Honkonen, Teija

    2012-12-01

    The aim of the present study was to investigate whether participation in a structured resource-enhancing group intervention at work would act as primary prevention against depression. The authors analysed whether the intervention resulted in universal, selected or indicated prevention. A total of 566 persons participated in a prospective, within-organisation, randomly assigned field experimental study, which consisted of 34 workshops in 17 organisations. The participants filled in a questionnaire, were randomly assigned to either intervention (n=296) or comparison (n=324) groups and returned another questionnaire 7 months later. The intervention, lasting four half-day sessions, was delivered by trainers from occupational health services and human resources. The aim of the structured programme was to enhance participants' career management preparedness by strengthening self-efficacy and inoculation against setbacks. The comparison group received a literature package. The authors measured depressive symptoms using the short version of the Beck Depression Inventory. A high number of depressive symptoms (over 9 points) was used as a proxy for depression. At follow-up, the odds of depression were lower in the intervention group (OR=0.40, 95% CI 0.19 to 0.85) than in the comparison group when adjusted for baseline depressive symptoms, job strain and socio-demographics. In addition, the odds of depression among those with job strain (OR=0.15, 95% CI 0.03 to 0.81) at baseline were lower after the intervention. The intervention had no statistically significant effect on those with depressive symptoms (over 4 points) at baseline. The resource-enhancing group intervention appeared to be successful as universal and selective prevention of potential depression.

  20. Health information needs of professional nurses required at the point of care.

    PubMed

    Ricks, Esmeralda; ten Ham, Wilma

    2015-06-11

    Professional nurses work in dynamic environments and need to keep up to date with information relevant to nursing practice in order to render quality patient care. Keeping up to date with current information is often challenging because of heavy workloads, diverse information needs and the accessibility of the required information at the point of care. The aim of the study was to explore and describe the information needs of professional nurses at the point of care in order to make recommendations to stakeholders for developing a mobile library accessible by smartphone when needed. The researcher utilised a quantitative, descriptive survey design to conduct this study. The target population comprised 757 professional nurses employed at a state hospital. Simple random sampling was used to select a sample of the wards, units and departments for inclusion in the study. A convenience sample of 250 participants was selected, and 250 structured self-administered questionnaires were distributed amongst the participants. Descriptive statistics were used to analyse the data. A total of 136 completed questionnaires were returned. The findings highlighted the types and accessible sources of information. Information needs of professional nurses were identified, such as extensively drug-resistant tuberculosis, multidrug-resistant tuberculosis, HIV, antiretrovirals and all chronic lifestyle diseases. This study enabled the researcher to identify the information needs of professional nurses at the point of care so as to enhance the delivery of patient care. The research results were used to develop a mobile library that can be accessed by professional nurses.

  1. Bilateral effects of 6 weeks' unilateral acupuncture and electroacupuncture on ankle dorsiflexors muscle strength: a pilot study.

    PubMed

    Zhou, Shi; Huang, Li-Ping; Liu, Jun; Yu, Jun-Hai; Tian, Qiang; Cao, Long-Jun

    2012-01-01

    To determine the effect of unilateral manual acupuncture at selected acupoints on ankle dorsiflexion strength of both limbs, and compare the effect with that of electroacupuncture at the same acupoints and sham points. Randomized controlled trial. Rehabilitation laboratory of a university. Young men (N=43) were randomly allocated into 4 groups: control; manual acupuncture and electroacupuncture on 2 acupoints (ST-36 and ST-39); and electroacupuncture on 2 nonacupoints. These points were located on the tibialis anterior muscle. The participants in the experimental groups received 15 to 30 minutes of acupuncture or electroacupuncture on the right leg in each session, 3 sessions per week for 6 weeks. The maximal strength in isometric ankle dorsiflexion of both legs was assessed before and after the experimental period. Repeated-measures analysis of variance identified significant and similar strength gains (range, 35%-64% in the right leg and 32%-49% in the left leg; P<.01) in all acupuncture groups, but not in the control group (-2% to 2%, P>.05). Unilateral manual acupuncture and electroacupuncture at the acupoints can improve muscle strength in both limbs, and electroacupuncture at the nonacupoints as used in this study can also induce similar strength gains. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Hibernacula and summer den sites of pine snakes (Pituophis melanoleucus) in the New Jersey pine barrens

    USGS Publications Warehouse

    Burger, J.; Zappalorti, R.T.; Gochfeld, M.; Boarman, W.I.; Caffrey, M.; Doig, V.; Garber, S.D.; Lauro, B.; Mikovsky, M.; Safina, C.; Saliva, Jorge

    1988-01-01

    We examined eight summer dens (used only in summer) and seven hibernacula (occupied both in winter and summer) of the snake Pituophis melanoleucus in the New Jersey Pine Barrens, comparing above ground characteristics of hibernacula and summer dens with characteristics at nearby random points. Temperatures at the soil surface and at 10 cm depth were significantly warmer, and there was less leaf cover around the random points compared to the entrances of the hibernacula and summer dens. Hibernacula had significantly more vegetation cover within 5 m, more leaf cover over the burrow entrance, and were closer to trees than were summer dens. Most hibernacula and summer dens were beside old fallen logs (73%), the entrance tunnels following decaying roots into the soil. Excavation of the hibernacula and summer dens indicated that most hibernacula appeared to be dug by the snakes and had an average of eight side chambers and 642 cm of tunnels, compared to less than one side chamber and 122 cm of tunnels for summer dens. Except for hatchlings, most snakes in hibernacula were located in individual chambers off the main tunnel; all snakes were at depths of 50-111 cm (X̄ = 79 cm). Pine snakes may select optimum hibernation sites which reduce winter mortality.

  3. A non-equilibrium neutral model for analysing cultural change.

    PubMed

    Kandler, Anne; Shennan, Stephen

    2013-08-07

    Neutral evolution is a frequently used model to analyse changes in frequencies of cultural variants over time. Variants are chosen to be copied according to their relative frequency and new variants are introduced by a process of random mutation. Here we present a non-equilibrium neutral model which accounts for temporally varying population sizes and mutation rates and makes it possible to analyse the cultural system under consideration at any point in time. This framework gives an indication whether observed changes in the frequency distributions of a set of cultural variants between two time points are consistent with the random copying hypothesis. We find that the likelihood of the existence of the observed assemblage at the end of the considered time period (expressed by the probability of the observed number of cultural variants present in the population during the whole period under neutral evolution) is a powerful indicator of departures from neutrality. Further, we study the effects of frequency-dependent selection on the evolutionary trajectories and present a case study of change in the decoration of pottery in early Neolithic Central Europe. Based on the framework developed we show that neutral evolution is not an adequate description of the observed changes in frequency. Copyright © 2013 Elsevier Ltd. All rights reserved.
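
    The random-copying null model itself is simple to simulate. The sketch below (ours, with illustrative parameters, not the authors' implementation) uses Wright-Fisher-style copying with innovation and lets the population size change over time, which is the non-equilibrium ingredient discussed above:

    ```python
    # Neutral (random-copying) model of cultural change: each generation,
    # variants are copied in proportion to frequency, and each copy mutates
    # to a brand-new variant with probability mu.
    import numpy as np

    rng = np.random.default_rng(2)

    def neutral_generation(pop, n_next, mu, next_id):
        """Resample n_next individuals by random copying, then mutate."""
        new = rng.choice(pop, size=n_next, replace=True)
        mutants = rng.random(n_next) < mu
        n_mut = int(mutants.sum())
        new[mutants] = np.arange(next_id, next_id + n_mut)  # novel variants
        return new, next_id + n_mut

    pop = np.zeros(200, dtype=int)      # everyone starts with variant 0
    next_id = 1
    for t in range(300):
        n_t = 200 + 100 * (t > 150)     # population size changes at t = 150
        pop, next_id = neutral_generation(pop, n_t, mu=0.01, next_id=next_id)
    print("variants present:", len(np.unique(pop)))
    ```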

  4. A randomized, controlled trial of oral propranolol in infantile hemangioma.

    PubMed

    Léauté-Labrèze, Christine; Hoeger, Peter; Mazereeuw-Hautier, Juliette; Guibaud, Laurent; Baselga, Eulalia; Posiunas, Gintas; Phillips, Roderic J; Caceres, Hector; Lopez Gutierrez, Juan Carlos; Ballona, Rosalia; Friedlander, Sheila Fallon; Powell, Julie; Perek, Danuta; Metz, Brandie; Barbarot, Sebastien; Maruani, Annabel; Szalai, Zsuzsanna Zsofia; Krol, Alfons; Boccara, Olivia; Foelster-Holst, Regina; Febrer Bosch, Maria Isabel; Su, John; Buckova, Hana; Torrelo, Antonio; Cambazard, Frederic; Grantzow, Rainer; Wargon, Orli; Wyrzykowski, Dariusz; Roessler, Jochen; Bernabeu-Wittel, Jose; Valencia, Adriana M; Przewratil, Przemyslaw; Glick, Sharon; Pope, Elena; Birchall, Nicholas; Benjamin, Latanya; Mancini, Anthony J; Vabres, Pierre; Souteyrand, Pierre; Frieden, Ilona J; Berul, Charles I; Mehta, Cyrus R; Prey, Sorilla; Boralevi, Franck; Morgan, Caroline C; Heritier, Stephane; Delarue, Alain; Voisard, Jean-Jacques

    2015-02-19

    Oral propranolol has been used to treat complicated infantile hemangiomas, although data from randomized, controlled trials to inform its use are limited. We performed a multicenter, randomized, double-blind, adaptive, phase 2-3 trial assessing the efficacy and safety of a pediatric-specific oral propranolol solution in infants 1 to 5 months of age with proliferating infantile hemangioma requiring systemic therapy. Infants were randomly assigned to receive placebo or one of four propranolol regimens (1 or 3 mg of propranolol base per kilogram of body weight per day for 3 or 6 months). A preplanned interim analysis was conducted to identify the regimen to study for the final efficacy analysis. The primary end point was success (complete or nearly complete resolution of the target hemangioma) or failure of trial treatment at week 24, as assessed by independent, centralized, blinded evaluations of standardized photographs. Of 460 infants who underwent randomization, 456 received treatment. On the basis of an interim analysis of the first 188 patients who completed 24 weeks of trial treatment, the regimen of 3 mg of propranolol per kilogram per day for 6 months was selected for the final efficacy analysis. The frequency of successful treatment was higher with this regimen than with placebo (60% vs. 4%, P<0.001). A total of 88% of patients who received the selected propranolol regimen showed improvement by week 5, versus 5% of patients who received placebo. A total of 10% of patients in whom treatment with propranolol was successful required systemic retreatment during follow-up. Known adverse events associated with propranolol (hypoglycemia, hypotension, bradycardia, and bronchospasm) occurred infrequently, with no significant difference in frequency between the placebo group and the groups receiving propranolol. This trial showed that propranolol was effective at a dose of 3 mg per kilogram per day for 6 months in the treatment of infantile hemangioma. (Funded by Pierre Fabre Dermatologie; ClinicalTrials.gov number, NCT01056341.).

  5. Racial-ethnic identity in mid-adolescence: content and change as predictors of academic achievement.

    PubMed

    Altschul, Inna; Oyserman, Daphna; Bybee, Deborah

    2006-01-01

    Three aspects of racial-ethnic identity (REI) were hypothesized to promote academic achievement: feeling connected to one's racial-ethnic group (Connectedness), being aware that others may not value the in-group (Awareness of Racism), and feeling that one's in-group is characterized by academic attainment (Embedded Achievement). Youth randomly selected from 3 low-income, urban schools (n=98 African American, n=41 Latino) reported on their REI 4 times over 2 school years. Hierarchical linear modeling shows a small increase in REI and the predicted REI-grades relationship. Youth high in both REI Connectedness and Embedded Achievement attained a better grade point average (GPA) at each point in time; youth high in REI Connectedness and Awareness of Racism at the beginning of 8th grade attained better GPAs through 9th grade. Effects are not moderated by race-ethnicity.

  6. Recurrence plots of discrete-time Gaussian stochastic processes

    NASA Astrophysics Data System (ADS)

    Ramdani, Sofiane; Bouchara, Frédéric; Lagarde, Julien; Lesne, Annick

    2016-09-01

    We investigate the statistical properties of recurrence plots (RPs) of data generated by discrete-time stationary Gaussian random processes. We analytically derive the theoretical values of the probabilities of occurrence of recurrence points and of consecutive recurrence points forming diagonals in the RP, with an embedding dimension equal to 1. These results allow us to obtain theoretical values of three measures: (i) the recurrence rate (REC), (ii) the percent determinism (DET), and (iii) the RP-based estimation of the ε-entropy κ(ε) in the sense of correlation entropy. We apply these results to two Gaussian processes, namely first-order autoregressive processes and fractional Gaussian noise. For these processes, we simulate a number of realizations and compare the RP-based estimations of the three selected measures to their theoretical values. These comparisons provide useful information on the quality of the estimations, such as the minimum required data length and the threshold radius used to construct the RP.
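
    For readers unfamiliar with these quantities, the sketch below (our illustration on an AR(1) example, not the paper's analytical derivation) builds an embedding-dimension-1 recurrence plot and estimates REC and DET empirically:

    ```python
    # Recurrence plot with embedding dimension 1 for an AR(1) process, plus
    # the recurrence rate REC and percent determinism DET (fraction of
    # recurrence points lying on diagonal lines of length >= 2).
    import numpy as np

    rng = np.random.default_rng(3)
    phi, n = 0.7, 500                            # AR(1): x_t = phi*x_{t-1} + noise
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()

    eps = 0.5 * x.std()                          # threshold radius
    R = np.abs(x[:, None] - x[None, :]) <= eps   # recurrence matrix
    rec = R.mean()                               # recurrence rate REC

    # Count recurrence points on diagonals of length >= 2.
    det_points = 0
    for k in range(-(n - 1), n):
        diag = np.diagonal(R, offset=k).astype(int)
        # split the diagonal into runs of consecutive equal values
        runs = np.split(diag, np.where(np.diff(diag) != 0)[0] + 1)
        det_points += sum(len(r) for r in runs if r[0] == 1 and len(r) >= 2)
    print(f"REC = {rec:.3f}, DET = {det_points / R.sum():.3f}")
    ```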

  7. NMR diffusion simulation based on conditional random walk.

    PubMed

    Gudbjartsson, H; Patz, S

    1995-01-01

    The authors introduce here a new, very fast, simulation method for free diffusion in a linear magnetic field gradient, which is an extension of the conventional Monte Carlo (MC) method or the convolution method described by Wong et al. (in 12th SMRM, New York, 1993, p.10). In earlier NMR-diffusion simulation methods, such as the finite difference method (FD), the Monte Carlo method, and the deterministic convolution method, the outcome of the calculations depends on the simulation time step. In the authors' method, however, the results are independent of the time step, although, in the convolution method the step size has to be adequate for spins to diffuse to adjacent grid points. By always selecting the largest possible time step the computation time can therefore be reduced. Finally the authors point out that in simple geometric configurations their simulation algorithm can be used to reduce computation time in the simulation of restricted diffusion.
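
    For context, the conventional Monte Carlo scheme being improved upon can be sketched as follows (our illustration with assumed parameter values, not the authors' algorithm); note that, as the abstract points out, the result of this naive scheme depends on the chosen time step:

    ```python
    # Conventional MC simulation of free diffusion in a linear gradient G:
    # spins random-walk and accumulate precession phase; the attenuation is
    # compared with the analytic result exp(-gamma^2 G^2 D t^3 / 3).
    import numpy as np

    rng = np.random.default_rng(4)
    n_spins, n_steps, dt = 20000, 200, 1e-4   # time step in s
    D, G, gamma = 2e-9, 0.05, 2.675e8         # m^2/s, T/m, rad/(s*T)

    x = np.zeros(n_spins)                     # spin positions (start at 0)
    phase = np.zeros(n_spins)
    for _ in range(n_steps):
        x += rng.standard_normal(n_spins) * np.sqrt(2 * D * dt)
        phase += gamma * G * x * dt           # precession in the local field
        # (using end-of-step positions: this O(dt) bias is exactly the
        # time-step dependence the paper's method removes)

    t = n_steps * dt
    sim = np.abs(np.exp(1j * phase).mean())
    theory = np.exp(-(gamma * G) ** 2 * D * t ** 3 / 3)
    print(f"simulated {sim:.4f} vs analytic {theory:.4f}")
    ```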

  8. "Simulated molecular evolution" or computer-generated artifacts?

    PubMed

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin, the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. For this kind of problem, with so few data points, classical statistical methods are clearly superior to the neural networks used as a "black box" by the authors, which, as structured, provide a model with an error margin as large as the numbers being computed. 7. And finally, even if someone provided us with a function that perfectly separates strings with cleavage sites from strings without them, so-called simulated molecular evolution would not be better than random selection. Since a perfect fit would only produce exact ones or zeros, starting a search in a region of space where all strings in the neighborhood get the value zero would not provide any kind of directional information for new iterations. We would just skip from one point to another in a typical random-walk manner.

  9. Changes in sleep and wake in response to different sleeping surfaces: a pilot study.

    PubMed

    McCall, W Vaughn; Boggs, Niki; Letton, Alan

    2012-03-01

    Six married couples (12 adults, mean age 34.8 years) were randomized as couples in a cross-over design to sleep on a queen-size conventional mattress for 2 weeks and a specially-designed pressure-relief mattress for 2 weeks. The pressure-relief mattress was designed to reduce the number of contact points exceeding 30 mm Hg. Actigraphic measurements of sleep and self-reports of sleep and daytime symptoms were collected at baseline for 2 weeks on each couple's home mattress and box springs at home, followed by 2 weeks of data collection on each randomized mattress for a total of 6 weeks of data collection. Pressure maps were created for each participant on each sleeping surface. There were no significant differences between the randomized sleeping surfaces for any measure of actigraphic sleep or self-reported sleep and daytime symptoms. However, poor pressure relief performance of the home mattress was associated with better actigraphic sleep on the randomized pressure-relief mattress. We conclude that while pressure-relief mattresses may not be universally preferred, baseline characteristics of the sleeper and/or their mattress may explain performance and sleeper preferences on future mattress selection. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. The kinetics of chirality assignment in catalytic single-walled carbon nanotube growth and the routes towards selective growth.

    PubMed

    Xu, Ziwei; Qiu, Lu; Ding, Feng

    2018-03-21

    Depending on its specific structure, or so-called chirality, a single-walled carbon nanotube (SWCNT) can be either a conductor or a semiconductor. This feature ensures great potential for building ∼1 nm sized electronics if chirality-selected SWCNTs could be achieved. However, due to the limited understanding of the growth mechanism of SWCNTs, reliable methods for growing chirality-selected SWCNTs are still pending. Here we present a theoretical model of the chirality assignment and control of SWCNTs during catalytic growth. This study reveals that the chirality of a SWCNT is determined by the kinetic incorporation of pentagons, especially the last (6th) one, during the nucleation stage. Our analysis showed that the chirality of a SWCNT is randomly assigned on a liquid or liquid-like catalyst surface, and two routes to synthesizing chirality-selected SWCNTs, which are verified by recent experimental achievements, are demonstrated. They are (i) using high-melting-point crystalline catalysts, such as Ta, W, Re, Os, or their alloys, and (ii) frequently changing the chirality of SWCNTs during their growth. This study paves the way for achieving chirality-selective SWCNT growth for high-performance SWCNT-based electronics.

  11. Improved image decompression for reduced transform coding artifacts

    NASA Technical Reports Server (NTRS)

    Orourke, Thomas P.; Stevenson, Robert L.

    1994-01-01

    The perceived quality of images reconstructed from low bit rate compression is severely degraded by the appearance of transform coding artifacts. This paper proposes a method for producing higher quality reconstructed images based on a stochastic model for the image data. Quantization (scalar or vector) partitions the transform coefficient space and maps all points in a partition cell to a representative reconstruction point, usually taken as the centroid of the cell. The proposed image estimation technique selects the reconstruction point within the quantization partition cell which results in a reconstructed image which best fits a non-Gaussian Markov random field (MRF) image model. This approach results in a convex constrained optimization problem which can be solved iteratively. At each iteration, the gradient projection method is used to update the estimate based on the image model. In the transform domain, the resulting coefficient reconstruction points are projected to the particular quantization partition cells defined by the compressed image. Experimental results will be shown for images compressed using scalar quantization of block DCT and using vector quantization of subband wavelet transform. The proposed image decompression provides a reconstructed image with reduced visibility of transform coding artifacts and superior perceived quality.
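
    The projection idea can be illustrated compactly. The sketch below (ours, deliberately simplified: a whole-image DCT and a quadratic smoothness cost in place of the paper's block DCT and non-Gaussian MRF prior) alternates a gradient step on the prior with projection of the transform coefficients back into their quantization cells:

    ```python
    # Constrained decompression sketch: descend a smoothness cost, then keep
    # the DCT coefficients consistent with the quantized (compressed) data.
    import numpy as np
    from scipy.fft import dctn, idctn

    rng = np.random.default_rng(5)
    img = rng.uniform(0, 1, (32, 32))
    q = 0.25                                            # quantization step
    coef_q = np.round(dctn(img, norm="ortho") / q) * q  # "compressed" centroids

    x = idctn(coef_q, norm="ortho")                     # standard reconstruction
    for _ in range(50):
        # gradient of 0.5 * sum of squared neighbor differences
        g = np.zeros_like(x)
        g[1:, :] += x[1:, :] - x[:-1, :]
        g[:-1, :] -= x[1:, :] - x[:-1, :]
        g[:, 1:] += x[:, 1:] - x[:, :-1]
        g[:, :-1] -= x[:, 1:] - x[:, :-1]
        x -= 0.1 * g                                    # gradient step
        # project coefficients back into their quantization cells
        c = dctn(x, norm="ortho")
        c = np.clip(c, coef_q - 0.499 * q, coef_q + 0.499 * q)  # avoid ties
        x = idctn(c, norm="ortho")

    print("still consistent with the compressed data:",
          np.allclose(np.round(dctn(x, norm="ortho") / q) * q, coef_q))
    ```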

  12. A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera.

    PubMed

    Ci, Wenyan; Huang, Yingping

    2016-10-17

    Visual odometry estimates the ego-motion of an agent (e.g., vehicle and robot) using image information and is a key component for autonomous vehicles and robotics. This paper proposes a robust and precise method for estimating the 6-DoF ego-motion, using a stereo rig with optical flow analysis. An objective function fitted with a set of feature points is created by establishing the mathematical relationship between optical flow, depth and camera ego-motion parameters through the camera's 3-dimensional motion and planar imaging model. Accordingly, the six motion parameters are computed by minimizing the objective function, using the iterative Levenberg-Marquard method. One of key points for visual odometry is that the feature points selected for the computation should contain inliers as much as possible. In this work, the feature points and their optical flows are initially detected by using the Kanade-Lucas-Tomasi (KLT) algorithm. A circle matching is followed to remove the outliers caused by the mismatching of the KLT algorithm. A space position constraint is imposed to filter out the moving points from the point set detected by the KLT algorithm. The Random Sample Consensus (RANSAC) algorithm is employed to further refine the feature point set, i.e., to eliminate the effects of outliers. The remaining points are tracked to estimate the ego-motion parameters in the subsequent frames. The approach presented here is tested on real traffic videos and the results prove the robustness and precision of the method.
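
    The RANSAC refinement stage can be illustrated in isolation. The following sketch (ours, reduced to a 2-D rigid transform rather than the paper's 6-DoF model) repeatedly fits a minimal sample of correspondences, keeps the largest consensus set, and re-estimates from the inliers:

    ```python
    # RANSAC over matched feature points: fit a 2-D rigid transform from a
    # minimal sample, keep the largest inlier set, then refit on all inliers.
    import numpy as np

    rng = np.random.default_rng(6)

    def rigid_from_pairs(P, Q):
        """Least-squares rotation + translation mapping P onto Q (Kabsch)."""
        cP, cQ = P.mean(0), Q.mean(0)
        U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cQ - R @ cP

    def ransac_rigid(P, Q, iters=200, tol=0.05):
        best_inl = np.zeros(len(P), dtype=bool)
        for _ in range(iters):
            idx = rng.choice(len(P), size=2, replace=False)   # minimal sample
            R, t = rigid_from_pairs(P[idx], Q[idx])
            inl = np.linalg.norm(Q - (P @ R.T + t), axis=1) < tol
            if inl.sum() > best_inl.sum():
                best_inl = inl
        return rigid_from_pairs(P[best_inl], Q[best_inl]), best_inl

    # Synthetic correspondences: 80% follow one motion, 20% are outliers.
    P = rng.uniform(-1, 1, (100, 2))
    th, t_true = 0.1, np.array([0.3, -0.2])
    Rt = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    Q = P @ Rt.T + t_true
    Q[:20] += rng.uniform(-1, 1, (20, 2))     # mismatches / moving points
    (R, t), inliers = ransac_rigid(P, Q)
    print("inliers found:", inliers.sum(), "t estimate:", np.round(t, 3))
    ```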

  14. Remote Effect of Lower Limb Acupuncture on Latent Myofascial Trigger Point of Upper Trapezius Muscle: A Pilot Study

    PubMed Central

    Chen, Kai-Hua; Hsiao, Kuang-Yu; Lin, Chu-Hsu; Chang, Wen-Ming; Hsu, Hung-Chih; Hsieh, Wei-Chi

    2013-01-01

    Objectives. To demonstrate the use of acupuncture in the lower limbs to treat myofascial pain of the upper trapezius muscles via a remote effect. Methods. Five adults with latent myofascial trigger points (MTrPs) of bilateral upper trapezius muscles received acupuncture at Weizhong (UB40) and Yanglingquan (GB34) points in the lower limbs. Modified acupuncture was applied at these points on a randomly selected ipsilateral lower limb (experimental side) versus sham needling on the contralateral lower limb (control side) in each subject. Each subject received two treatments within a one-week interval. To evaluate the remote effect of acupuncture, the range of motion (ROM) upon bending the contralateral side of the cervical spine was assessed before and after each treatment. Results. There was significant improvement in cervical ROM after the second treatment (P = 0.03) in the experimental group, and the increased ROM on the modified acupuncture side was greater compared to the sham needling side (P = 0.036). Conclusions. A remote effect of acupuncture was demonstrated in this pilot study. Using modified acupuncture needling at remote acupuncture points in the ipsilateral lower limb, our treatments released tightness due to latent MTrPs of the upper trapezius muscle. PMID:23710218

  15. Advantages and disadvantages of an objective selection process for early intervention in employees at risk for sickness absence

    PubMed Central

    Duijts, Saskia FA; Kant, IJmert; Swaen, Gerard MH

    2007-01-01

    Background It is unclear if objective selection of employees for an intervention to prevent sickness absence is more effective than subjective 'personal enlistment'. We hypothesize that objectively selected employees are 'at risk' for sickness absence and eligible to participate in the intervention program. Methods The dispatch of 8603 screening instruments formed the starting point of the objective selection process. The different stages of this process, throughout which employees either dropped out or were excluded, were described and compared with the subjective selection process. Characteristics of ineligible and ultimately selected employees for a randomized trial were described and quantified using sickness absence data. Results The overall response rate to the screening instrument was 42.0%. Response bias was found for the parameters sex and age, but not for sickness absence. Sickness absence was higher in the 'at risk' (N = 212) group (42%) compared to the 'not at risk' (N = 2503) group (25%) (OR 2.17, CI 1.63–2.89; p < 0.001). The selection process ended with the successful inclusion in the trial of 151 eligible employees, i.e. 2% of those approached. Conclusion The study shows that objective selection of employees for early intervention is effective. Despite methodological and practical problems, the selected employees are actually those at risk for sickness absence, who will probably benefit more from the intervention program than others. PMID:17474980

  16. The effect of using cow genomic information on accuracy and bias of genomic breeding values in a simulated Holstein dairy cattle population.

    PubMed

    Dehnavi, E; Mahyari, S Ansari; Schenkel, F S; Sargolzaei, M

    2018-06-01

    Using cow data in the training population is attractive as a way to mitigate bias due to highly selected training bulls and to implement genomic selection in countries with no or limited proven-bull data. However, one potential issue with cow data is bias due to preferential treatment. The objectives of this study were to (1) investigate the effect of including cow genotype and phenotype data in the training population on the accuracy and bias of genomic predictions and (2) assess the effect of preferential treatment for different proportions of elite cows. First, a 4-pathway Holstein dairy cattle population was simulated for 2 traits with low (0.05) and moderate (0.3) heritability. Then different numbers of cows (0, 2,500, 5,000, 10,000, 15,000, or 20,000) were randomly selected and added to the training group composed of different numbers of top bulls (0, 2,500, 5,000, 10,000, or 15,000). Reliability levels of de-regressed estimated breeding values for training cows and bulls were 30 and 75% for traits with low heritability, and 60 and 90% for traits with moderate heritability, respectively. Preferential treatment was simulated by introducing upward bias equal to 35% of phenotypic variance to 5, 10, and 20% of elite bull dams in each scenario. Two different validation data sets were considered: (1) all animals in the last generation of both elite and commercial tiers (n = 42,000) and (2) only animals in the last generation of the elite tier (n = 12,000). Adding cow data to the training population led to an increase in accuracy (r) and a decrease in bias of genomic predictions in all considered scenarios without preferential treatment. The gain in r was higher for the low-heritability trait (from 0.004 to 0.166 r points) than for the moderate-heritability trait (from 0.004 to 0.116 r points). The gain in accuracy in scenarios with fewer training bulls was relatively higher (from 0.093 to 0.166 r points) than with more training bulls (from 0.004 to 0.09 r points). In this study, as expected, the bull-only reference population resulted in higher accuracy than a cow-only reference population of the same size. However, a cow reference population might be an option for countries with a small-scale progeny-testing scheme, for minor breeds in large countries, and for traits measured on only a small fraction of the population. The inclusion of preferential treatment for 5 to 20% of the elite cows had an adverse effect on both the accuracy and the bias of predictions. When preferential treatment was present, random selection of cows did not reduce its effect. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  17. Prediction of soil attributes through interpolators in a deglaciated environment with complex landforms

    NASA Astrophysics Data System (ADS)

    Schünemann, Adriano Luis; Inácio Fernandes Filho, Elpídio; Rocha Francelino, Marcio; Rodrigues Santos, Gérson; Thomazini, Andre; Batista Pereira, Antônio; Gonçalves Reynaud Schaefer, Carlos Ernesto

    2017-04-01

    Values of environmental variables at non-sampled sites can be estimated from a minimum data set through interpolation techniques; Kriging and the Random Forest algorithm are examples of predictors with this aim. The objective of this work was to compare methods of spatializing soil attributes in a recently deglaciated environment with complex landforms. Prediction of the selected soil attributes (potassium, calcium and magnesium) in ice-free areas was tested using morphometric covariates, and using geostatistical models without these covariates. For this, 106 soil samples were collected at 0-10 cm depth in Keller Peninsula, King George Island, Maritime Antarctica. Soil chemical analysis was performed by the gravimetric method, determining values of potassium, calcium and magnesium for each sampled point. Digital terrain models (DTMs) were obtained using a Terrestrial Laser Scanner and were generated from a point cloud at spatial resolutions of 1, 5, 10, 20 and 30 m; from these, 40 morphometric covariates were derived. Simple Kriging was performed using R software. The same data set, coupled with the morphometric covariates, was used to predict values of the studied attributes at non-sampled sites with the Random Forest interpolator. Little difference was observed between the attribute maps generated by the Simple Kriging and Random Forest interpolators, and DTMs with better spatial resolution did not improve the quality of soil attribute prediction. The results revealed that Simple Kriging can be used as the interpolator when morphometric covariates are not available, with little impact on quality. Further work on soil chemical attribute prediction techniques is needed, especially in periglacial areas with complex landforms.
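
    As an illustration of the Random Forest side of this comparison (our sketch with synthetic stand-in data and hypothetical covariates, using scikit-learn, not the authors' R workflow), training on morphometric covariates and predicting a soil attribute at unsampled sites might look like this:

    ```python
    # Random Forest "interpolation" of a soil attribute from morphometric
    # covariates; the covariates and response here are synthetic stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n = 106                                   # sampled points, as in the study
    # stand-ins for morphometric covariates (elevation, slope, curvature, ...)
    X = rng.normal(size=(n, 5))
    y = 2.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)  # e.g. K

    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    scores = cross_val_score(rf, X, y, cv=5, scoring="r2")
    print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))

    rf.fit(X, y)                              # predict at new (unsampled) sites
    X_new = rng.normal(size=(10, 5))
    print("predicted attribute:", np.round(rf.predict(X_new), 2))
    ```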

  18. Entanglement spectrum of random-singlet quantum critical points

    NASA Astrophysics Data System (ADS)

    Fagotti, Maurizio; Calabrese, Pasquale; Moore, Joel E.

    2011-01-01

    The entanglement spectrum (i.e., the full distribution of Schmidt eigenvalues of the reduced density matrix) contains more information than the conventional entanglement entropy and has been studied recently in several many-particle systems. We compute the disorder-averaged entanglement spectrum, in the form of the disorder-averaged moments Tr ρ_A^α of the reduced density matrix ρ_A, for a contiguous block of many spins at the random-singlet quantum critical point in one dimension. The result compares well in the scaling limit with numerical studies on the random XX model and is also expected to describe the (interacting) random Heisenberg model. Our numerical studies on the XX case reveal that the dependence of the entanglement entropy and spectrum on the geometry of the Hilbert-space partition is quite different from that at conformally invariant critical points.

  19. [Spatial variation of soil properties and quality evaluation for arable Ustic Cambosols in central Henan Province].

    PubMed

    Zhang, Xue-Lei; Feng, Wan-Wan; Zhong, Guo-Min

    2011-01-01

    A GIS-based 500 m x 500 m soil sampling grid was arranged over 248 points at Wenshu Town of Yuzhou County in central Henan Province, where typical Ustic Cambosols are located. A spatial database was established from digital soil data, from which the latitude and longitude of every sampling point were produced to guide GPS navigation in the field. Soil samples (0-20 cm) were collected from 202 points; bulk density measurements were conducted on 34 randomly selected points, and the ten soil property items used as factors for soil quality assessment, including organic matter, available K, available P, pH, total N, total P, soil texture, cation exchange capacity (CEC), slowly available K, and bulk density, were analyzed for the other points. The soil property items were checked with statistical tools and then classified according to domestic and international standard criteria. Factor weights were assigned by the analytic hierarchy process (AHP) method, and the spatial variation of the ten major soil properties, as well as the soil quality classes and their areas, were mapped by Kriging interpolation. The results showed that the arable Ustic Cambosols in the study area were of good quality: over 95% ranked in the good and medium classes and less than 5% in the poor class.

  20. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. Treatment allocation is frequently characterized as random in the dental and orthodontic literature; however, the randomization procedures followed are often inappropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
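
    As an illustration of restricted randomization, the sketch below (ours, for a hypothetical two-arm trial) generates a permuted-block randomization list that keeps group sizes balanced throughout accrual:

    ```python
    # Permuted-block randomization: within every block of size 4, exactly
    # two participants receive each treatment, so arms stay balanced.
    import random

    def permuted_block_list(n_participants, block_size=4, arms=("A", "B"),
                            seed=42):
        rng = random.Random(seed)         # fixed seed for a reproducible list
        per_arm = block_size // len(arms)
        allocation = []
        while len(allocation) < n_participants:
            block = list(arms) * per_arm
            rng.shuffle(block)            # random order within the block
            allocation.extend(block)
        return allocation[:n_participants]

    print(permuted_block_list(12))        # e.g. ['B', 'A', 'A', 'B', ...]
    ```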

  1. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of the literature on optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has previously been made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density-dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this relative improvement decreases with an increasing number of sample points and input parameter dimensions. Since the computational time and effort for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is thus recommended.
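
    The two initial designs are easy to contrast directly. In the sketch below (our illustration, not the authors' code), both designs place one point per interval in each dimension; random LHS jitters the point within its cell, while midpoint LHS centers it:

    ```python
    # Random vs. midpoint Latin hypercube designs in the unit hypercube.
    import numpy as np

    rng = np.random.default_rng(8)

    def lhs(n, dim, midpoint=False):
        """n-point Latin hypercube in [0, 1]^dim."""
        # one random permutation of the n cells per dimension
        perms = np.array([rng.permutation(n) for _ in range(dim)]).T
        offset = 0.5 if midpoint else rng.uniform(size=(n, dim))
        return (perms + offset) / n

    initial_random = lhs(20, 2)               # fed to the OLHS optimizer
    initial_mid = lhs(20, 2, midpoint=True)   # the better-performing choice
    print(initial_mid[:3])
    ```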

  2. Spatial inventory integrating raster databases and point sample data. [Geographic Information System for timber inventory

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Woodcock, C. E.; Logan, T. L.

    1983-01-01

    A timber inventory of the Eldorado National Forest, located in east-central California, provides an example of the use of a Geographic Information System (GIS) to stratify large areas of land for sampling and the collection of statistical data. The raster-based GIS format of the VICAR/IBIS software system allows simple and rapid tabulation of areas, and facilitates the selection of random locations for ground sampling. Algorithms that simplify the complex spatial pattern of raster-based information, and convert raster format data to strings of coordinate vectors, provide a link to conventional vector-based geographic information systems.

  3. Neuropharmacology of Poststroke Motor and Speech Recovery.

    PubMed

    Keser, Zafer; Francisco, Gerard E

    2015-11-01

    Almost 7 million adult Americans have had a stroke. There is a growing need for more effective treatment options as add-ons to conventional therapies. This article summarizes the published literature for pharmacologic agents used for the enhancement of motor and speech recovery after stroke. Amphetamine, levodopa, selective serotonin reuptake inhibitors, and piracetam were the most commonly used drugs. Pharmacologic augmentation of stroke motor and speech recovery seems promising but systematic, adequately powered, randomized, and double-blind clinical trials are needed. At this point, the use of these pharmacologic agents is not supported by class I evidence. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Optimal two-stage dynamic treatment regimes from a classification perspective with censored survival data.

    PubMed

    Hager, Rebecca; Tsiatis, Anastasios A; Davidian, Marie

    2018-05-18

    Clinicians often make multiple treatment decisions at key points over the course of a patient's disease. A dynamic treatment regime is a sequence of decision rules, each mapping a patient's observed history to the set of available, feasible treatment options at each decision point, and thus formalizes this process. An optimal regime is one leading to the most beneficial outcome on average if used to select treatment for the patient population. We propose a method for estimation of an optimal regime involving two decision points when the outcome of interest is a censored survival time, which is based on maximizing a locally efficient, doubly robust, augmented inverse probability weighted estimator for average outcome over a class of regimes. By casting this optimization as a classification problem, we exploit well-studied classification techniques such as support vector machines to characterize the class of regimes and facilitate implementation via a backward iterative algorithm. Simulation studies of performance and application of the method to data from a sequential, multiple assignment randomized clinical trial in acute leukemia are presented. © 2018, The International Biometric Society.

  5. Power in randomized group comparisons: the value of adding a single intermediate time point to a traditional pretest-posttest design.

    PubMed

    Venter, Anre; Maxwell, Scott E; Bolig, Erika

    2002-06-01

    Adding a pretest as a covariate to a randomized posttest-only design increases statistical power, as does the addition of intermediate time points to a randomized pretest-posttest design. Although typically 5 waves of data are required in this instance to produce meaningful gains in power, a 3-wave intensive design allows the evaluation of the straight-line growth model and may reduce the effect of missing data. The authors identify the statistically most powerful method of data analysis in the 3-wave intensive design. If straight-line growth is assumed, the pretest-posttest slope must take fairly extreme values for the intermediate time point to increase power beyond the standard analysis of covariance on the posttest with the pretest as covariate, ignoring the intermediate time point.

  6. Random regression analyses using B-spline functions to model growth of Nellore cattle.

    PubMed

    Boligon, A A; Mercadante, M E Z; Lôbo, R B; Baldi, F; Albuquerque, L G

    2012-02-01

    The objective of this study was to estimate (co)variance components using random regression on B-spline functions for weight records obtained from birth to adulthood. A total of 82,064 weight records of 8,145 females, obtained from the data bank of the Nellore Breeding Program (PMGRN/Nellore Brazil), which started in 1987, were used. The models included direct additive and maternal genetic effects and animal and maternal permanent environmental effects as random. Contemporary group and dam age at calving (linear and quadratic effects) were included as fixed effects, and orthogonal Legendre polynomials of age (cubic regression) were considered as a random covariate. The random effects were modeled using B-spline functions considering linear, quadratic and cubic polynomials for each individual segment. Residual variances were grouped into five age classes. Direct additive genetic and animal permanent environmental effects were modeled using up to seven knots (six segments). A single segment with two knots at the end points of the curve was used for the estimation of maternal genetic and maternal permanent environmental effects. A total of 15 models were studied, with the number of parameters ranging from 17 to 81. The models that used B-splines were compared with multi-trait analyses of nine weight traits and with a random regression model that used orthogonal Legendre polynomials. A model fitting quadratic B-splines, with four knots or three segments for the direct additive genetic and animal permanent environmental effects and two knots for the maternal additive genetic and maternal permanent environmental effects, was the most appropriate and parsimonious model to describe the covariance structure of the data. Selection for higher weight at young ages should be performed taking into account the accompanying increase in mature cow weight. This is particularly important in most Nellore beef cattle production systems, where the cow herd is maintained on range conditions.
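
    To make the covariate construction concrete, the sketch below (ours, with illustrative knots and ages scaled to [0, 1], not the study's analysis code) evaluates the quadratic B-spline basis with four knots (three segments) that the study found most parsimonious:

    ```python
    # Quadratic B-spline basis over (standardized) age with four knots,
    # i.e. three segments: the design columns used as random covariates.
    import numpy as np
    from scipy.interpolate import BSpline

    k = 2                                         # quadratic segments
    knots = np.array([0.0, 1 / 3, 2 / 3, 1.0])    # 4 knots -> 3 segments
    t = np.r_[[knots[0]] * k, knots, [knots[-1]] * k]   # clamped knot vector
    n_basis = len(t) - k - 1                      # = 5 basis functions here

    age = np.linspace(0.0, 1.0, 7)                # standardized ages of weighings
    Z = np.column_stack([
        BSpline(t, np.eye(n_basis)[i], k)(age) for i in range(n_basis)
    ])
    print(Z.round(2))                             # each row sums to 1
    ```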

  7. Impact of communication strategies to increase knowledge, acceptability, and uptake of a new Woman's Condom in urban Lusaka, Zambia: study protocol for a randomized controlled trial.

    PubMed

    Pinchoff, Jessie; Chowdhuri, Rachna Nag; Taruberekera, Noah; Ngo, Thoai D

    2016-12-13

    Globally, 220 million women experience an unmet need for family planning. A new female condom, the Woman's Condom (WC), has been developed with an improved design. It is the first dual-protection, female-initiated contraceptive positioned as a premium product at a higher price point. However, market availability alone will not increase uptake. In February 2016 the WC will be distributed with a strong media campaign and interpersonal communication (IPC) outreach intervention. The impact of these on knowledge, acceptability, and use of the WC will be measured. A baseline survey of 2314 randomly selected 18- to 24-year-old sexually active men and women has been conducted. The WC and mass media will be introduced throughout 40 urban wards in and surrounding Lusaka, Zambia. The baseline survey will serve as a quasi-control arm to determine the impact of introducing the WC with mass media. Half of the wards will be randomly allocated to additionally receive the IPC intervention. A single-blind randomized controlled trial will determine the impact of the IPC intervention on knowledge, uptake, and use of the WC. After one year, another 2314 individuals will be randomly selected to participate in the endline survey. We hypothesize that (1) the distribution and media campaign of the WC will increase overall condom use in selected urban wards, and specifically use of the WC; and (2) the IPC intervention will significantly impact knowledge, acceptability, and use of the WC. The primary outcome measures are use of the WC, use of any condom, and willingness to use the WC. Secondary outcomes include measures of knowledge, acceptability, and choice of contraception. Odds ratios will be estimated to measure the effect of the intervention on the outcomes, with 95% confidence intervals. All analyses will be based on the intention-to-treat principle. Increasing uptake of dual prevention measures (such as the WC) may reduce the incidence of sexually transmitted infections/HIV and unplanned pregnancies. It is important to ensure young, urban adults have access to new contraceptive methods, and understanding how mass media and IPC impact contraceptive knowledge, acceptability, and use is critical to reducing unmet need. AEARCTR-0000899. Registered on 26 October 2015.

  8. Systematic review with meta-analysis: highly selective 5-HT4 agonists (prucalopride, velusetrag or naronapride) in chronic constipation.

    PubMed

    Shin, A; Camilleri, M; Kolar, G; Erwin, P; West, C P; Murad, M H

    2014-02-01

    Highly selective 5-HT4 agonists have been suggested for the treatment of chronic constipation (CC). To assess the effects of highly selective 5-HT4 agonists (prucalopride, velusetrag or naronapride) on patient-important clinical efficacy outcomes and safety in adults with CC. We searched the medical literature in January 2013 using MEDLINE/Pubmed, Embase, Cochrane Library, and Web of Science/Scopus for randomised, controlled trials of highly selective 5-HT4 agonists in adults with CC, with no minimum duration of therapy (maximum 12 weeks) or date limitations. Data were extracted from intention-to-treat analyses, pooled using a random-effects model, and reported as relative risk (RR), mean differences, or standardised mean differences with 95% confidence intervals (CI). Main outcomes included stool frequency, Patient-Assessment of Constipation Quality of Life (PAC-QOL), PAC of symptoms (PAC-SYM) and adverse events. Thirteen eligible trials were identified: 11 prucalopride, 1 velusetrag, 1 naronapride. Relative to control, treatment with highly selective 5-HT4 agonists was superior for all outcomes: mean ≥3 spontaneous complete bowel movements (SCBM)/week (RR = 1.85; 95% CI 1.23-2.79); mean ≥1 SCBM over baseline (RR = 1.57; 95% CI 1.19, 2.06); ≥1 point improvement in PAC-QOL and PAC-SYM scores. The only active comparator trial of prucalopride and PEG3350 suggested PEG3350 is more efficacious for some end points. Adverse events were more common with highly selective 5-HT4 agonists, but were generally minor; headache was the most frequent. Most trials studied prucalopride. Demonstration of efficacy on patient-important outcomes and a favourable safety profile support the continued use and development of highly selective 5-HT4 agonists in the treatment of chronic constipation. © 2013 John Wiley & Sons Ltd.
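
    The pooling step described here can be sketched numerically. The fragment below (ours, with made-up trial values, not the review's data) performs a DerSimonian-Laird random-effects meta-analysis of relative risks on the log scale:

    ```python
    # DerSimonian-Laird random-effects pooling of relative risks (RRs),
    # combined on the log scale and reported with a 95% CI.
    import numpy as np

    # log relative risk and its variance for each (hypothetical) trial
    log_rr = np.log(np.array([1.9, 1.6, 2.3, 1.4]))
    var = np.array([0.10, 0.08, 0.20, 0.15])

    w = 1 / var                                   # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w * log_rr) / w.sum()) ** 2)
    df = len(log_rr) - 1
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - df) / c)                 # between-trial variance

    w_star = 1 / (var + tau2)                     # random-effects weights
    pooled = np.sum(w_star * log_rr) / w_star.sum()
    se = np.sqrt(1 / w_star.sum())
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    print(f"RR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
    ```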

  9. Physical Activity of Nurse Clinical Practitioners and Managers.

    PubMed

    Jirathananuwat, Areeya; Pongpirul, Krit

    2017-11-01

    This study aimed (1) to compare the level of physical activity (PA) between working and nonworking hours and (2) to compare the level of PA during working hours of nurse clinical practitioners (NCPs) with that of nurse managers (NMs). This cross-sectional survey was conducted at a Thai university hospital from October 2015 to March 2016. All randomly selected participants wore an activity tracker on the hip for 5 days, except during bathing and sleeping periods, to record step counts and time points. Of 884 nurses, 289 (142 NCPs and 147 NMs) were randomly selected. The average age was 35.87 years. Participants spent 9.76 and 6.01 hours on work and nonwork activities, respectively. Steps per hour were significantly lower during work than nonwork periods (P < .001). NCPs had significantly higher overall hourly PA (P = .002). The number of steps per hour during the work period of NCPs was significantly higher than that of NMs, even after adjusting for age, work experience, and body mass index (P = .034). NCPs had higher overall PA than NMs, partly attributable to work-related PA. The PA of professionals whose actual work hours vary should be measured on an hourly basis.

  10. Induction regimens for transplant-eligible patients with newly diagnosed multiple myeloma: a network meta-analysis of randomized controlled trials

    PubMed Central

    Zeng, Zi-Hang; Chen, Jia-Feng; Li, Yi-Xuan; Zhang, Ran; Xiao, Ling-Fei; Meng, Xiang-Yu

    2017-01-01

    Objective The aim of this study was to compare the early efficacy and survivals of induction regimens for transplant-eligible patients with untreated multiple myeloma. Materials and methods A comprehensive literature search in electronic databases was conducted for relevant randomized controlled trials (RCTs). Eligible studies were selected according to the predefined selection criteria, before they were evaluated for methodological quality. Basic characteristics and data for network meta-analysis (NMA) were extracted from included trials and pooled in our meta-analysis. The end points were the overall response rate (ORR), progression-free survival (PFS), and overall survival (OS). Results A total of 14 RCTs that included 4,763 patients were analyzed. The post-induction ORR was higher with bortezomib plus thalidomide plus dexamethasone (VTD) regimens, and VTD was better than the majority of other regimens. For OS, VTD plus cyclophosphamide (VTDC) regimens showed potential superiority over other regimens, but the difference was not statistically significant. The PFS was longer with thalidomide plus doxorubicin plus dexamethasone (TAD) regimens for transplant-eligible patients with newly diagnosed multiple myeloma (NDMM). Conclusion The NMA demonstrated that the VTD, VTDC, and TAD regimens are most beneficial in terms of ORR, OS, and PFS for transplant-eligible patients with NDMM, respectively. PMID:28744159

  11. Random crystal field effects on the integer and half-integer mixed-spin system

    NASA Astrophysics Data System (ADS)

    Yigit, Ali; Albayrak, Erhan

    2018-05-01

    In this work, we have focused on the random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(Di) = p δ[Di - D(1 + α)] + (1 - p) δ[Di - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.
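
    As a concrete illustration, site crystal fields distributed according to P(Di) above can be sampled directly. The short Python sketch below is ours (function name and parameter values are illustrative, not from the paper): each site takes the value D(1 + α) with probability p and D(1 - α) otherwise.

```python
import numpy as np

def sample_crystal_fields(n_sites, D, alpha, p, rng=None):
    """Draw site crystal fields from the bimodal distribution
    P(Di) = p*delta[Di - D(1+alpha)] + (1-p)*delta[Di - D(1-alpha)]."""
    rng = rng or np.random.default_rng()
    strong = rng.random(n_sites) < p          # True with probability p
    return np.where(strong, D * (1 + alpha), D * (1 - alpha))

# Example: 10,000 sites, half at D(1+alpha) and half at D(1-alpha) on average
fields = sample_crystal_fields(10_000, D=-1.0, alpha=0.5, p=0.5)
```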

  12. The effect of acupressure on fatigue among female nurses with chronic back pain.

    PubMed

    Movahedi, Maryam; Ghafari, Somayeh; Nazari, Fateme; Valiani, Mahboubeh

    2017-08-01

    To investigate the effect of acupressure on fatigue among female nurses with chronic back pain. Chronic back pain is one of the most common problems among nurses and has numerous physical and psychological effects. One of these effects is fatigue, which impairs an individual's life. This randomized single-blind clinical trial was conducted on 50 nurses with chronic back pain working at selected hospitals in Isfahan, Iran. After convenience sampling, the subjects were randomly allocated, through lottery, to two groups: experimental (n=25) and sham (n=25). In the experimental group, acupressure techniques were performed during 9 sessions, 3 times a week, for 14 min per patient. In the sham group, points within 1 cm of the main points were only touched. Data were collected using the Fatigue Severity Scale (FSS) before, and immediately, 2 weeks, and 4 weeks after the intervention. Data analysis was performed using SPSS software. The mean score of fatigue severity before the intervention was not significantly different between the two groups (P=0.990). However, it was significantly lower in the experimental group than the sham group immediately (P<0.001), 2 weeks (P=0.005), and 1 month after the intervention (P<0.001). Acupressure on specific points of the foot and back improves back pain and thereby reduces fatigue. Therefore, acupressure can be used as a drug-free, low-cost approach without side effects to improve fatigue in nurses with chronic back pain. Copyright © 2017. Published by Elsevier Inc.

  13. The origin and functional transition of P34.

    PubMed

    Li, Q-G; Zhang, Y-M

    2013-03-01

    P34, a storage protein and major soybean allergen, has undergone a functional transition from a cysteine peptidase to a syringolide receptor. We explored the evolutionary mechanism of this functional transition. To identify homologous genes of P34, a syntenic network was constructed using syntenic relationships from the Plant Genome Duplication Database. The collected homologous genes, along with SPE31, a protein from the seeds of Pachyrhizus erosus that is highly homologous to P34, were used to construct a phylogenetic tree. The results show that multiple gene duplications, exon shuffling followed by granulin domain loss, and some critical point mutations are associated with the functional transition. Although some tests suggested the existence of positive selection, the possibility that random fixation under relaxation of purifying selection resulted in the functional transition is also supported. In addition, the genes Glyma08g12340 and Medtr8g086470 may belong to a new group within the papain family.

  14. The origin and functional transition of P34

    PubMed Central

    Li, Q-G; Zhang, Y-M

    2013-01-01

    P34, a storage protein and major soybean allergen, has undergone a functional transition from a cysteine peptidase to a syringolide receptor. We explored the evolutionary mechanism of this functional transition. To identify homologous genes of P34, a syntenic network was constructed using syntenic relationships from the Plant Genome Duplication Database. The collected homologous genes, along with SPE31, a protein from the seeds of Pachyrhizus erosus that is highly homologous to P34, were used to construct a phylogenetic tree. The results show that multiple gene duplications, exon shuffling followed by granulin domain loss, and some critical point mutations are associated with the functional transition. Although some tests suggested the existence of positive selection, the possibility that random fixation under relaxation of purifying selection resulted in the functional transition is also supported. In addition, the genes Glyma08g12340 and Medtr8g086470 may belong to a new group within the papain family. PMID:23211789

  15. Active Learning with Irrelevant Examples

    NASA Technical Reports Server (NTRS)

    Mazzoni, Dominic; Wagstaff, Kiri L.; Burl, Michael

    2006-01-01

    Active learning algorithms attempt to accelerate the learning process by requesting labels for the most informative items first. In real-world problems, however, there may exist unlabeled items that are irrelevant to the user's classification goals. Queries about these points slow down learning because they provide no information about the problem of interest. We have observed that when irrelevant items are present, active learning can perform worse than random selection, requiring more time (queries) to achieve the same level of accuracy. Therefore, we propose a novel approach, Relevance Bias, in which the active learner combines its default selection heuristic with the output of a simultaneously trained relevance classifier to favor items that are likely to be both informative and relevant. In our experiments on a real-world problem and two benchmark datasets, the Relevance Bias approach significantly improved the learning rate of three different active learning approaches.
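
    A minimal sketch of the idea (the paper's exact combination rule may differ): score every unlabeled item by the product of the base learner's informativeness score and the relevance classifier's estimated probability of relevance, then query the maximizer.

```python
import numpy as np

def relevance_biased_query(informativeness, p_relevant):
    """Relevance-Bias sketch: favor items that are likely to be both
    informative and relevant by maximizing the product of the two scores."""
    scores = np.asarray(informativeness) * np.asarray(p_relevant)
    return int(np.argmax(scores))

# Item 2 is the most informative but probably irrelevant, so item 1 is queried
idx = relevance_biased_query([0.4, 0.7, 0.9], [0.9, 0.8, 0.1])  # -> 1
```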

  16. Oligovalent Fab display on M13 phage improved by directed evolution.

    PubMed

    Huovinen, Tuomas; Sanmark, Hanna; Ylä-Pelto, Jani; Vehniäinen, Markus; Lamminmäki, Urpo

    2010-03-01

    Efficient display of antibodies on the filamentous phage M13 coat is crucial for successful biopanning selections. We applied a directed evolution strategy to improve the oligovalent display of a poorly behaving Fab fragment fused to the phage gene-3 minor coat protein (g3p). Fab-displaying clones were enriched from a randomly mutated Fab gene library with polyclonal anti-mouse IgG antibodies. The contribution of each mutation to the improved phenotype of one selected mutant was studied. It was found that two point mutations contributed significantly to the display efficiency of Fab clones superinfected with hyperphage. The most dramatic effect was connected to a start codon mutation, from AUG to GUG, of the PelB signal sequence preceding the heavy chain. The clone carrying this mutation, FabM(GUG), displayed Fab 19-fold better and yielded twofold higher phage titers than the original Fab.

  17. Visual evoked potentials and selective attention to points in space

    NASA Technical Reports Server (NTRS)

    Van Voorhis, S.; Hillyard, S. A.

    1977-01-01

    Visual evoked potentials (VEPs) were recorded to sequences of flashes delivered to the right and left visual fields while subjects responded promptly to designated stimuli in one field at a time (focused attention), in both fields at once (divided attention), or to neither field (passive). Three stimulus schedules were used: the first was a replication of a previous study (Eason, Harter, and White, 1969) where left- and right-field flashes were delivered quasi-independently, while in the other two the flashes were delivered to the two fields in random order (Bernoulli sequence). VEPs to attended-field stimuli were enhanced at both occipital (O2) and central (Cz) recording sites under all stimulus sequences, but different components were affected at the two scalp sites. It was suggested that the VEP at O2 may reflect modality-specific processing events, while the response at Cz, like its auditory homologue, may index more general aspects of selective attention.

  18. Post-harvest Salmonella spp. prevalence in turkey carcasses in processing plant in the northeast part of Poland.

    PubMed

    Zdrodowska, B; Liedtke, K; Radkowski, M

    2014-01-01

    Turkey carcasses were sampled at a selected point after slaughter on the poultry dressing line and analyzed for Salmonella. The slaughtered turkeys came from the northeast part of Poland. The examinations were carried out in each month of 2009. Three hundred turkeys were selected at random from a commercial slaughter line, immediately after completing the cooling process. The percentage of these 300 turkeys from which Salmonella spp. were isolated was relatively high (8.3%; Salmonella-positive results were observed in 25 cases). The lowest Salmonella spp. rate (1.3%) for slaughter birds was found in the fourth quarter, and the highest contamination rate (18.6%) was found in the third quarter. The serological types of Salmonella spp. isolated from the whole turkey carcasses were S. Saintpaul, S. Senftenberg, S. Anatum, S. Heidelberg, S. Hadar, S. Typhimurium and S. Infantis.

  19. Monte Carlo based toy model for fission process

    NASA Astrophysics Data System (ADS)

    Kurniadi, R.; Waris, A.; Viridi, S.

    2014-09-01

    There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely the macroscopic approach and the microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model; hence, the fission process does not completely represent the real fission process in nature. The toy model is formed by a Gaussian distribution of random numbers that randomizes distances, such as the distance between a particle and a central point. The scission process is started by smashing the compound nucleus central point into two parts, the left and right central points. These three points have different Gaussian distribution parameters, such as mean (μCN, μL, μR) and standard deviation (σCN, σL, σR). By overlaying the three distributions, the number of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant numbers. The smashing process is repeated by randomly changing σL and σR.
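
    The abstract does not spell out the trapping rule, so the sketch below is one plausible one-dimensional reading (our assumption, not the authors' code): particles drawn around the compound-nucleus centre are assigned to the nearer of two fragment centres, and the centres and counts are updated until (NL, NR) stop changing.

```python
import numpy as np

def toy_fission(n_particles=10_000, sigma_cn=1.0, rng=None):
    """One 'smash' of a toy fission model: split a Gaussian particle
    cloud between left and right fragment centres, iterating until the
    fragment populations (N_L, N_R) become constant."""
    rng = rng or np.random.default_rng()
    x = rng.normal(0.0, sigma_cn, n_particles)        # compound-nucleus cloud
    mu_l, mu_r = -0.5 * sigma_cn, 0.5 * sigma_cn      # initial fragment centres
    n_l_prev = -1
    while True:
        left = np.abs(x - mu_l) < np.abs(x - mu_r)    # nearest-centre trapping
        n_l = int(left.sum())
        if n_l == n_l_prev:                           # (N_L, N_R) constant
            break
        n_l_prev = n_l
        mu_l, mu_r = x[left].mean(), x[~left].mean()  # recentre the fragments
    return n_l, n_particles - n_l

n_left, n_right = toy_fission()
```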

  20. A GRASS GIS Semi-Stochastic Model for Evaluating the Probability of Landslides Impacting Road Networks in Collazzone, Central Italy

    NASA Astrophysics Data System (ADS)

    Taylor, Faith E.; Santangelo, Michele; Marchesini, Ivan; Malamud, Bruce D.

    2013-04-01

    During a landslide triggering event, the tens to thousands of landslides resulting from the trigger (e.g., earthquake, heavy rainfall) may block a number of sections of the road network, posing a risk to rescue efforts, logistics and accessibility to a region. Here, we present initial results from a semi-stochastic model we are developing to evaluate the probability of landslides intersecting a road network and the network-accessibility implications of this across a region. This was performed in the open source GRASS GIS software, where we took 'model' landslides and dropped them on a 79 km2 test region in Collazzone, Umbria, Central Italy, with a given road network (major and minor roads, 404 km in length) and already determined landslide susceptibilities. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. The number of landslide areas selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. 79 landslide areas chosen randomly for each iteration. Landslides were then 'dropped' over the region semi-stochastically: (i) random points were generated across the study region; (ii) based on the landslide susceptibility map, points were accepted/rejected based on the probability of a landslide occurring at that location. After a point was accepted, it was assigned a landslide area (AL) and length-to-width ratio. Landslide intersections with roads were then assessed, and indices such as the location, number and size of road blockages were recorded. The GRASS GIS model was run 1000 times in a Monte Carlo-type simulation. Initial results show that for a landslide triggering event of 1 landslide km-2 over a 79 km2 region with 404 km of road, the number of road blockages ranges from 6 to 17, resulting in one road blockage every 24-67 km of roads. The average length of road blocked was 33 m. As we progress with model development and more sophisticated network analysis, we believe this semi-stochastic modelling approach will help civil protection agencies get a rough idea of the probability of potential road-network damage (roadblock number and extent) resulting from landslide triggering event scenarios of different magnitudes.
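
    A minimal sketch of the two sampling steps, with placeholder parameters (the inverse-gamma values, grid, and target count below are illustrative, not the calibrated Collazzone inputs):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder susceptibility map: probability of a landslide per grid cell
susceptibility = rng.random((100, 100)) * 0.8

def drop_landslides(n_target, susceptibility, rng):
    """Semi-stochastic placement: draw random cells, accept each with
    probability equal to the local landslide susceptibility."""
    rows, cols = susceptibility.shape
    accepted = []
    while len(accepted) < n_target:
        r, c = rng.integers(rows), rng.integers(cols)
        if rng.random() < susceptibility[r, c]:      # rejection step
            accepted.append((r, c))
    return accepted

# Landslide areas from a three-parameter inverse-gamma distribution
# (shape/loc/scale here are placeholders, not the calibrated values)
areas_km2 = stats.invgamma.rvs(a=1.4, loc=0.0, scale=1.28e-3,
                               size=79, random_state=0)

points = drop_landslides(79, susceptibility, rng)
```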

  1. Spatial association of marine dockage with land-borne infestations of invasive termites (Isoptera: Rhinotermitidae: Coptotermes) in urban south Florida.

    PubMed

    Hochmair, Hartwig H; Scheffrahn, Rudolf H

    2010-08-01

    Marine vessels have been implicated in the anthropogenic dispersal of invasive termites for the past 500 yr. It has long been suspected that two invasive termites, the Formosan subterranean termite, Coptotermes formosanus Shiraki, and Coptotermes gestroi (Wasmann) (Isoptera: Rhinotermitidae), were introduced to and dispersed throughout South Florida by sailboats and yachts. We compared the distances between 190 terrestrial point records for Formosan subterranean termite, 177 records for C. gestroi, and random locations with the nearest marine dockage by using spatial analysis. Results show that the median distance to nearest docks associated with C. gestroi is significantly smaller than for the random points. Results also reveal that the median distance to nearest docks associated with Formosan subterranean termite is significantly smaller than for the random points. These results support the hypothesis that C. gestroi and Formosan subterranean termite are significantly closer to potential infested boat locations, i.e., marine docks, than random points in these urban areas. The results of our study suggest yet another source of aggregation in the context of exotic species, namely, hubs for pleasure boating.
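
    The distance comparison at the core of this analysis is easy to reproduce. The sketch below uses hypothetical planar coordinates (not the study data) and a k-d tree to compare median nearest-dock distances for termite records against matched random points.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)

# Hypothetical coordinates in km -- illustrative only, not the study data
docks = rng.random((500, 2)) * 50
termite_records = rng.random((190, 2)) * 50
random_points = rng.random((190, 2)) * 50

tree = cKDTree(docks)
d_termite, _ = tree.query(termite_records)   # distance to nearest dock
d_random, _ = tree.query(random_points)

# Compare medians; a rank-based test suits the skewed distance distributions
stat, p = mannwhitneyu(d_termite, d_random)
print(np.median(d_termite), np.median(d_random), p)
```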

  2. Efficient terrestrial laser scan segmentation exploiting data structure

    NASA Astrophysics Data System (ADS)

    Mahmoudabadi, Hamid; Olsen, Michael J.; Todorovic, Sinisa

    2016-09-01

    New technologies such as lidar enable the rapid collection of massive datasets to model a 3D scene as a point cloud. However, while hardware technology continues to advance, processing 3D point clouds into informative models remains complex and time consuming. A common approach to increase processing efficiency is to segment the point cloud into smaller sections. This paper proposes a novel approach for point cloud segmentation using computer vision algorithms to analyze panoramic representations of individual laser scans. These panoramas can be quickly created using an inherent neighborhood structure that is established during the scanning process, which scans at fixed angular increments in a cylindrical or spherical coordinate system. In the proposed approach, a selected image segmentation algorithm is applied on several input layers exploiting this angular structure, including laser intensity, range, normal vectors, and color information. These segments are then mapped back to the 3D point cloud so that modeling can be completed more efficiently. This approach does not depend on pre-defined mathematical models and consequently does not require setting parameters for them. Unlike common geometrical point cloud segmentation methods, the proposed method employs the colorimetric and intensity data as another source of information. The proposed algorithm is demonstrated on several datasets encompassing a variety of scenes and objects. Results show a very high perceptual (visual) level of segmentation and thereby the feasibility of the proposed algorithm. The proposed method is also more efficient compared to Random Sample Consensus (RANSAC), which is a common approach for point cloud segmentation.
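
    A sketch of the panorama construction step, assuming an unorganised (x, y, z) scan and equal-angle binning (the grid size and the choice of a range layer are our assumptions; intensity, colour, and normal layers would be built the same way):

```python
import numpy as np

def scan_to_panorama(points, h=512, w=1024):
    """Project an (n, 3) scan into a spherical range panorama using the
    scanner's fixed angular increments (equal-angle binning)."""
    x, y, z = points.T
    r = np.sqrt(x**2 + y**2 + z**2)
    az = np.arctan2(y, x)                                    # azimuth, [-pi, pi]
    el = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1, 1))  # elevation
    col = ((az + np.pi) / (2 * np.pi) * (w - 1)).astype(int)
    row = ((np.pi / 2 - el) / np.pi * (h - 1)).astype(int)
    pano = np.zeros((h, w))
    pano[row, col] = r           # range layer; last point wins on collisions
    return pano

pano = scan_to_panorama(np.random.default_rng(0).normal(size=(1000, 3)))
```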

  3. CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.

    PubMed

    Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola

    2011-03-14

    Quantitative structure property relationship (QSPR) studies of melting point (MP) and boiling point (BP) for per- and polyfluorinated chemicals (PFCs) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: (a) random selection on response value, and (b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners using 0D-2D Dragon descriptors, E-state descriptors and fragment-based descriptors, as well as a consensus model and their predictions, are presented. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, with 15 MP and 25 BP data points, respectively. This database contains only long-chain perfluoro-alkylated chemicals, particularly monitored by regulatory agencies such as US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, together with a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unavailable, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Elastic-wave-mode separation in TTI media with inverse-distance weighted interpolation involving position shading

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Meng, Xiaohong; Zheng, Wanqiu

    2017-10-01

    The elastic-wave reverse-time migration of inhomogeneous anisotropic media is becoming a research hotspot. To ensure the accuracy of the migration, it is necessary to separate the wavefield into P- and S-waves before migration. For inhomogeneous media, the Kelvin-Christoffel equation can be solved in the wave-number domain using the anisotropic parameters of the mesh nodes, and the polarization vectors of the P- and S-waves at each node can be calculated and transformed into the space domain to obtain quasi-differential operators. However, this method is computationally expensive, especially the computation of the quasi-differential operators. To reduce the computational complexity, wave-mode separation in the mixed domain can be realized on the basis of reference models in the wave-number domain. But conventional interpolation methods and reference-model selection methods reduce the separation accuracy. To further improve the separation, this paper introduces an inverse-distance interpolation method involving position shading and uses a reference-model selection method based on a random-points scheme. This method adds to the conventional IDW algorithm a spatial weight coefficient K that reflects the orientation of the reference points, and the interpolation process takes into account the combined effects of the distance and azimuth of the reference points. Numerical simulation shows that the proposed method can separate the wave modes more accurately using fewer reference models and has better practical value.

  5. [ZHENG's gold hook fishing acupuncture for lumbar disc herniation: a clinical observation].

    PubMed

    Zhu, Bowen; Zhang, Xinghua; Sun, Runjie; Qin, Xiaoguang

    2016-04-01

    To compare the clinical efficacy of ZHENG's gold hook fishing acupuncture and electroacupuncture (EA) for lumbar disc herniation (LDH). Sixty patients with LDH were randomly allocated to a gold hook fishing acupuncture group and an EA group, 30 cases in each. Lumbar Jiaji (EX-B 2), Yaoyangguan (GV 3), Shenshu (BL 23), Dachangshu (BL 25), Guanyuanshu (BL 26) and ashi points were selected in the gold hook fishing acupuncture group; after the needles were inserted, the manipulation of gold hook fishing acupuncture was applied at tendon junction points and ashi points. The identical acupoints were selected in the EA group, and patients were treated with EA. Both treatments were given once a day; ten days of treatment were taken as one session, and three sessions in total were given. The clinical effective rate, visual analogue scale (VAS), low back pain score and Oswestry disability index (ODI) were used for efficacy evaluation. The effective rate was 93.3% (28/30) in the gold hook fishing acupuncture group, which was superior to 86.7% (26/30) in the EA group (P < 0.05). The VAS, low back pain score and ODI were all significantly improved after treatment (all P < 0.05), and the improvements were more significant in the gold hook fishing acupuncture group (all P < 0.05). ZHENG's gold hook fishing acupuncture can effectively improve the symptoms and signs of LDH, reduce the disability index and improve quality of life, and is superior to EA.

  6. Canine retraction and anchorage loss: self-ligating versus conventional brackets in a randomized split-mouth study.

    PubMed

    da Costa Monini, André; Júnior, Luiz Gonzaga Gandini; Martins, Renato Parsekian; Vianna, Alexandre Protásio

    2014-09-01

    To evaluate the velocity of canine retraction, anchorage loss, and changes in canine and first molar inclinations using self-ligating and conventional brackets. Twenty-five adults with Class I malocclusion and a treatment plan involving extraction of four first premolars were selected for this randomized split-mouth controlled trial. Conventional and self-ligating brackets were randomly assigned to the maxillary canines of each patient. Retraction was accomplished using 100-g nickel-titanium closed-coil springs, which were reactivated every 4 weeks. Oblique radiographs were taken before and after canine retraction was completed, and the cephalograms were superimposed on stable structures of the maxilla. Cephalometric points were digitized twice by a blinded operator for error control, and the following landmarks were collected: canine cusp and apex horizontal changes, molar cusp and apex horizontal changes, and angulation changes of canines and molars. The blinded data, which were normally distributed, were analyzed with paired t-tests for group differences. No differences were found between the two groups for any of the variables tested. Both brackets showed the same velocity of canine retraction and the same loss of anteroposterior anchorage of the molars. No differences were found between brackets regarding the inclination of canines and first molars.

  7. Energy parasites trigger oncogene mutation.

    PubMed

    Pokorný, Jiří; Pokorný, Jan; Jandová, Anna; Kobilková, Jitka; Vrba, Jan; Vrba, Jan

    2016-10-01

    Cancer initiation can be explained as a result of parasitic virus energy consumption leading to randomized genome chemical bonding. Analysis of experimental data on cell-mediated immunity (CMI), comprising about 12,000 cases of healthy humans, cancer patients and patients with precancerous cervical lesions, disclosed that the specific cancer antigen and the non-specific lactate dehydrogenase-elevating (LDH) virus antigen elicit similar responses. The specific antigen is effective only in the cancer type of its origin, but the non-specific antigen is effective in all examined cancers. CMI results of patients with cervical intraepithelial neoplasia (CIN) display both the healthy and the cancer state. The ribonucleic acid (RNA) of the LDH virus, parasitizing on energy, reduces the ratio of coherent to random oscillations. The decreased effect of the coherent cellular electromagnetic field on bonding electrons in biological macromolecules elevates the probability of random genome reactions. The overlapping of wave functions in biological macromolecules depends on the energy of the cellular electromagnetic field, which supplies energy to bonding electrons for selective chemical bonds. CMI responses to cancer and LDH virus antigens in all examined healthy, precancerous and cancer cases point to an energy mechanism in cancer initiation. The dependence of the rate of biochemical reactions on the biological electromagnetic field explains a yet unknown mechanism of genome mutation.

  8. Resolving Transition Metal Chemical Space: Feature Selection for Machine Learning and Structure-Property Relationships.

    PubMed

    Janet, Jon Paul; Kulik, Heather J

    2017-11-22

    Machine learning (ML) of quantum mechanical properties shows promise for accelerating chemical discovery. For transition metal chemistry where accurate calculations are computationally costly and available training data sets are small, the molecular representation becomes a critical ingredient in ML model predictive accuracy. We introduce a series of revised autocorrelation functions (RACs) that encode relationships of the heuristic atomic properties (e.g., size, connectivity, and electronegativity) on a molecular graph. We alter the starting point, scope, and nature of the quantities evaluated in standard ACs to make these RACs amenable to inorganic chemistry. On an organic molecule set, we first demonstrate superior standard AC performance to other presently available topological descriptors for ML model training, with mean unsigned errors (MUEs) for atomization energies on set-aside test molecules as low as 6 kcal/mol. For inorganic chemistry, our RACs yield 1 kcal/mol ML MUEs on set-aside test molecules in spin-state splitting in comparison to 15-20× higher errors for feature sets that encode whole-molecule structural information. Systematic feature selection methods including univariate filtering, recursive feature elimination, and direct optimization (e.g., random forest and LASSO) are compared. Random-forest- or LASSO-selected subsets 4-5× smaller than the full RAC set produce sub- to 1 kcal/mol spin-splitting MUEs, with good transferability to metal-ligand bond length prediction (0.004-5 Å MUE) and redox potential on a smaller data set (0.2-0.3 eV MUE). Evaluation of feature selection results across property sets reveals the relative importance of local, electronic descriptors (e.g., electronegativity, atomic number) in spin-splitting and distal, steric effects in redox potential and bond lengths.

  9. Radiation-related quality of life parameters after targeted intraoperative radiotherapy versus whole breast radiotherapy in patients with breast cancer: results from the randomized phase III trial TARGIT-A.

    PubMed

    Welzel, Grit; Boch, Angela; Sperk, Elena; Hofmann, Frank; Kraus-Tiefenbacher, Uta; Gerhardt, Axel; Suetterlin, Marc; Wenz, Frederik

    2013-01-07

    Intraoperative radiotherapy (IORT) is a new treatment approach for early stage breast cancer. This study reports on the effects of IORT on radiation-related quality of life (QoL) parameters. Two hundred and thirty women with stage I-III breast cancer (age, 31 to 84 years) were entered into the study. A single-center subgroup of 87 women from the two arms of the randomized phase III trial TARGIT-A (TARGeted Intra-operative radioTherapy versus whole breast radiotherapy for breast cancer) was analyzed. Furthermore, results were compared to non-randomized control groups: n = 90 receiving IORT as a tumor bed boost followed by external beam whole breast radiotherapy (EBRT) outside of TARGIT-A (IORT-boost), and n = 53 treated with EBRT followed by an external-beam boost (EBRT-boost). QoL was collected using the European Organization for Research and Treatment of Cancer Quality of Life Questionnaires C30 (QLQ-C30) and BR23 (QLQ-BR23). The mean follow-up period in the TARGIT-A groups was 32 versus 39 months in the non-randomized control groups. Patients receiving IORT alone reported less general pain (21.3 points), breast (7.0 points) and arm (15.1 points) symptoms, and better role functioning (78.7 points) than patients receiving EBRT (40.9; 19.0; 32.8; and 60.5 points, respectively, P < 0.01). Patients receiving IORT alone also had fewer breast symptoms than TARGIT-A patients receiving IORT followed by EBRT for high risk features on final pathology (IORT-EBRT; 7.0 versus 29.7 points, P < 0.01). There were no significant differences between TARGIT-A patients receiving IORT-EBRT and non-randomized IORT-boost or EBRT-boost patients, or patients receiving EBRT without a boost. In the randomized setting, important radiation-related QoL parameters after IORT were superior to EBRT. Non-randomized comparisons showed equivalent parameters in the IORT-EBRT group and the control groups.

  10. An Overview of Randomization and Minimization Programs for Randomized Clinical Trials

    PubMed Central

    Saghaei, Mahmoud

    2011-01-01

    Randomization is an essential component of sound clinical trials, which prevents selection biases and helps in blinding the allocations. Randomization is a process by which subsequent subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A serious consequence of randomization is severe imbalance among the treatment groups with respect to some prognostic factors, which can invalidate the trial results or necessitate complex and usually unreliable secondary analyses to eradicate the source of the imbalances. Minimization, on the other hand, tends to allocate in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic, that is, one can predict the allocation of the next subject by knowing the factor levels of previously enrolled subjects and the properties of the next subject. To eliminate the predictability of minimization, it is necessary to include some element of randomness in the minimization algorithms. In this article, brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs. PMID:22606659
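
    To make the minimization idea concrete, here is a biased-coin sketch for two arms. The marginal-count scoring and the follow probability p_follow are generic textbook choices, not those of any specific program reviewed in the article.

```python
import random

def minimize_allocation(new_subject, allocations, factors, p_follow=0.8):
    """Biased-coin minimization sketch: count enrolled subjects per arm
    that match the new subject on each prognostic factor, then follow
    the less-loaded arm with probability p_follow (the random element
    that defeats predictability)."""
    arms = ("A", "B")
    scores = {}
    for arm in arms:
        scores[arm] = sum(1 for subj, a in allocations for f in factors
                          if a == arm and subj[f] == new_subject[f])
    preferred = min(arms, key=scores.get)
    if scores["A"] == scores["B"] or random.random() > p_follow:
        return random.choice(arms)   # ties and the biased coin stay random
    return preferred

enrolled = [({"sex": "F", "age": "old"}, "A"),
            ({"sex": "M", "age": "young"}, "B")]
arm = minimize_allocation({"sex": "F", "age": "young"}, enrolled, ["sex", "age"])
```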

  11. Paretian Poisson Processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin 'Paretian Poisson processes'. This class is elemental in statistical physics, connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; and resilience to random perturbations.
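
    A sketch of how one realisation of such a process might be simulated, assuming a Paretian intensity λ(x) = cαx^(-α-1) truncated at x_min (this parameterisation is ours, chosen for the sketch): the total count is Poisson and, given the count, positions are i.i.d. Pareto.

```python
import numpy as np

def paretian_poisson(c, alpha, x_min, rng):
    """Simulate a Poisson process on (x_min, inf) with Paretian intensity
    lambda(x) = c * alpha * x**(-alpha - 1): the number of points is
    Poisson with mean c * x_min**(-alpha); positions follow a Pareto law."""
    n = rng.poisson(c * x_min ** (-alpha))           # integral of the intensity
    return x_min * (1.0 - rng.random(n)) ** (-1.0 / alpha)  # inverse CDF

points = paretian_poisson(c=10.0, alpha=1.5, x_min=1.0,
                          rng=np.random.default_rng())
```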

  12. The estimation of branching curves in the presence of subject-specific random effects.

    PubMed

    Elmi, Angelo; Ratcliffe, Sarah J; Guo, Wensheng

    2014-12-20

    Branching curves are a technique for modeling curves that change trajectory at a change (branching) point. Currently, the estimation framework is limited to independent data, and smoothing splines are used for estimation. This article aims to extend the branching curve framework to the longitudinal data setting where the branching point varies by subject. If the branching point is modeled as a random effect, then the longitudinal branching curve framework is a semiparametric nonlinear mixed effects model. Given existing issues with using random effects within a smoothing spline, we express the model as a B-spline based semiparametric nonlinear mixed effects model. Simple, clever smoothness constraints are enforced on the B-splines at the change point. The method is applied to Women's Health data where we model the shape of the labor curve (cervical dilation measured longitudinally) before and after treatment with oxytocin (a labor stimulant). Copyright © 2014 John Wiley & Sons, Ltd.

  13. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models are comparable.
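
    Of the rational division methods compared, Kennard-Stone is the simplest to sketch: seed the training set with the two most distant samples, then greedily add the candidate whose distance to the nearest already-selected sample is largest. A minimal version (not the authors' implementation) follows.

```python
import numpy as np

def kennard_stone(X, n_train):
    """Kennard-Stone rational split: returns indices for the training set;
    the remaining indices form the test set."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(dist), dist.shape)  # most distant pair
    selected = [i, j]
    remaining = set(range(len(X))) - set(selected)
    while len(selected) < n_train:
        # each candidate's distance to its nearest selected sample
        min_d = {k: dist[k, selected].min() for k in remaining}
        nxt = max(min_d, key=min_d.get)       # farthest-from-set candidate
        selected.append(nxt)
        remaining.remove(nxt)
    return selected

X = np.random.default_rng(0).normal(size=(50, 5))   # placeholder descriptors
train_idx = kennard_stone(X, n_train=40)
```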

  14. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 x 2 complex non-Hermitian random matrices.
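
    The point-process side of this correspondence is easy to verify numerically. The sketch below generates a homogeneous 2D Poisson process and its nearest-neighbour spacings, whose density 2πρr exp(-πρr²) coincides, after rescaling to unit mean spacing, with the Wigner surmise (π/2)s exp(-πs²/4).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Homogeneous 2D Poisson process: Poisson count, uniform positions
rho, L = 1.0, 100.0
n = rng.poisson(rho * L * L)
pts = rng.random((n, 2)) * L

# Nearest-neighbour spacings, normalised to unit mean
d, _ = cKDTree(pts).query(pts, k=2)   # k=2: the first hit is the point itself
s = d[:, 1] / d[:, 1].mean()

# A histogram of s can now be compared with (pi/2) * s * exp(-pi * s**2 / 4)
```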

  15. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
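
    For reference, an LCG takes only a few lines in any language. The Python sketch below uses the widely quoted Numerical Recipes constants, which are not necessarily the parameters selected by the RANDOM program.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m,
    yielding uniform deviates in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=12345)
sample = [next(gen) for _ in range(5)]   # five pseudo-random deviates
```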

  16. Human wound photogrammetry with low-cost hardware based on automatic calibration of geometry and color

    NASA Astrophysics Data System (ADS)

    Jose, Abin; Haak, Daniel; Jonas, Stephan; Brandenburg, Vincent; Deserno, Thomas M.

    2015-03-01

    Photographic documentation and image-based wound assessment are frequently performed in medical diagnostics, patient care, and clinical research. To support quantitative assessment, photographic imaging is based on expensive, high-quality hardware and still needs appropriate registration and calibration. Using inexpensive consumer hardware such as smartphone-integrated cameras, calibration of geometry, color, and contrast is challenging. Some methods involve color calibration using a reference pattern, such as a standard color card, which is located manually in the photographs. In this paper, we adapt the lattice detection algorithm of Park et al. from real-world scenes to medicine. First, the algorithm extracts and clusters feature points according to their local intensity patterns. Groups of similar points are fed into a selection process, which tests them for suitability as a lattice grid. The group that describes the largest probability of the meshes of a lattice is selected, and from it a template for an initial lattice cell is extracted. Then, a Markov random field is modeled. Using mean-shift belief propagation, the detection of the 2D lattice is solved iteratively as a spatial tracking problem. Least-squares geometric calibration of projective distortions and non-linear color calibration in RGB space are supported by 35 corner points and 24 color patches, respectively. The method is tested on 37 photographs taken from the German Calciphylaxis registry, where non-standardized photographic documentation is collected nationwide from all contributing trial sites. In all images, the reference card location is correctly identified. At least 28 out of 35 lattice points were detected, outperforming the SIFT-based approach previously applied. Based on these coordinates, robust geometry and color registration is performed, making the photographs comparable for quantitative analysis.
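
    The least-squares geometric calibration step can be sketched as a direct linear transform (DLT) homography fit from detected card corners to their reference positions. The corner values below are hypothetical; the paper's pipeline detects them via the lattice tracker.

```python
import numpy as np

def fit_homography(src, dst):
    """Least-squares projective fit (DLT): solve A h = 0 for the 3x3
    homography mapping src points to dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)   # right singular vector of the smallest SV

# Hypothetical detected corners (pixels) -> ideal card coordinates
H = fit_homography([(10, 12), (200, 15), (205, 150), (8, 148)],
                   [(0, 0), (1, 0), (1, 1), (0, 1)])
```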

  17. Effects of prey abundance, distribution, visual contrast and morphology on selection by a pelagic piscivore

    USGS Publications Warehouse

    Hansen, Adam G.; Beauchamp, David A.

    2014-01-01

    Most predators eat only a subset of possible prey. However, studies evaluating diet selection rarely measure prey availability in a manner that accounts for temporal–spatial overlap with predators, the sensory mechanisms employed to detect prey, and constraints on prey capture. We evaluated the diet selection of cutthroat trout (Oncorhynchus clarkii) feeding on a diverse planktivore assemblage in Lake Washington to test the hypothesis that the diet selection of piscivores would reflect random (opportunistic) as opposed to non-random (targeted) feeding, after accounting for predator–prey overlap, visual detection and capture constraints. Diets of cutthroat trout were sampled in autumn 2005, when the abundance of transparent, age-0 longfin smelt (Spirinchus thaleichthys) was low, and 2006, when the abundance of smelt was nearly seven times higher. Diet selection was evaluated separately using depth-integrated and depth-specific (accounted for predator–prey overlap) prey abundance. The abundance of different prey was then adjusted for differences in detectability and vulnerability to predation to see whether these factors could explain diet selection. In 2005, cutthroat trout fed non-randomly by selecting against the smaller, transparent age-0 longfin smelt, but for the larger age-1 longfin smelt. After adjusting prey abundance for visual detection and capture, cutthroat trout fed randomly. In 2006, depth-integrated and depth-specific abundance explained the diets of cutthroat trout well, indicating random feeding. Feeding became non-random after adjusting for visual detection and capture. Cutthroat trout selected strongly for age-0 longfin smelt, but against similar sized threespine stickleback (Gasterosteus aculeatus) and larger age-1 longfin smelt in 2006. Overlap with juvenile sockeye salmon (O. nerka) was minimal in both years, and sockeye salmon were rare in the diets of cutthroat trout. The direction of the shift between random and non-random selection depended on the presence of a weak versus a strong year class of age-0 longfin smelt. These fish were easy to catch, but hard to see. When their density was low, poor detection could explain their rarity in the diet. When their density was high, poor detection was compensated by higher encounter rates with cutthroat trout, sufficient to elicit a targeted feeding response. The nature of the feeding selectivity of a predator can be highly dependent on fluctuations in the abundance and suitability of key prey.

  18. Randomized Phase II, Double-Blind, Placebo-Controlled Study of Exemestane With or Without Entinostat in Postmenopausal Women With Locally Recurrent or Metastatic Estrogen Receptor-Positive Breast Cancer Progressing on Treatment With a Nonsteroidal Aromatase Inhibitor

    PubMed Central

    Yardley, Denise A.; Ismail-Khan, Roohi R.; Melichar, Bohuslav; Lichinitser, Mikhail; Munster, Pamela N.; Klein, Pamela M.; Cruickshank, Scott; Miller, Kathy D.; Lee, Min J.; Trepel, Jane B

    2013-01-01

    Purpose Entinostat is an oral isoform selective histone deacetylase inhibitor that targets resistance to hormonal therapies in estrogen receptor–positive (ER+) breast cancer. This randomized, placebo-controlled, phase II study evaluated entinostat combined with the aromatase inhibitor exemestane versus exemestane alone. Patients and Methods Postmenopausal women with ER+ advanced breast cancer progressing on a nonsteroidal aromatase inhibitor were randomly assigned to exemestane 25 mg daily plus entinostat 5 mg once per week (EE) or exemestane plus placebo (EP). The primary end point was progression-free survival (PFS). Blood was collected in a subset of patients for evaluation of protein lysine acetylation as a biomarker of entinostat activity. Results One hundred thirty patients were randomly assigned (EE group, n = 64; EP group, n = 66). Based on intent-to-treat analysis, treatment with EE improved median PFS to 4.3 months versus 2.3 months with EP (hazard ratio [HR], 0.73; 95% CI, 0.50 to 1.07; one-sided P = .055; two-sided P = .11 [predefined significance level of .10, one-sided]). Median overall survival was an exploratory end point and improved to 28.1 months with EE versus 19.8 months with EP (HR, 0.59; 95% CI, 0.36 to 0.97; P = .036). Fatigue and neutropenia were the most frequent grade 3/4 toxicities. Treatment discontinuation because of adverse events was higher in the EE group versus the EP group (11% v 2%). Protein lysine hyperacetylation in the EE biomarker subset was associated with prolonged PFS. Conclusion Entinostat added to exemestane is generally well tolerated and demonstrated activity in patients with ER+ advanced breast cancer in this signal-finding phase II study. Acetylation changes may provide an opportunity to maximize clinical benefit with entinostat. Plans for a confirmatory study are underway. PMID:23650416

  19. Design of state-feedback controllers including sensitivity reduction, with applications to precision pointing

    NASA Technical Reports Server (NTRS)

    Hadass, Z.

    1974-01-01

    The design procedure for feedback controllers is described, and the considerations for the selection of the design parameters are given. The frequency-domain properties of single-input single-output systems using state-feedback controllers are analyzed, and desirable phase- and gain-margin properties are demonstrated. Special consideration is given to the design of controllers for tracking systems, especially those designed to track polynomial commands. As an example, a controller was designed for a tracking telescope with a polynomial tracking requirement and some special features such as actuator saturation and multiple measurements, one of which is sampled. The resulting system has tracking performance that compares favorably with a much more complicated digitally aided tracker. Parameter sensitivity reduction was treated by considering the variable parameters as random variables. A performance index is defined as a weighted sum of the state and control covariances that result from both the random system disturbances and the parameter uncertainties, and is minimized numerically by adjusting a set of free parameters.

  20. Group Counseling With Emotionally Disturbed School Children in Taiwan.

    ERIC Educational Resources Information Center

    Chiu, Peter

    The application of group counseling to emotionally disturbed school children in Chinese culture was examined. Two junior high schools located in Tao-Yuan Province were randomly selected with two eighth-grade classes randomly selected from each school. Ten emotionally disturbed students were chosen from each class and randomly assigned to two…

  1. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  2. On Measuring and Reducing Selection Bias with a Quasi-Doubly Randomized Preference Trial

    ERIC Educational Resources Information Center

    Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean

    2017-01-01

    Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…

  3. The impact of roads on the timber rattlesnake (Crotalus horridus) in eastern Texas

    Treesearch

    D. Craig Rudolph; Shirley J. Burgdorf; Richard N. Conner; James G. Dickson

    1998-01-01

    Roads and associated vehicular traffic have the potential to significantly impact vertebrate populations. In eastern Texas we compared the densities of paved and unpaved roads within 2 and 4 km radii of timber rattlesnake (Crotalus horridus) locations and of random points. Road networks were significantly more dense at random points than at snake...

  4. Item generation and design testing of a questionnaire to assess degenerative joint disease-associated pain in cats.

    PubMed

    Zamprogno, Helia; Hansen, Bernie D; Bondell, Howard D; Sumrell, Andrea Thomson; Simpson, Wendy; Robertson, Ian D; Brown, James; Pease, Anthony P; Roe, Simon C; Hardie, Elizabeth M; Wheeler, Simon J; Lascelles, B Duncan X

    2010-12-01

    To determine the items (question topics) for a subjective instrument to assess degenerative joint disease (DJD)-associated chronic pain in cats and determine the instrument design most appropriate for use by cat owners. 100 randomly selected client-owned cats from 6 months to 20 years old. Cats were evaluated to determine degree of radiographic DJD and signs of pain throughout the skeletal system. Two groups were identified: high DJD pain and low DJD pain. Owner-answered questions about activity and signs of pain were compared between the 2 groups to define items relating to chronic DJD pain. Interviews with 45 cat owners were performed to generate items. Fifty-three cat owners who had not been involved in any other part of the study, 19 veterinarians, and 2 statisticians assessed 6 preliminary instrument designs. 22 cats were selected for each group; 19 important items were identified, resulting in 12 potential items for the instrument; and 3 additional items were identified from owner interviews. Owners and veterinarians selected a 5-point descriptive instrument design over 11-point or visual analogue scale formats. Behaviors relating to activity were substantially different between healthy cats and cats with signs of DJD-associated pain. Fifteen items were identified as being potentially useful, and the preferred instrument design was identified. This information could be used to construct an owner-based questionnaire to assess feline DJD-associated pain. Once validated, such a questionnaire would assist in evaluating potential analgesic treatments for these patients.

  5. Detection of artificially ripened mango using spectrometric analysis

    NASA Astrophysics Data System (ADS)

    Mithun, B. S.; Mondal, Milton; Vishwakarma, Harsh; Shinde, Sujit; Kimbahune, Sanjay

    2017-05-01

    Hyperspectral sensing has proven useful for determining food quality in general. It has also been used to distinguish naturally and artificially ripened mangoes by analyzing their spectral signatures. However, the focus has been on improving classification accuracy after performing dimensionality reduction and optimal feature selection and applying a suitable learning algorithm on the complete visible and NIR spectrum, namely 350 nm to 1050 nm. In this paper we focus on (i) the use of a low-wavelength-resolution, low-cost multispectral sensor to reliably identify artificially ripened mangoes by using the spectral information selectively, so that classification accuracy is not hampered by the low-resolution spectral data, and (ii) the use of visible-spectrum data (390 nm to 700 nm) to accurately discriminate artificially ripened mangoes. Our results show that on low-resolution spectral data, logistic regression produces an accuracy of 98.83% and significantly outperforms other methods such as classification trees and random forests. This is achieved by analyzing only 36 spectral reflectance data points instead of the complete 216 data points available across the visible and NIR range. Another interesting experimental observation is that we are able to achieve more than 98% classification accuracy by selecting only 15 irradiance values in the visible spectrum. The number of data points that need to be collected by a hyperspectral or multispectral sensor can even be reduced by a factor of 24 while classifying with a high degree of confidence.
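
    A sketch of the classification setup on a reduced feature set, using scikit-learn with placeholder data (the shapes mimic the 15-value case; the values and labels are random, so the printed accuracy is meaningless here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Placeholder data: rows are fruit samples, columns are reflectance values
# at a handful of selected visible-band wavelengths (not the study's data)
X = rng.normal(size=(200, 15))
y = rng.integers(0, 2, 200)        # 0 = naturally, 1 = artificially ripened

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```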

  6. Three-dimensional Simulations of Pure Deflagration Models for Thermonuclear Supernovae

    NASA Astrophysics Data System (ADS)

    Long, Min; Jordan, George C., IV; van Rossum, Daniel R.; Diemer, Benedikt; Graziani, Carlo; Kessler, Richard; Meyer, Bradley; Rich, Paul; Lamb, Don Q.

    2014-07-01

    We present a systematic study of the pure deflagration model of Type Ia supernovae (SNe Ia) using three-dimensional, high-resolution, full-star hydrodynamical simulations, nucleosynthetic yields calculated using Lagrangian tracer particles, and light curves calculated using radiation transport. We evaluate the simulations by comparing their predicted light curves with many observed SNe Ia using the SALT2 data-driven model and find that the simulations may correspond to under-luminous SNe Iax. We explore the effects of the initial conditions on our results by varying the number of randomly selected ignition points from 63 to 3500, and the radius of the centered sphere they are confined in from 128 to 384 km. We find that the rate of nuclear burning depends on the number of ignition points at early times, the density of ignition points at intermediate times, and the radius of the confining sphere at late times. The results depend primarily on the number of ignition points, but we do not expect this to be the case in general. The simulations with few ignition points release more nuclear energy Enuc, have larger kinetic energies EK, and produce more 56Ni than those with many ignition points, and differ in the distribution of 56Ni, Si, and C/O in the ejecta. For these reasons, the simulations with few ignition points exhibit higher peak B-band absolute magnitudes MB and light curves that rise and decline more quickly; their MB and light curves resemble those of under-luminous SNe Iax, while those of simulations with many ignition points do not.
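
    The initial conditions described above amount to drawing ignition points uniformly inside a centred sphere; a minimal rejection-sampling sketch (the function name and defaults are ours):

```python
import numpy as np

def ignition_points(n, radius_km, rng=None):
    """Draw n points uniformly inside a sphere of the given radius,
    by rejection sampling from the bounding cube."""
    rng = rng or np.random.default_rng()
    pts = []
    while len(pts) < n:
        p = rng.uniform(-radius_km, radius_km, 3)
        if p @ p <= radius_km ** 2:        # keep points inside the sphere
            pts.append(p)
    return np.array(pts)

seeds = ignition_points(63, radius_km=128.0)   # e.g. the 63-point, 128 km case
```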

  7. Augmenting Microarray Data with Literature-Based Knowledge to Enhance Gene Regulatory Network Inference

    PubMed Central

    Kilicoglu, Halil; Shin, Dongwook; Rindflesch, Thomas C.

    2014-01-01

    Gene regulatory networks are a crucial aspect of systems biology in describing molecular mechanisms of the cell. Various computational models rely on random gene selection to infer such networks from microarray data. While incorporation of prior knowledge into data analysis has been deemed important, in practice, it has generally been limited to referencing genes in probe sets and using curated knowledge bases. We investigate the impact of augmenting microarray data with semantic relations automatically extracted from the literature, with the view that relations encoding gene/protein interactions eliminate the need for random selection of components in non-exhaustive approaches, producing a more accurate model of cellular behavior. A genetic algorithm is then used to optimize the strength of interactions using microarray data and an artificial neural network fitness function. The result is a directed and weighted network providing the individual contribution of each gene to its target. For testing, we used invasive ductal carcinoma of the breast to query the literature and a microarray set containing gene expression changes in these cells over several time points. Our model demonstrates significantly better fitness than the state-of-the-art model, which relies on an initial random selection of genes. Comparison to the component pathways of the KEGG Pathways in Cancer map reveals that the resulting networks contain both known and novel relationships. The p53 pathway results were manually validated in the literature. 60% of non-KEGG relationships were supported (74% for highly weighted interactions). The method was then applied to yeast data and our model again outperformed the comparison model. Our results demonstrate the advantage of combining gene interactions extracted from the literature in the form of semantic relations with microarray analysis in generating contribution-weighted gene regulatory networks. This methodology can make a significant contribution to understanding the complex interactions involved in cellular behavior and molecular physiology. PMID:24921649
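
    A compact sketch of the optimization loop described above, with a stand-in "ANN" fitness (tanh of a weighted sum) and generic GA operators (truncation selection plus Gaussian mutation). The paper's actual network, operators, and fitness function differ in detail; this only illustrates the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(W, expr_t, expr_t1):
    """Stand-in fitness: how well tanh(expr_t @ W) predicts expression
    at the next time point (negative mean squared error)."""
    return -np.mean((np.tanh(expr_t @ W) - expr_t1) ** 2)

def genetic_algorithm(expr_t, expr_t1, pop=40, gens=100, sigma=0.1):
    """Evolve interaction-weight matrices: keep the fitter half of the
    population and refill it with Gaussian-mutated copies."""
    n_genes = expr_t.shape[1]
    population = [rng.normal(size=(n_genes, n_genes)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda W: fitness(W, expr_t, expr_t1), reverse=True)
        parents = population[: pop // 2]
        children = [W + rng.normal(0, sigma, W.shape) for W in parents]
        population = parents + children
    return max(population, key=lambda W: fitness(W, expr_t, expr_t1))

# Toy usage: 20 samples of 5 genes at consecutive time points
expr_t = rng.normal(size=(20, 5))
expr_t1 = rng.normal(size=(20, 5))
W_best = genetic_algorithm(expr_t, expr_t1)  # weighted, directed interactions
```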

  8. Augmenting microarray data with literature-based knowledge to enhance gene regulatory network inference.

    PubMed

    Chen, Guocai; Cairelli, Michael J; Kilicoglu, Halil; Shin, Dongwook; Rindflesch, Thomas C

    2014-06-01

    Gene regulatory networks are a crucial aspect of systems biology in describing molecular mechanisms of the cell. Various computational models rely on random gene selection to infer such networks from microarray data. While incorporation of prior knowledge into data analysis has been deemed important, in practice, it has generally been limited to referencing genes in probe sets and using curated knowledge bases. We investigate the impact of augmenting microarray data with semantic relations automatically extracted from the literature, with the view that relations encoding gene/protein interactions eliminate the need for random selection of components in non-exhaustive approaches, producing a more accurate model of cellular behavior. A genetic algorithm is then used to optimize the strength of interactions using microarray data and an artificial neural network fitness function. The result is a directed and weighted network providing the individual contribution of each gene to its target. For testing, we used invasive ductal carcinoma of the breast to query the literature and a microarray set containing gene expression changes in these cells over several time points. Our model demonstrates significantly better fitness than the state-of-the-art model, which relies on an initial random selection of genes. Comparison to the component pathways of the KEGG Pathways in Cancer map reveals that the resulting networks contain both known and novel relationships. The p53 pathway results were manually validated in the literature. Sixty percent of non-KEGG relationships were supported (74% for highly weighted interactions). The method was then applied to yeast data and our model again outperformed the comparison model. Our results demonstrate the advantage of combining gene interactions extracted from the literature in the form of semantic relations with microarray analysis in generating contribution-weighted gene regulatory networks. This methodology can make a significant contribution to understanding the complex interactions involved in cellular behavior and molecular physiology.
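
    Both records describe the same optimization step: a genetic algorithm tunes the literature-derived interaction strengths, scored by how well an artificial neural network driven by those weights reproduces the microarray time course. A toy version of that loop, with the fitness left as a black-box callable (all names and operators here are illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def evolve_weights(fitness, n_edges, pop_size=50, generations=200,
                       mut_sigma=0.1, seed=None):
        """Toy genetic algorithm over a vector of interaction strengths.

        `fitness(w)` would score w by how well an ANN driven by these edge
        weights reproduces the observed expression time course; here it is
        simply any callable returning a float.
        """
        rng = np.random.default_rng(seed)
        pop = rng.uniform(-1.0, 1.0, size=(pop_size, n_edges))
        for _ in range(generations):
            scores = np.array([fitness(w) for w in pop])
            # truncation selection: keep the better half as parents
            parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]
            # uniform crossover between randomly paired parents
            idx = rng.integers(0, len(parents), size=(pop_size, 2))
            mask = rng.random((pop_size, n_edges)) < 0.5
            pop = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
            # Gaussian mutation of every offspring
            pop = pop + rng.normal(0.0, mut_sigma, size=pop.shape)
        scores = np.array([fitness(w) for w in pop])
        return pop[np.argmax(scores)]
    ```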

  9. Comparing etoricoxib and celecoxib for preemptive analgesia for acute postoperative pain in patients undergoing arthroscopic anterior cruciate ligament reconstruction: a randomized controlled trial

    PubMed Central

    2010-01-01

    Background The efficacy of selective COX-2 inhibitors in postoperative pain reduction has usually been compared with conventional non-selective NSAIDs or other types of medicine. Previous studies also used selective COX-2 inhibitors as a single postoperative dose, in continued mode, or in combination with other modalities. The purpose of this study was to compare the analgesic efficacy of a single preoperative dose of etoricoxib versus celecoxib for postoperative pain relief after arthroscopic anterior cruciate ligament reconstruction. Methods One hundred and two patients diagnosed with anterior cruciate ligament injury were randomized into 3 groups using opaque envelopes. Both patients and surgeon were blinded to the allocation. All of the patients were operated on by one orthopaedic surgeon under regional anesthesia. Each group was given etoricoxib 120 mg, celecoxib 400 mg, or placebo 1 hour prior to the operative incision. Postoperative pain intensity, time to first dose of analgesic requirement, number of analgesics used for pain control, and adverse events were recorded periodically up to 48 hours after surgery. We analyzed the data according to the intention-to-treat principle. Results Among the 102 patients, 35 were in the etoricoxib group, 35 in the celecoxib group, and 32 in the placebo group. The mean age of the patients was 30 years and most injuries were sports-related. There were no significant differences in demographic characteristics among the groups. The etoricoxib group had significantly less pain intensity than the other two groups in the recovery room and up to 8 hours after surgery, but no significant difference at any other evaluation point, while celecoxib showed no significant difference from placebo at any time point. The time to first dose of analgesic medication, amount of analgesic used, patient satisfaction with pain control, and incidence of adverse events also did not differ significantly among the three groups. Conclusions Etoricoxib is more effective than celecoxib and placebo as preemptive analgesia for acute postoperative pain control in patients undergoing arthroscopic anterior cruciate ligament reconstruction. Trial registration number NCT01017380 PMID:20973952

  10. Availability of arsenic in human milk in women and its correlation with arsenic in urine of breastfed children living in arsenic contaminated areas in Bangladesh.

    PubMed

    Islam, Md Rafiqul; Attia, John; Alauddin, Mohammad; McEvoy, Mark; McElduff, Patrick; Slater, Christine; Islam, Md Monirul; Akhter, Ayesha; d'Este, Catherine; Peel, Roseanne; Akter, Shahnaz; Smith, Wayne; Begg, Stephen; Milton, Abul Hasnat

    2014-12-04

    Early life exposure to inorganic arsenic may be related to adverse health effects in later life. However, there are few data on postnatal arsenic exposure via human milk. In this study, we aimed to determine arsenic levels in human milk and the correlation between arsenic in human milk and arsenic in mothers' and infants' urine. Between March 2011 and March 2012, this prospective study identified a total of 120 new mother-baby pairs from Kashiani (subdistrict), Bangladesh. Of these, 30 mothers were randomly selected for human milk samples at 1, 6 and 9 months postnatally; the same mother-baby pairs were selected for urine sampling at 1 and 6 months. Twelve urine samples from these 30 mother-baby pairs were randomly selected for arsenic speciation. Arsenic concentration in human milk was low and non-normally distributed. The median arsenic concentration in human milk at all three time points remained at 0.5 μg/L. In the mixed model estimates, arsenic concentration in human milk changed non-significantly by -0.035 μg/L (95% CI: -0.09 to 0.02) between 1 and 6 months and between 6 and 9 months. With the progression of time, arsenic concentration in infants' urine increased non-significantly by 0.13 μg/L (95% CI: -1.27 to 1.53). Arsenic in human milk at 1 and 6 months was not correlated with arsenic in the infants' urine at the same time points (r = -0.13 at 1 month and r = -0.09 at 6 months). Arsenite (AsIII), arsenate (AsV), monomethylarsonic acid (MMA), dimethylarsinic acid (DMA) and arsenobetaine (AsB) were the constituents of total urinary arsenic; DMA was the predominant arsenic metabolite in infant urine. We observed a low arsenic concentration in human milk, below the World Health Organization's maximum permissible limit of 15 μg/kg body weight/week. Our findings support the safety of breastfeeding even in arsenic contaminated areas.

  11. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    For high-dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large proportion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose feature subspaces, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required to include useful and relevant SNPs and discard the vast number of non-informative SNPs, but it is too time-consuming for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree, as sketched below. This stratified sampling procedure ensures that each subspace contains enough useful SNPs while avoiding the very high computational cost of an exhaustive search for an optimal mtry, and it maintains the randomness of the random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it generates better random forests, with higher accuracy and a lower error bound, than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and merit further biological investigation.
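
    A minimal sketch of the stratified subspace step just described: bin SNPs into equal-width groups by an informativeness score, then draw the same number from each group. The scoring function and names are assumptions for illustration, not the paper's code.

    ```python
    import numpy as np

    def stratified_subspace(informativeness, n_groups, per_group, seed=None):
        """Pick a feature subspace by stratified sampling over SNP informativeness.

        SNPs are binned into equal-width groups by an informativeness score
        (e.g. a chi-square statistic); the subspace takes the same number of
        SNPs from every group, so each tree sees some informative SNPs.
        """
        rng = np.random.default_rng(seed)
        scores = np.asarray(informativeness, dtype=float)
        edges = np.linspace(scores.min(), scores.max(), n_groups + 1)
        groups = np.clip(np.digitize(scores, edges[1:-1]), 0, n_groups - 1)
        subspace = []
        for g in range(n_groups):
            members = np.flatnonzero(groups == g)
            take = min(per_group, len(members))
            subspace.extend(rng.choice(members, size=take, replace=False))
        return np.array(subspace)   # SNP indices used to grow one tree
    ```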

  12. Neuropsychological and psychiatric changes after deep brain stimulation for Parkinson's disease: a randomised, multicentre study.

    PubMed

    Witt, Karsten; Daniels, Christine; Reiff, Julia; Krack, Paul; Volkmann, Jens; Pinsker, Markus O; Krause, Martin; Tronnier, Volker; Kloss, Manja; Schnitzler, Alfons; Wojtecki, Lars; Bötzel, Kai; Danek, Adrian; Hilker, Rüdiger; Sturm, Volker; Kupsch, Andreas; Karner, Elfriede; Deuschl, Günther

    2008-07-01

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) reduces motor symptoms in patients with Parkinson's disease (PD) and improves their quality of life; however, the effect of DBS on cognitive functions and its psychiatric side-effects are still controversial. To assess the neuropsychiatric consequences of DBS in patients with PD we did an ancillary protocol as part of a randomised study that compared DBS with the best medical treatment. 156 patients with advanced Parkinson's disease and motor fluctuations were randomly assigned to have DBS of the STN or the best medical treatment for PD according to the German Society of Neurology guidelines. 123 patients had neuropsychological and psychiatric examinations to assess the changes between baseline and after 6 months. The primary outcome was the comparison of the effect of DBS with the best medical treatment on overall cognitive functioning (Mattis dementia rating scale). Secondary outcomes were the effects on executive function, depression, anxiety, psychiatric status, manic symptoms, and quality of life. Analysis was per protocol. The study is registered at ClinicalTrials.gov, number NCT00196911. 60 patients were randomly assigned to receive STN-DBS and 63 patients to have best medical treatment. After 6 months, impairments were seen in executive function (difference of changes [DBS-best medical treatment] in verbal fluency [semantic] -4.50 points, 95% CI -8.07 to -0.93, Cohen's d=-0.4; verbal fluency [phonemic] -3.06 points, -5.50 to -0.62, -0.5; Stroop 2 naming colour error rate -0.37 points, -0.73 to 0.00, -0.4; Stroop 3 word reading time -5.17 s, -8.82 to -1.52, -0.5; Stroop 4 colour naming time -13.00 s, -25.12 to -0.89, -0.4), irrespective of the improvement in quality of life (difference of changes in PDQ-39 10.16 points, 5.45 to 14.87, 0.6; SF-36 physical 16.55 points, 10.89 to 22.21, 0.9; SF-36 psychological 9.74 points, 2.18 to 17.29, 0.5). Anxiety was reduced in the DBS group compared with the medication group (difference of changes in Beck anxiety inventory 10.43 points, 6.08 to 14.78, 0.8). Ten patients in the DBS group and eight patients in the best medical treatment group had severe psychiatric adverse events. DBS of the STN does not reduce overall cognition or affectivity, although there is a selective decrease in frontal cognitive functions and an improvement in anxiety in patients after the treatment. These changes do not affect improvements in quality of life. DBS of the STN is safe with respect to neuropsychological and psychiatric effects in carefully selected patients during a 6-month follow-up period. German Federal Ministry of Education and Research (01GI0201).

  13. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    PubMed

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters that enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components, the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
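
    For orientation, the identity behind this decomposition can be written schematically as below; the three-term split is our shorthand for the components the abstract names, not the authors' exact notation.

    ```latex
    % Robertson–Price: the selection differential on a character z is the
    % covariance between individual relative fitness w and the character,
    S = \operatorname{cov}(w, z)
    % which the paper decomposes into a deterministic mean term plus
    % stochastic demographic and environmental terms,
    S = S_{\mathrm{det}} + S_{\mathrm{dem}} + S_{\mathrm{env}}
    % with the demographic term scaling inversely with population size
    % (random genetic drift) and the environmental term producing
    % fluctuating selection.
    ```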

  14. Hierarchical Kohonen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods, such as the nearest neighbor approach, K-means clustering, or probabilistic analysis, that employ distance computation in the feature space to identify outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of the data set to detect anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks, in terms of detection rate as well as false positive rate.
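
    Each level of the hierarchy is a plain winner-take-all Kohonen layer. Below is a minimal sketch of one such level (the training loop, names and parameters are our illustration; the paper's hierarchical stacking, feature-subset selection and confidence labelling are omitted):

    ```python
    import numpy as np

    def train_kmap(data, n_units, epochs=10, lr=0.2, seed=None):
        """Winner-take-all Kohonen layer: only the best-matching unit learns."""
        rng = np.random.default_rng(seed)
        data = np.asarray(data, dtype=float)
        # initialize units from randomly chosen training records
        units = data[rng.choice(len(data), size=n_units, replace=False)].copy()
        for _ in range(epochs):
            for x in data[rng.permutation(len(data))]:
                winner = np.argmin(((units - x) ** 2).sum(axis=1))
                units[winner] += lr * (x - units[winner])  # move winner toward x
        return units

    def assign_clusters(data, units):
        """Map each record to its nearest unit; the clusters would then be
        labelled (attack vs. normal) via a confidence measure, as in the paper."""
        d = ((np.asarray(data, dtype=float)[:, None, :] - units[None]) ** 2).sum(axis=2)
        return np.argmin(d, axis=1)
    ```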

  15. [Navigation in implantology: Accuracy assessment regarding the literature].

    PubMed

    Barrak, Ibrahim Ádám; Varga, Endre; Piffko, József

    2016-06-01

    Our objective was to assess the literature regarding the accuracy of the different static guided systems. After an electronic literature search we found 661 articles. After reviewing 139 articles, the authors chose 52 articles for full-text evaluation; 24 studies involved accuracy measurements. Fourteen of our selected references were clinical and ten of them were in vitro (model or cadaver). Analysis of variance (Tukey's post-hoc test; p < 0.05) was conducted to summarize the selected publications. Across 2819 results the average mean error at the entry point was 0.98 mm. At the level of the apex the average deviation was 1.29 mm, while the mean angular deviation was 3.96 degrees. A significant difference could be observed between the two methods of implant placement (partially and fully guided sequence) in terms of deviation at the entry point, apex and angular deviation. Different levels of quality and quantity of evidence were available for assessing the accuracy of the different computer-assisted implant placement systems. The rapidly evolving field of digital dentistry and new developments will further improve the accuracy of guided implant placement. To draw dependable conclusions and to further evaluate the parameters used for accuracy measurements, randomized, controlled single- or multi-center clinical trials are necessary.

  16. The knowledge of "Facts for Life".

    PubMed

    Alper, Zuleyha; Ozdemir, Hakan; Bilgel, Nazan

    2005-07-01

    "Facts for Life" is an essential tool for saving the lives of children. In this study we wanted to evaluate the knowledge of "Facts for Life" among Turkish women. This is a cross-sectional field study. We used 25 indicator questions to evaluate the knowledge of women in the following main subjects: safe motherhood, childhood immunization, childhood diarrhoea, children's acute respiratory diseases, and household hygiene. We filled out printed questionnaires during face-to-face interviews. For each correct answer we gave 4 points, and the sum of the points was accepted as the knowledge score. Bursa metropolitan area in Turkey. Married women between 15-44 years of age. We selected 1000 of them from the household cards of the health centers that were located at the Bursa metropolitan area by using a random selection method. Mean knowledge score was 72.0 +/- 0.3. About 3/5 had moderate, 1/5 good + very good, and 1/5 bad knowledge scores. Childhood diarrhoea was better known than acute respiratory diseases. The very well known "Facts for Life" were those concerning food and household hygiene. Women's knowledge about "Facts for Life" was at a moderate level. The knowledge level of older women was better than the younger. Some false beliefs still existed. Knowledge about ARI and diarrhoeal diseases in childhood were the least known facts.

  17. Natural selection and self-organization in complex adaptive systems.

    PubMed

    Di Bernardo, Mirko

    2010-01-01

    The central theme of this work is self-organization, "interpreted" both from the point of view of theoretical biology and from a philosophical point of view. By analysing, on the one hand, what are now considered, and not only in the field of physics, some of the most important discoveries, namely complex systems and deterministic chaos, and, on the other hand, the new frontiers of systems biology, this work highlights how large open thermodynamic systems can spontaneously remain in an ordered regime. Such systems can represent the natural source of the order required for stable self-organization, for homoeostasis and for hereditary variations. The order emerging in enormous randomly interconnected nets of binary variables is almost certainly only the precursor of similar orders emerging in all varieties of complex systems. Hence, this work, by finding new foundations for the order pervading the living world, advances the daring hypothesis that Darwinian natural selection is not the only source of order in the biosphere. Thus, by examining the passage from Prigogine's dissipative structures theory to the contemporary theory of biological complexity, the article highlights the development of a coherent and continuous line of research that seeks to identify the general principles marking the profound reality of the mysterious self-organization characterizing the complexity of life.

  18. Women's Perceptions of Reproductive Health in Three Communities around Beirut, Lebanon

    PubMed Central

    Kaddour, Afamia; Hafez, Raghda; Zurayk, Huda

    2006-01-01

    The aim of this study was to elicit definitions of the concept of reproductive health among women in three communities around Beirut, Lebanon, as part of the reproductive health component of a larger Urban Health Study. The communities were characterised by poverty, rural-urban mobility and heterogeneous refugee and migrant populations. A random sample of 1,869 women of reproductive age completed a questionnaire, of whom a sub-sample of 201 women were randomly selected. The women's understanding of good reproductive health included three major themes, which were expressed differently in the three communities. Their understanding included good physical and mental health, and underscored the need for activities promoting health. Their ability to reproduce and raise children, practise family planning and birth spacing, and go through pregnancy and motherhood safely were central to their reproductive duties and their social status. Finally, they saw reproductive health within the context of economic status, good marital relations and strength to cope with their lives. These findings point to the need to situate interventions in the life course of women, their health and that of their husbands and families; the importance of reproduction not only from a health services point of view, but also as regards women's roles and responsibilities within marriage and their families; and taking account of the harsh socio-economic conditions in their communities. © 2005 Reproductive Health Matters. All rights reserved. PMID:16035595

  19. Women's perceptions of reproductive health in three communities around Beirut, Lebanon.

    PubMed

    Kaddour, Afamia; Hafez, Raghda; Zurayk, Huda

    2005-05-01

    The aim of this study was to elicit definitions of the concept of reproductive health among women in three communities around Beirut, Lebanon, as part of the reproductive health component of a larger Urban Health Study. The communities were characterised by poverty, rural-urban mobility and heterogeneous refugee and migrant populations. A random sample of 1,869 women of reproductive age completed a questionnaire, of whom a sub-sample of 201 women were randomly selected. The women's understanding of good reproductive health included three major themes, which were expressed differently in the three communities. Their understanding included good physical and mental health, and underscored the need for activities promoting health. Their ability to reproduce and raise children, practise family planning and birth spacing, and go through pregnancy and motherhood safely were central to their reproductive duties and their social status. Finally, they saw reproductive health within the context of economic status, good marital relations and strength to cope with their lives. These findings point to the need to situate interventions in the life course of women, their health and that of their husbands and families; the importance of reproduction not only from a health services point of view, but also as regards women's roles and responsibilities within marriage and their families; and taking account of the harsh socio-economic conditions in their communities.

  20. Analysis of internal structure changes in black human hair keratin fibers with aging using Raman spectroscopy.

    PubMed

    Kuzuhara, Akio; Fujiwara, Nobuki; Hori, Teruo

    To investigate the internal structure changes in virgin black human hair keratin fibers due to aging, the structure of cross-sections at various depths of virgin black human hair (sections of new growth hair: 2 mm from the scalp) from a group of eight Japanese females in their twenties and another group of eight Japanese females in their fifties was analyzed using Raman spectroscopy. For the first time, we have succeeded in recording the Raman spectra of virgin black human hair, which had previously been impossible due to the high melanin granule content. The key points of this method are to cross-section hair samples to a thickness of 1.50 μm, to select points at various depths of the cortex with the fewest possible melanin granules, and to optimize laser power, cross slit width and total acquisition time. The reproducibility of the Raman bands, namely the alpha-helix (alpha) content, the beta-sheet and/or random coil (beta/R) content, the disulfide (--SS--) content, and the random coil content, of two adjoining cross-sections of a single hair keratin fiber was clearly good. The --SS-- content in the cortex region of virgin black human hair from the Japanese females in their fifties was lower than that of the Japanese females in their twenties. On the other hand, the beta/R and alpha contents of the cortex region did not change.

  1. Training balance with opto-kinetic stimuli in the home: a randomized controlled feasibility study in people with pure cerebellar disease.

    PubMed

    Bunn, Lisa M; Marsden, Jonathan F; Giunti, Paola; Day, Brian L

    2015-02-01

    To investigate the feasibility of a randomized controlled trial of a home-based balance intervention for people with cerebellar ataxia. A randomized controlled trial design. Intervention and assessment took place in the home environment. A total of 12 people with spinocerebellar ataxia type 6 were randomized into a therapy or control group. Both groups received identical assessments at baseline, four and eight weeks. Therapy group participants undertook balance exercises in front of optokinetic stimuli during weeks 4-8, while control group participants received no intervention. Test-retest reliability was analysed from outcome measures collected twice at baseline and four weeks later. Feasibility issues were evaluated using daily diaries and end-of-trial exit interviews. The home-based training intervention with optokinetic stimuli was feasible for people with pure ataxia, with one drop-out. Test-retest reliability was strong (intraclass correlation coefficient >0.7) for selected outcome measures evaluating balance at impairment and activity levels. Some measures revealed trends towards improvement for those in the therapy group. Sample size estimations indicate that Bal-SARA scores could detect a clinically significant change of 0.8 points in this functional balance score if 80 people per group were analysed in future trials. Home-based targeted training of functional balance for people with pure cerebellar ataxia is feasible and the outcome measures employed are reliable. © The Author(s) 2014.

  2. Chinese Herbal Bath Therapy for the Treatment of Knee Osteoarthritis: Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Chen, Bo; Zhan, Hongsheng; Chung, Mei; Lin, Xun; Zhang, Min; Pang, Jian; Wang, Chenchen

    2015-01-01

    Objective. Chinese herbal bath therapy (CHBT) has traditionally been considered to have analgesic and anti-inflammatory effects. We conducted the first meta-analysis evaluating its benefits for patients with knee osteoarthritis (OA). Methods. We searched three English and four Chinese databases through October 2014. Randomized trials evaluating at least 2 weeks of CHBT for knee OA were selected. The effects of CHBT on clinical symptoms included both pain level (via the visual analog scale) and total effectiveness rate, which assessed pain, physical performance, and wellness. We performed random-effects meta-analyses using mean difference. Results. Fifteen studies totaling 1618 subjects met eligibility criteria. Bath prescription included, on average, 13 Chinese herbs with directions to steam and wash around the knee for 20–40 minutes once or twice daily. Mean treatment duration was 3 weeks. Results from meta-analysis showed superior pain improvement (mean difference = −0.59 points; 95% confidence intervals [CI], −0.83 to −0.36; p < 0.00001) and higher total effectiveness rate (risk ratio = 1.21; 95% CI, 1.15 to 1.28; p < 0.00001) when compared with standard western treatment. No serious adverse events were reported. Conclusion. Chinese herbal bath therapy may be a safe, effective, and simple alternative treatment modality for knee OA. Further rigorously designed, randomized trials are warranted. PMID:26483847
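
    The pooled mean difference reported above comes from a random-effects model. One common way to compute such a pooled estimate is the DerSimonian-Laird estimator, sketched here on per-trial mean differences and variances; the abstract does not state which software or estimator the authors used, so this is illustrative only.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate of a mean difference (DerSimonian-Laird).

        `effects` are per-trial mean differences, `variances` their squared
        standard errors. Returns the pooled estimate and a 95% CI.
        """
        y = np.asarray(effects, dtype=float)
        v = np.asarray(variances, dtype=float)
        w = 1.0 / v                                    # fixed-effect weights
        ybar = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - ybar) ** 2)                # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-trial variance
        w_star = 1.0 / (v + tau2)                      # random-effects weights
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, pooled - 1.96 * se, pooled + 1.96 * se
    ```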

  3. Relative Pose Estimation Using Image Feature Triplets

    NASA Astrophysics Data System (ADS)

    Chuang, T. Y.; Rottensteiner, F.; Heipke, C.

    2015-03-01

    A fully automated reconstruction of the trajectory of image sequences using point correspondences has become routine practice. However, there are cases in which point features are hardly detectable, cannot be localized in a stable distribution, and consequently lead to insufficient pose estimation. This paper presents a triplet-wise scheme for calibrated relative pose estimation from image point and line triplets, and investigates the effectiveness of feature integration on the relative pose estimation. To this end, we employ an existing point matching technique and propose a method for line triplet matching in which the relative poses are resolved during the matching procedure. The line matching method establishes hypotheses about potential minimal line matches that can be used to determine the relative orientation (pose) parameters of two images with respect to the reference image, and then quantifies the agreement using the estimated orientation parameters. Rather than randomly choosing the line candidates in the matching process, we generate an associated lookup table to guide the selection of potential line matches. In addition, we integrate the homologous point and line triplets into a common adjustment procedure. To be able to work with image sequences as well, the adjustment is formulated in an incremental manner. The proposed scheme is evaluated with both synthetic and real datasets, demonstrating its satisfactory performance and revealing the effectiveness of image feature integration.

  4. Quantifying the impact of time-varying baseline risk adjustment in the self-controlled risk interval design.

    PubMed

    Li, Lingling; Kulldorff, Martin; Russek-Cohen, Estelle; Kawai, Alison Tse; Hua, Wei

    2015-12-01

    The self-controlled risk interval design is commonly used to assess the association between an acute exposure and an adverse event of interest, implicitly adjusting for fixed, non-time-varying covariates. Explicit adjustment needs to be made for time-varying covariates, for example, age in young children. It can be performed via either a fixed or random adjustment. The random-adjustment approach can provide valid point and interval estimates but requires access to individual-level data for an unexposed baseline sample. The fixed-adjustment approach does not have this requirement and will provide a valid point estimate but may underestimate the variance. We conducted a comprehensive simulation study to evaluate their performance. We designed the simulation study using empirical data from the Food and Drug Administration-sponsored Mini-Sentinel Post-licensure Rapid Immunization Safety Monitoring Rotavirus Vaccines and Intussusception study in children 5-36.9 weeks of age. The time-varying confounder is age. We considered a variety of design parameters including sample size, relative risk, time-varying baseline risks, and risk interval length. The random-adjustment approach has very good performance in almost all considered settings. The fixed-adjustment approach can be used as a good alternative when the number of events used to estimate the time-varying baseline risks is at least the number of events used to estimate the relative risk, which is almost always the case. We successfully identified settings in which the fixed-adjustment approach can be used as a good alternative and provided guidelines on the selection and implementation of appropriate analyses for the self-controlled risk interval design. Copyright © 2015 John Wiley & Sons, Ltd.

  5. Bevacizumab combined with chemotherapy for platinum-resistant recurrent ovarian cancer: The AURELIA open-label randomized phase III trial.

    PubMed

    Pujade-Lauraine, Eric; Hilpert, Felix; Weber, Béatrice; Reuss, Alexander; Poveda, Andres; Kristensen, Gunnar; Sorio, Roberto; Vergote, Ignace; Witteveen, Petronella; Bamias, Aristotelis; Pereira, Deolinda; Wimberger, Pauline; Oaknin, Ana; Mirza, Mansoor Raza; Follana, Philippe; Bollag, David; Ray-Coquard, Isabelle

    2014-05-01

    In platinum-resistant ovarian cancer (OC), single-agent chemotherapy is standard. Bevacizumab is active alone and in combination. AURELIA is the first randomized phase III trial to our knowledge combining bevacizumab with chemotherapy in platinum-resistant OC. Eligible patients had measurable/assessable OC that had progressed < 6 months after completing platinum-based therapy. Patients with refractory disease, history of bowel obstruction, or more than two prior anticancer regimens were ineligible. After investigators selected chemotherapy (pegylated liposomal doxorubicin, weekly paclitaxel, or topotecan), patients were randomly assigned to single-agent chemotherapy alone or with bevacizumab (10 mg/kg every 2 weeks or 15 mg/kg every 3 weeks) until progression, unacceptable toxicity, or consent withdrawal. Crossover to single-agent bevacizumab was permitted after progression with chemotherapy alone. The primary end point was progression-free survival (PFS) by RECIST. Secondary end points included objective response rate (ORR), overall survival (OS), safety, and patient-reported outcomes. The PFS hazard ratio (HR) after PFS events in 301 of 361 patients was 0.48 (95% CI, 0.38 to 0.60; unstratified log-rank P < .001). Median PFS was 3.4 months with chemotherapy alone versus 6.7 months with bevacizumab-containing therapy. RECIST ORR was 11.8% versus 27.3%, respectively (P = .001). The OS HR was 0.85 (95% CI, 0.66 to 1.08; P = .174; median OS, 13.3 v 16.6 months, respectively). Grade ≥ 2 hypertension and proteinuria were more common with bevacizumab. GI perforation occurred in 2.2% of bevacizumab-treated patients. Adding bevacizumab to chemotherapy statistically significantly improved PFS and ORR; the OS trend was not significant. No new safety signals were observed.

  6. Assessing relative abundance and reproductive success of shrubsteppe raptors

    USGS Publications Warehouse

    Lehman, Robert N.; Carpenter, L.B.; Steenhof, Karen; Kochert, Michael N.

    1998-01-01

    From 1991-1994, we quantified relative abundance and reproductive success of the Ferruginous Hawk (Buteo regalis), Northern Harrier (Circus cyaneus), Burrowing Owl (Speotyto cunicularia), and Short-eared Owl (Asio flammeus) on the shrubsteppe plateaus (benchlands) in and near the Snake River Birds of Prey National Conservation Area in southwestern Idaho. To assess relative abundance, we searched randomly selected plots using four sampling methods: point counts, line transects, and quadrats of two sizes. On a per-sampling-effort basis, transects were slightly more effective than point counts and quadrats for locating raptor nests (3.4 pairs detected/100 h of effort vs. 2.2-3.1 pairs). Random sampling using quadrats failed to detect a Short-eared Owl population increase from 1993 to 1994. To evaluate nesting success, we tried to determine reproductive outcome for all nesting attempts located during random, historical, and incidental nest searches. We compared nesting success estimates based on all nesting attempts, on attempts found during incubation, and on the Mayfield model. Most pairs used to evaluate success were pairs found incidentally. Visits to historical nesting areas yielded the highest number of pairs per sampling effort (14.6/100 h), but reoccupancy rates for most species decreased through time. Estimates based on all attempts had the highest sample sizes but probably overestimated success for all species except the Ferruginous Hawk. Estimates of success based on nesting attempts found during incubation had the lowest sample sizes. All three methods yielded biased nesting success estimates for the Northern Harrier and Short-eared Owl. The estimate based on pairs found during incubation probably provided the least biased estimate for the Burrowing Owl. Assessments of nesting success were hindered by difficulties in confirming egg laying and nesting success for all species except the Ferruginous Hawk.

  7. The performance of sample selection estimators to control for attrition bias.

    PubMed

    Grasdal, A

    2001-07-01

    Sample attrition is a potential source of selection bias in experimental as well as non-experimental programme evaluation. For labour market outcomes, such as employment status and earnings, missing data problems caused by attrition can be circumvented by collecting follow-up data from administrative registers. For most non-labour market outcomes, however, investigators must rely on participants' willingness to co-operate in keeping detailed follow-up records, and on statistical correction procedures to identify and adjust for attrition bias. This paper combines survey and register data from a Norwegian randomized field trial to evaluate the performance of parametric and semi-parametric sample selection estimators commonly used to correct for attrition bias. The estimators considered work well in terms of producing point estimates of treatment effects close to the experimental benchmark estimates. Results are sensitive to exclusion restrictions. The analysis also demonstrates an inherent paradox in the 'common support' approach, which prescribes excluding from the analysis observations outside the common support of the selection probability. The more important treatment status is as a determinant of attrition, the larger the proportion of treated observations whose selection probability falls outside the range for which comparison with untreated counterparts is possible. Copyright 2001 John Wiley & Sons, Ltd.

  8. New milk protein-derived peptides with potential antimicrobial activity: an approach based on bioinformatic studies.

    PubMed

    Dziuba, Bartłomiej; Dziuba, Marta

    2014-08-20

    New peptides with potential antimicrobial activity, encrypted in milk protein sequences, were searched for with the use of bioinformatic tools. The major milk proteins were hydrolyzed in silico by 28 enzymes. The obtained peptides were characterized by the following parameters: molecular weight, isoelectric point, composition and number of amino acid residues, net charge at pH 7.0, aliphatic index, instability index, Boman index, and GRAVY index, and compared with those calculated for 416 known antimicrobial peptides, including 59 antimicrobial peptides (AMPs) from milk proteins listed in the BIOPEP database. A simple analysis of physico-chemical properties and the values of biological activity indicators were insufficient to select potentially antimicrobial peptides released in silico from milk proteins by proteolytic enzymes. The final selection was made based on the results of multidimensional statistical analysis such as support vector machines (SVM), random forest (RF), artificial neural networks (ANN) and discriminant analysis (DA) available in the Collection of Anti-Microbial Peptides (CAMP database). Eleven new peptides with potential antimicrobial activity were selected from all peptides released during in silico proteolysis of milk proteins.

  9. New Milk Protein-Derived Peptides with Potential Antimicrobial Activity: An Approach Based on Bioinformatic Studies

    PubMed Central

    Dziuba, Bartłomiej; Dziuba, Marta

    2014-01-01

    New peptides with potential antimicrobial activity, encrypted in milk protein sequences, were searched for with the use of bioinformatic tools. The major milk proteins were hydrolyzed in silico by 28 enzymes. The obtained peptides were characterized by the following parameters: molecular weight, isoelectric point, composition and number of amino acid residues, net charge at pH 7.0, aliphatic index, instability index, Boman index, and GRAVY index, and compared with those calculated for 416 known antimicrobial peptides, including 59 antimicrobial peptides (AMPs) from milk proteins listed in the BIOPEP database. A simple analysis of physico-chemical properties and the values of biological activity indicators were insufficient to select potentially antimicrobial peptides released in silico from milk proteins by proteolytic enzymes. The final selection was made based on the results of multidimensional statistical analysis such as support vector machines (SVM), random forest (RF), artificial neural networks (ANN) and discriminant analysis (DA) available in the Collection of Anti-Microbial Peptides (CAMP database). Eleven new peptides with potential antimicrobial activity were selected from all peptides released during in silico proteolysis of milk proteins. PMID:25141106
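
    Several of the descriptors listed in these records can be computed for a candidate peptide with Biopython's ProtParam module. A brief illustration (the peptide shown is a hypothetical milk-protein fragment; net charge at pH 7.0 and the Boman index are not covered by this snippet):

    ```python
    # Requires Biopython (pip install biopython).
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    peptide = "VLPVPQK"  # hypothetical casein-derived fragment, for illustration
    pa = ProteinAnalysis(peptide)
    print("Molecular weight: ", pa.molecular_weight())
    print("Isoelectric point:", pa.isoelectric_point())
    print("Instability index:", pa.instability_index())
    print("GRAVY index:      ", pa.gravy())
    print("Aromaticity:      ", pa.aromaticity())
    ```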

  10. Population genetics and molecular evolution of DNA sequences in transposable elements. I. A simulation framework.

    PubMed

    Kijima, T E; Innan, Hideki

    2013-11-01

    A population genetic simulation framework is developed to understand the behavior and molecular evolution of DNA sequences of transposable elements. Our model incorporates random transposition and excision of transposable element (TE) copies, two modes of selection against TEs, and degeneration of transpositional activity by point mutations. We first investigated the relationships between the behavior of the copy number of TEs and these parameters. Our results show that when selection is weak, the genome can maintain a relatively large number of TEs, but most of them are less active. In contrast, with strong selection, the genome can maintain only a limited number of TEs but the proportion of active copies is large. In such a case, there could be substantial fluctuations of the copy number over generations. We also explored how DNA sequences of TEs evolve through the simulations. In general, active copies form clusters around the original sequence, while less active copies have long branches specific to themselves, exhibiting a star-shaped phylogeny. It is demonstrated that the phylogeny of TE sequences could be informative for understanding the dynamics of TE evolution.
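
    One simulated generation of such a model can be compressed into a few lines. The sketch below mirrors the four ingredients named above (transposition, excision, copy-number selection, and inactivation by point mutation) for a single haploid lineage; parameter names and the simplified setup are ours, not the paper's framework:

    ```python
    import numpy as np

    def te_generation(copies, u_trans, u_exc, s_per_copy, mu_inactivate, rng):
        """Advance one generation; `copies` is boolean (True = active copy)."""
        n = len(copies)
        if rng.random() > np.exp(-s_per_copy * n):        # copy-number selection
            return None                                   # lineage removed
        copies = copies[rng.random(n) > u_exc]            # random excision
        new = rng.binomial(copies.sum(), u_trans)         # active copies transpose
        copies = np.concatenate([copies, np.ones(new, dtype=bool)])
        copies &= rng.random(len(copies)) > mu_inactivate  # activity degenerates
        return copies

    rng = np.random.default_rng(1)
    genome = np.ones(10, dtype=bool)   # start from 10 active copies
    for _ in range(200):
        genome = te_generation(genome, 0.05, 0.01, 0.002, 0.02, rng)
        if genome is None:
            break
    ```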

  11. Students perception on the usage of PowerPoint in learning calculus

    NASA Astrophysics Data System (ADS)

    Othman, Zarith Sofiah; Tarmuji, Nor Habibah; Hilmi, Zulkifli Ab Ghani

    2017-04-01

    Mathematics is a core subject in most science and technology courses and in some social sciences programs. However, the low achievement of students in the subject, especially in topics such as differentiation and integration, is a persistent issue. Many factors contribute to the low performance, such as motivation, environment, method of learning, and academic background. The purpose of this paper is to determine students' perception of learning mathematics using PowerPoint on integration concepts at the undergraduate level with respect to mathematics anxiety, learning enjoyment, mobility and learning satisfaction. The main content of the PowerPoint presentation focused on the integration method, with historical elements as an added value. The study was conducted on 48 randomly selected students from the computer and applied sciences programs as the experimental group. Questionnaires were distributed to students to explore their learning experiences. Another 51 students who were taught using the traditional chalkboard method served as the control group. Both groups were given a test on integration. The statistical methods used were descriptive statistics and an independent-samples t-test between the experimental and the control group. The findings showed that most students responded positively to the PowerPoint presentations with respect to mobility and learning satisfaction. The experimental group performed better than the control group.
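
    The group comparison reduces to an independent-samples t-test on the integration test scores. A sketch with synthetic score vectors, since the raw data are not given (the Welch unequal-variance variant is our choice):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for the two groups' test scores (48 vs. 51 students).
    powerpoint_scores = rng.normal(70, 10, size=48)   # experimental group
    chalkboard_scores = rng.normal(64, 10, size=51)   # control group

    t, p = stats.ttest_ind(powerpoint_scores, chalkboard_scores, equal_var=False)
    print(f"Welch's t = {t:.2f}, p = {p:.4f}")
    ```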

  12. Thermodynamics of strain-induced crystallization of random copolymers.

    PubMed

    Nie, Yijing; Gao, Huanhuan; Wu, Yixian; Hu, Wenbing

    2014-01-14

    Industrial semi-crystalline polymers contain various kinds of sequence defects, which behave like non-crystallizable comonomer units on random copolymers. We performed dynamic Monte Carlo simulations of strain-induced crystallization of random copolymers with various contents of comonomers at high temperatures. We observed that the onset strains of crystallization shift up with increasing comonomer content and temperature. The behavior is predicted well by combining Flory's theory of the melting-point depression of random copolymers with his theory of the melting-point elevation under strain-induced crystallization. Our thermodynamic results are fundamentally important for understanding rubber strain-hardening, plastic molding, film stretching and fiber spinning.
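
    The two Flory results being combined can be written schematically as below, where \Delta H_u is the heat of fusion per crystallizable unit, X_A the mole fraction of crystallizable units, and f(\lambda) an increasing function of the extension ratio \lambda; the symbols are our shorthand for context, not the paper's notation.

    ```latex
    % Melting-point depression of a random copolymer (Flory):
    \frac{1}{T_m} - \frac{1}{T_m^{0}} = -\frac{R}{\Delta H_u}\,\ln X_A
    % Melting-point elevation under strain (strain-induced crystallization):
    \frac{1}{T_m(\lambda)} = \frac{1}{T_m^{0}} - \frac{R}{\Delta H_u}\, f(\lambda)
    ```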

  13. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685
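
    For context, the neutral-model benchmark against which such results are usually read is Watterson's expectation for the number of segregating sites S in a sample of n genes (a standard result, not derived in this paper); the abstract's point is that balancing selection with low inter-allelic mutation rates pushes S well above it.

    ```latex
    \mathbb{E}[S] = \theta \sum_{i=1}^{n-1} \frac{1}{i}, \qquad \theta = 4 N_e \mu
    ```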

  14. Integrating SAS and GIS software to improve habitat-use estimates from radiotelemetry data

    USGS Publications Warehouse

    Kenow, K.P.; Wright, R.G.; Samuel, M.D.; Rasmussen, P.W.

    2001-01-01

    Radiotelemetry has commonly been used to remotely determine habitat use by a variety of wildlife species. However, habitat misclassification can occur because the true location of a radiomarked animal can only be estimated. Analytical methods that provide improved estimates of habitat use from radiotelemetry location data using a subsampling approach have been proposed previously. We developed software, based on these methods, to conduct improved habitat-use analyses. A Statistical Analysis System (SAS)-executable file generates a random subsample of points from the error distribution of an estimated animal location and formats the output into ARC/INFO-compatible coordinate and attribute files. An associated ARC/INFO Arc Macro Language (AML) creates a coverage of the random points, determines the habitat type at each random point from an existing habitat coverage, sums the number of subsample points by habitat type for each location, and outputs the results in ASCII format. The proportion and precision of habitat types used are calculated from the subsample of points generated for each radiotelemetry location. We illustrate the method and software by analysis of radiotelemetry data for a female wild turkey (Meleagris gallopavo).
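
    The subsampling step itself is easy to express: draw many random points from the error distribution around each estimated location and tally the habitat class at each draw. A sketch assuming a bivariate-normal error model and a black-box habitat lookup (the actual pipeline used a SAS program and ARC/INFO AML, not Python):

    ```python
    import numpy as np

    def habitat_use_proportions(est_xy, cov_xy, habitat_at, n_sub=500, seed=None):
        """Estimate habitat-use proportions for one telemetry location.

        est_xy: estimated (x, y) animal location; cov_xy: its 2x2 error
        covariance; habitat_at(x, y): habitat class from a GIS layer.
        The bivariate-normal error model is our assumption for illustration.
        """
        rng = np.random.default_rng(seed)
        pts = rng.multivariate_normal(est_xy, cov_xy, size=n_sub)
        counts = {}
        for x, y in pts:
            h = habitat_at(x, y)
            counts[h] = counts.get(h, 0) + 1
        return {h: c / n_sub for h, c in counts.items()}
    ```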

  15. Renormalized Energy Concentration in Random Matrices

    NASA Astrophysics Data System (ADS)

    Borodin, Alexei; Serfaty, Sylvia

    2013-05-01

    We define a "renormalized energy" as an explicit functional on arbitrary point configurations of constant average density in the plane and on the real line. The definition is inspired by ideas of Sandier and Serfaty (From the Ginzburg-Landau model to vortex lattice problems, 2012; 1D log-gases and the renormalized energy, 2013). Roughly speaking, it is obtained by subtracting two leading terms from the Coulomb potential on a growing number of charges. The functional is expected to be a good measure of disorder of a configuration of points. We give certain formulas for its expectation for general stationary random point processes. For the random matrix β-sine processes on the real line (β = 1, 2, 4), and for the Ginibre point process and the process of zeros of Gaussian analytic functions in the plane, we compute the expectation explicitly. Moreover, we prove that for these processes the variance of the renormalized energy vanishes, which shows concentration near the expected value. We also prove that the β = 2 sine process minimizes the renormalized energy in the class of determinantal point processes with translation invariant correlation kernels.

  16. Effects of Selected Meditative Asanas on Kinaesthetic Perception and Speed of Movement

    ERIC Educational Resources Information Center

    Singh, Kanwaljeet; Bal, Baljinder S.; Deol, Nishan S.

    2009-01-01

    Study aim: To assess the effects of selected meditative "asanas" on kinesthetic perception and movement speed. Material and methods: Thirty randomly selected male students aged 18-24 years volunteered to participate in the study. They were randomly assigned into two groups: A (meditative) and B (control). The Nelson's movement speed and…

  17. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  18. Random one-of-N selector

    DOEpatents

    Kronberg, J.W.

    1993-04-20

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.

  19. Random one-of-N selector

    DOEpatents

    Kronberg, James W.

    1993-01-01

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
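
    The selection logic in both records reduces to sampling a fast free-running modulo-N counter at an unpredictable instant: the item is selected iff the counter is caught at zero, which happens with probability roughly 1/N. A software analogue, with the tick count standing in for the analog timing jitter the patent derives from temperature-sensitive components and human variability:

    ```python
    import random

    def one_of_n_selected(n, dwell_ticks):
        """The counter cycles 0..N-1 continuously; when stopped after an
        unpredictable number of ticks, it reads zero with probability ~1/N."""
        return dwell_ticks % n == 0

    # dwell_ticks stands in for the unpredictable analog dwell of the circuit
    trials = 12_000
    hits = sum(one_of_n_selected(12, random.randrange(10**6)) for _ in range(trials))
    print(f"{hits} selections in {trials} trials (expect about {trials // 12})")
    ```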

  20. Trends of overweight and obesity among children in Tijuana, Mexico.

    PubMed

    Bacardi-Gascón, M; Jiménez-Cruz, A; Jones, E; Velasquez Perez, I; Loaiza Martinez, J A

    2009-01-01

    The objectives of this study were to compare the trends of obesity from 2001-02 to 2006-07 in school children of Tijuana, Mexico and to investigate the relationship with the child's gender and type of school attended. A two-stage random sample was selected by clusters of schools and groups. Results for the 1684 children from 6-14 years of age assessed showed an overall prevalence of obesity (>95th percentile) of 28%, and an overall increase in overweight and obesity of 7 percentage points (p = 0.0003), from 41% to 48%, which was higher among boys and younger girls. The prevalence of obesity was higher among boys and children from private schools. Copyright © Taylor & Francis Group, LLC

  1. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307

  2. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models.
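
    As a minimal stand-in for the variable importance (VI) step, scikit-learn's random forest exposes impurity-based importances that can rank and prune predictors; the study's AVI, KIAVI, Boruta and RRF procedures are more elaborate, so this is illustrative only.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def select_by_importance(X, y, keep=10, n_estimators=500, seed=0):
        """Rank predictors by RF importance and keep the top `keep` of them."""
        rf = RandomForestClassifier(n_estimators=n_estimators, random_state=seed)
        rf.fit(X, y)
        order = np.argsort(rf.feature_importances_)[::-1]
        return order[:keep]   # column indices of the most important predictors

    # Typical use: refit the model on X[:, selected] and compare accuracy.
    ```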

  3. Longitudinal analyses of correlated response efficiencies of fillet traits in Nile tilapia.

    PubMed

    Turra, E M; Fernandes, A F A; de Alvarenga, E R; Teixeira, E A; Alves, G F O; Manduca, L G; Murphy, T W; Silva, M A

    2018-03-01

    Recent studies with Nile tilapia have shown divergent results regarding the possibility of selecting on morphometric measurements to promote indirect genetic gains in fillet yield (FY). The use of indirect selection for fillet traits is important as these traits are only measurable after harvesting. Random regression models are a powerful tool in association studies to identify the best time point to measure and select animals. Random regression models can also be applied in a multiple trait approach to analyze indirect response to selection, which would avoid the need to sacrifice candidate fish. Therefore, the aim of this study was to investigate the genetic relationships between several body measurements, weight and fillet traits throughout the growth period and to evaluate the possibility of indirect selection for fillet traits in Nile tilapia. Data were collected from 2042 fish and were divided into two subsets. The first subset was used to estimate genetic parameters, including the permanent environmental effect for BW and body measurements (8758 records for each body measurement, as each fish was individually weighed and measured a maximum of six times). The second subset (2042 records for each trait) was used to estimate genetic correlations and heritabilities, which enabled the calculation of correlated response efficiencies between body measurements and the fillet traits. Heritability estimates across ages ranged from 0.05 to 0.5 for height, 0.02 to 0.48 for corrected length (CL), 0.05 to 0.68 for width, 0.08 to 0.57 for fillet weight (FW) and 0.12 to 0.42 for FY. All genetic correlation estimates between body measurements and FW were positive and strong (0.64 to 0.98). The estimates of genetic correlation between body measurements and FY were positive (except for CL at some ages), but weak to moderate (-0.08 to 0.68). These estimates resulted in strong and favorable correlated response efficiencies for FW, and positive but moderate efficiencies for FY. These results indicate the possibility of achieving indirect genetic gains for FW by selecting for morphometric traits, but low efficiency for FY when compared with direct selection.
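
    The efficiency comparison above follows the standard quantitative-genetics expression for correlated response to selection; the abstract does not spell out its exact formulation, so this is the textbook version:

```latex
\[
\frac{CR_Y}{R_Y} \;=\; \frac{i_X\, r_A\, h_X}{i_Y\, h_Y}
\;=\; r_A\,\frac{h_X}{h_Y} \quad (\text{for equal selection intensities, } i_X = i_Y),
\]
```

    where \(CR_Y\) is the correlated response in the target trait \(Y\) (e.g., FW) when selecting on trait \(X\) (a body measurement), \(R_Y\) the direct response, \(r_A\) the genetic correlation, and \(h_X, h_Y\) the square roots of the heritabilities. With illustrative values \(r_A = 0.9\), \(h_X^2 = 0.36\) and \(h_Y^2 = 0.25\), the efficiency is \(0.9 \times 0.6 / 0.5 = 1.08\), i.e., indirect selection slightly outperforms direct selection.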

  4. Population differentiation in Pacific salmon: local adaptation, genetic drift, or the environment?

    USGS Publications Warehouse

    Adkison, Milo D.

    1995-01-01

    Morphological, behavioral, and life-history differences between Pacific salmon (Oncorhynchus spp.) populations are commonly thought to reflect local adaptation, and it is likewise common to assume that salmon populations separated by small distances are locally adapted. Two alternatives to local adaptation exist: random genetic differentiation owing to genetic drift and founder events, and genetic homogeneity among populations, in which differences reflect differential trait expression in differing environments. Population genetics theory and simulations suggest that both alternatives are possible. With selectively neutral alleles, genetic drift can result in random differentiation despite many strays per generation. Even weak selection can prevent genetic drift in stable populations; however, founder effects can result in random differentiation despite selective pressures. Overlapping generations reduce the potential for random differentiation. Genetic homogeneity can occur despite differences in selective regimes when straying rates are high. In sum, localized differences in selection should not always result in local adaptation. Local adaptation is favored when population sizes are large and stable, selection is consistent over large areas, selective differentials are large, and straying rates are neither too high nor too low. Consideration of alternatives to local adaptation would improve both biological research and salmon conservation efforts.
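
    The drift-versus-selection argument above is easy to reproduce in a toy Wright-Fisher model; the two-deme setup and all parameter values below are illustrative assumptions, not the simulations reported in the paper.

```python
# Toy two-deme Wright-Fisher model of drift versus straying (migration).
import numpy as np

def simulate(N=500, m=0.02, s=0.0, generations=200, seed=1):
    """Allele frequencies in two demes exchanging a fraction m of migrants
    ('strays') per generation; s is a weak directional selection coefficient."""
    rng = np.random.default_rng(seed)
    p = np.array([0.5, 0.5])
    for _ in range(generations):
        p_mig = (1 - m) * p + m * p[::-1]          # straying homogenizes demes
        p_sel = p_mig * (1 + s) / (1 + s * p_mig)  # selection favors the allele
        p = rng.binomial(2 * N, p_sel) / (2 * N)   # drift: binomial sampling
    return p

print(simulate(s=0.0))   # neutral: demes can drift apart despite straying
print(simulate(s=0.05))  # even weak selection pushes both demes the same way
```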

  5. Effect of darapladib on major coronary events after an acute coronary syndrome: the SOLID-TIMI 52 randomized clinical trial.

    PubMed

    O'Donoghue, Michelle L; Braunwald, Eugene; White, Harvey D; Steen, Dylan P; Lukas, Mary Ann; Tarka, Elizabeth; Steg, P Gabriel; Hochman, Judith S; Bode, Christoph; Maggioni, Aldo P; Im, KyungAh; Shannon, Jennifer B; Davies, Richard Y; Murphy, Sabina A; Crugnale, Sharon E; Wiviott, Stephen D; Bonaca, Marc P; Watson, David F; Weaver, W Douglas; Serruys, Patrick W; Cannon, Christopher P; Steen, Dylan L

    2014-09-10

    Lipoprotein-associated phospholipase A2 (Lp-PLA2) has been hypothesized to be involved in atherogenesis through pathways related to inflammation. Darapladib is an oral, selective inhibitor of the Lp-PLA2 enzyme. To evaluate the efficacy and safety of darapladib in patients after an acute coronary syndrome (ACS) event. SOLID-TIMI 52 was a multinational, double-blind, placebo-controlled trial that randomized 13,026 participants within 30 days of hospitalization with an ACS (non-ST-elevation or ST-elevation myocardial infarction [MI]) at 868 sites in 36 countries. Patients were randomized to either once-daily darapladib (160 mg) or placebo on a background of guideline-recommended therapy. Patients were followed up for a median of 2.5 years between December 7, 2009, and December 6, 2013. The primary end point (major coronary events) was the composite of coronary heart disease (CHD) death, MI, or urgent coronary revascularization for myocardial ischemia. Kaplan-Meier event rates are reported at 3 years. During a median duration of 2.5 years, the primary end point occurred in 903 patients in the darapladib group and 910 in the placebo group (16.3% vs 15.6% at 3 years; hazard ratio [HR], 1.00 [95% CI, 0.91-1.09]; P = .93). The composite of cardiovascular death, MI, or stroke occurred in 824 in the darapladib group and 838 in the placebo group (15.0% vs 15.0% at 3 years; HR, 0.99 [95% CI, 0.90-1.09]; P = .78). There were no differences between the treatment groups for additional secondary end points, for individual components of the primary end point, or in all-cause mortality (371 events in the darapladib group and 395 in the placebo group [7.3% vs 7.1% at 3 years; HR, 0.94 [95% CI, 0.82-1.08]; P = .40). Patients were more likely to report an odor-related concern in the darapladib group vs the placebo group (11.5% vs 2.5%) and also more likely to report diarrhea (10.6% vs 5.6%). In patients who experienced an ACS event, direct inhibition of Lp-PLA2 with darapladib added to optimal medical therapy and initiated within 30 days of hospitalization did not reduce the risk of major coronary events. clinicaltrials.gov Identifier: NCT01000727.

  6. SIRFLOX: Randomized Phase III Trial Comparing First-Line mFOLFOX6 (Plus or Minus Bevacizumab) Versus mFOLFOX6 (Plus or Minus Bevacizumab) Plus Selective Internal Radiation Therapy in Patients With Metastatic Colorectal Cancer.

    PubMed

    van Hazel, Guy A; Heinemann, Volker; Sharma, Navesh K; Findlay, Michael P N; Ricke, Jens; Peeters, Marc; Perez, David; Robinson, Bridget A; Strickland, Andrew H; Ferguson, Tom; Rodríguez, Javier; Kröning, Hendrik; Wolf, Ido; Ganju, Vinod; Walpole, Euan; Boucher, Eveline; Tichler, Thomas; Shacham-Shmueli, Einat; Powell, Alex; Eliadis, Paul; Isaacs, Richard; Price, David; Moeslein, Fred; Taieb, Julien; Bower, Geoff; Gebski, Val; Van Buskirk, Mark; Cade, David N; Thurston, Kenneth; Gibbs, Peter

    2016-05-20

    SIRFLOX was a randomized, multicenter trial designed to assess the efficacy and safety of adding selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres to standard fluorouracil, leucovorin, and oxaliplatin (FOLFOX)-based chemotherapy in patients with previously untreated metastatic colorectal cancer. Chemotherapy-naïve patients with liver metastases plus or minus limited extrahepatic metastases were randomly assigned to receive either modified FOLFOX (mFOLFOX6; control) or mFOLFOX6 plus SIRT (SIRT) plus or minus bevacizumab. The primary end point was progression-free survival (PFS) at any site as assessed by independent centralized radiology review blinded to study arm. Between October 2006 and April 2013, 530 patients were randomly assigned to treatment (control, 263; SIRT, 267). Median PFS at any site was 10.2 v 10.7 months in control versus SIRT (hazard ratio, 0.93; 95% CI, 0.77 to 1.12; P = .43). Median PFS in the liver by competing risk analysis was 12.6 v 20.5 months in control versus SIRT (hazard ratio, 0.69; 95% CI, 0.55 to 0.90; P = .002). Objective response rates (ORRs) at any site were similar (68.1% v 76.4% in control v SIRT; P = .113). ORR in the liver was improved with the addition of SIRT (68.8% v 78.7% in control v SIRT; P = .042). Grade ≥ 3 adverse events, including recognized SIRT-related effects, were reported in 73.4% and 85.4% of patients in control versus SIRT. The addition of SIRT to FOLFOX-based first-line chemotherapy in patients with liver-dominant or liver-only metastatic colorectal cancer did not improve PFS at any site but significantly delayed disease progression in the liver. The safety profile was as expected and was consistent with previous studies. © 2016 by American Society of Clinical Oncology.

  7. Evaluation of Electromyographic Biofeedback for the Quadriceps Femoris: A Systematic Review

    PubMed Central

    Wasielewski, Noah J.; Parker, Tonya M.; Kotsko, Kevin M.

    2011-01-01

    Objective: To critically review evidence for the effectiveness of electromyographic biofeedback (EMGB) of the quadriceps femoris muscle in treating various knee conditions. Data Sources: Databases used to locate randomized controlled trials included PubMed (1980–2010), Cumulative Index of Nursing and Allied Health Literature (CINAHL, 1995–2007), Web of Science (1986–2010), SPORTDiscus (1990–2007), and Physiotherapy Evidence Database (PEDro). Key words were knee and biofeedback. Study Selection: The criteria for selection were clinical randomized controlled trials in which EMGB of the quadriceps femoris was used for various knee conditions of musculoskeletal origin. Trials were excluded because of research designs other than randomized controlled trials, articles published in a non-English language, inclusion of healthy research participants, inability to identify EMGB as the source of clinical improvement, and lack of pain, functional outcome, or quadriceps torque as outcome measures. Data Extraction: Twenty specific data points were abstracted from each clinical trial under the broad categories of attributes of the patient and injury, treatment variables for the EMGB group, treatment variables for the control group, and attributes of the research design. Data Synthesis: Eight trials yielded a total of 319 participants with patellofemoral pain syndrome (n = 86), anterior cruciate ligament reconstruction (n = 52), arthroscopic surgery (n = 91), or osteoarthritis (n = 90). The average methodologic score of the included studies was 4.6/10 based on PEDro criteria. Pooled analyses demonstrated heterogeneity of the included studies, rendering the interpretation of the pooled data inappropriate. The EMGB appeared to benefit short-term postsurgical pain or quadriceps strength in 3 of 4 postsurgical investigations but was ineffective for chronic knee conditions such as patellofemoral pain and osteoarthritis in all 4 studies. Because the findings are based on limited data, caution is warranted until more randomized controlled trials are conducted to support or refute the general trends observed in this report. PMID:22488142

  8. Phase III, Randomized, Double-Blind, Multicenter Trial Comparing Orteronel (TAK-700) Plus Prednisone With Placebo Plus Prednisone in Patients With Metastatic Castration-Resistant Prostate Cancer That Has Progressed During or After Docetaxel-Based Therapy: ELM-PC 5

    PubMed Central

    Fizazi, Karim; Jones, Robert; Oudard, Stephane; Efstathiou, Eleni; Saad, Fred; de Wit, Ronald; De Bono, Johann; Cruz, Felipe Melo; Fountzilas, George; Ulys, Albertas; Carcano, Flavio; Agarwal, Neeraj; Agus, David; Bellmunt, Joaquim; Petrylak, Daniel P.; Lee, Shih-Yuan; Webb, Iain J.; Tejura, Bindu; Borgstein, Niels; Dreicer, Robert

    2015-01-01

    Purpose Orteronel (TAK-700) is an investigational, nonsteroidal, reversible, selective 17,20-lyase inhibitor. This study examined orteronel in patients with metastatic castration-resistant prostate cancer that progressed after docetaxel therapy. Patients and Methods In our study, 1,099 men were randomly assigned in a 2:1 schedule to receive orteronel 400 mg plus prednisone 5 mg twice daily or placebo plus prednisone 5 mg twice daily, stratified by region (Europe, North America [NA], and non-Europe/NA) and Brief Pain Inventory–Short Form worst pain score. Primary end point was overall survival (OS). Key secondary end points (radiographic progression-free survival [rPFS], ≥ 50% decrease of prostate-specific antigen [PSA50], and pain response at 12 weeks) were to undergo statistical testing only if the primary end point analysis was significant. Results The study was unblinded after crossing a prespecified OS futility boundary. The median OS was 17.0 months versus 15.2 months with orteronel-prednisone versus placebo-prednisone (hazard ratio [HR], 0.886; 95% CI, 0.739 to 1.062; P = .190). Improved rPFS was observed with orteronel-prednisone (median, 8.3 v 5.7 months; HR, 0.760; 95% CI, 0.653 to 0.885; P < .001). Orteronel-prednisone showed advantages over placebo-prednisone in PSA50 rate (25% v 10%, P < .001) and time to PSA progression (median, 5.5 v 2.9 months, P < .001) but not pain response rate (12% v 9%; P = .128). Adverse events (all grades) were generally more frequent with orteronel-prednisone, including nausea (42% v 26%), vomiting (36% v 17%), fatigue (29% v 23%), and increased amylase (14% v 2%). Conclusion Our study did not meet the primary end point of OS. Longer rPFS and a higher PSA50 rate with orteronel-prednisone indicate antitumor activity. PMID:25624429

  9. Alpha emitter radium-223 and survival in metastatic prostate cancer.

    PubMed

    Parker, C; Nilsson, S; Heinrich, D; Helle, S I; O'Sullivan, J M; Fosså, S D; Chodacki, A; Wiechno, P; Logue, J; Seke, M; Widmark, A; Johannessen, D C; Hoskin, P; Bottomley, D; James, N D; Solberg, A; Syndikus, I; Kliment, J; Wedel, S; Boehmer, S; Dall'Oglio, M; Franzén, L; Coleman, R; Vogelzang, N J; O'Bryan-Tear, C G; Staudacher, K; Garcia-Vargas, J; Shan, M; Bruland, Ø S; Sartor, O

    2013-07-18

    Radium-223 dichloride (radium-223), an alpha emitter, selectively targets bone metastases with alpha particles. We assessed the efficacy and safety of radium-223 as compared with placebo, in addition to the best standard of care, in men with castration-resistant prostate cancer and bone metastases. In our phase 3, randomized, double-blind, placebo-controlled study, we randomly assigned 921 patients who had received, were not eligible to receive, or declined docetaxel, in a 2:1 ratio, to receive six injections of radium-223 (at a dose of 50 kBq per kilogram of body weight intravenously) or matching placebo; one injection was administered every 4 weeks. In addition, all patients received the best standard of care. The primary end point was overall survival. The main secondary efficacy end points included time to the first symptomatic skeletal event and various biochemical end points. A prespecified interim analysis, conducted when 314 deaths had occurred, assessed the effect of radium-223 versus placebo on survival. An updated analysis, when 528 deaths had occurred, was performed before crossover from placebo to radium-223. At the interim analysis, which involved 809 patients, radium-223, as compared with placebo, significantly improved overall survival (median, 14.0 months vs. 11.2 months; hazard ratio, 0.70; 95% confidence interval [CI], 0.55 to 0.88; two-sided P=0.002). The updated analysis involving 921 patients confirmed the radium-223 survival benefit (median, 14.9 months vs. 11.3 months; hazard ratio, 0.70; 95% CI, 0.58 to 0.83; P<0.001). Assessments of all main secondary efficacy end points also showed a benefit of radium-223 as compared with placebo. Radium-223 was associated with low myelosuppression rates and fewer adverse events. In this study, which was terminated for efficacy at the prespecified interim analysis, radium-223 improved overall survival. (Funded by Algeta and Bayer HealthCare Pharmaceuticals; ALSYMPCA ClinicalTrials.gov number, NCT00699751.).

  10. Evaluating the impact of a school-based health intervention using a randomized field experiment.

    PubMed

    Greve, Jane; Heinesen, Eskil

    2015-07-01

    We conduct an econometric evaluation of a health-promoting programme in primary and lower secondary schools in Denmark. The programme includes health-related measurements of the students, communication of knowledge about health, and support of health-promoting projects for students. Half of the schools in the fourth largest municipality in Denmark were randomly selected into a treatment group implementing the programme, while the remainder served as a control group. We estimate both OLS models using only post-intervention observations and difference-in-differences (DID) models also using pre-intervention observations. We estimate effects of the initiative on BMI, waist/height ratio, overweight and obesity for the entire sample and by gender and grade. We find no consistent effect of the programme. When we use the entire sample, no estimates are statistically significant at conventional levels, although the point estimates for the effect on BMI, indicating an average reduction in the range of 0.10-0.15 kg/m², are consistent with the results in a recent Cochrane review evaluating 55 studies of diet and exercise interventions targeting children; and DID estimates which are marginally significant (at the 10% level) indicate that the intervention reduces the risk of obesity by 1 percentage point. Running separate estimations by gender and grade we find a few statistically significant estimates: OLS estimates indicate that the intervention reduces BMI in females in grade 5 by 0.39 kg/m² and reduces the risk of obesity in females in grade 9 by 2.6 percentage points; DID estimates indicate an increase in waist circumference for females in the preschool class of 1.2 cm and an increase in the risk of obesity in grade 9 males of 4 percentage points. However, if we corrected for multiple hypothesis testing, these estimates would be insignificant. There is no statistically significant correlation between participation in the programme and the number of other health-promoting projects at the schools. Copyright © 2015 Elsevier B.V. All rights reserved.
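
    For readers unfamiliar with the DID setup used above, the treatment effect is the coefficient on the treatment-by-period interaction. A toy sketch with statsmodels follows; data values and variable names are hypothetical, not the study's data.

```python
# Toy difference-in-differences regression; the programme effect is the
# coefficient on the treated:post interaction term.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "bmi":     [17.1, 17.4, 18.0, 18.6, 17.0, 17.2, 18.1, 18.2],
    "treated": [0, 0, 1, 1, 0, 0, 1, 1],   # school in the programme group
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],   # observation after the intervention
})
model = smf.ols("bmi ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # the DID estimate
```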

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, R; Zhu, X; Li, S

    Purpose: High Dose Rate (HDR) brachytherapy forward planning is principally an iterative process; hence, plan quality is affected by planners’ experience and limited planning time. This may lead to sporadic errors and inconsistencies in planning. A statistical tool based on previously approved clinical treatment plans would help to maintain consistency of planning quality and improve the efficiency of second checking. Methods: An independent dose calculation tool was developed from commercial software. Thirty-three previously approved cervical HDR plans with the same prescription dose (550 cGy), applicator type, and treatment protocol were examined, and ICRU-defined reference point doses (bladder, vaginal mucosa, rectum, and points A/B) along with dwell times were collected. The dose calculation tool then computed an appropriate range with a 95% confidence interval for each parameter obtained, to be used as the benchmark for evaluating those parameters in future HDR treatment plans. Model quality was verified using five randomly selected approved plans from the same dataset. Results: Dose variations appear to be larger at the bladder and mucosa reference points than at the rectum. Most reference point doses from the verification plans fell within the predicted range, except the doses at two rectum points and two reference position A points (owing to rectal anatomical variations and clinical adjustment of prescription points, respectively). Similar results were obtained for tandem and ring dwell times, despite relatively larger uncertainties. Conclusion: This statistical tool provides insight into the clinically acceptable range of cervical HDR plans, which could be useful in plan checking and identifying potential planning errors, thus improving the consistency of plan quality.
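
    The benchmarking idea can be sketched as follows: estimate a 95% normal range per reference point from prior approved plans and flag out-of-range doses in a new plan. This is a simplified reading of the tool (the exact interval construction is not given in the record), and all dose values below are invented.

```python
# Sketch of reference-point benchmarking from prior approved plans.
import numpy as np
from scipy import stats

def benchmark(prior_doses, confidence=0.95):
    """(low, high) bounds from prior plans, assuming normally distributed doses."""
    m, s = np.mean(prior_doses), np.std(prior_doses, ddof=1)
    z = stats.norm.ppf(0.5 + confidence / 2)
    return m - z * s, m + z * s

prior_bladder = np.random.default_rng(0).normal(420, 35, size=33)  # cGy, made up
low, high = benchmark(prior_bladder)
new_dose = 512.0
if not low <= new_dose <= high:
    print(f"bladder dose {new_dose:.0f} cGy outside [{low:.0f}, {high:.0f}] cGy")
```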

  12. Relationship Between FEV1 and Patient-Reported Outcomes Changes: Results of a Meta-Analysis of Randomized Trials in Stable COPD.

    PubMed

    de la Loge, Christine; Tugaut, Béatrice; Fofana, Fatoumata; Lambert, Jérémy; Hennig, Michael; Tschiesner, Uta; Vahdati-Bolouri, Mitra; Segun Ismaila, Afisi; Suresh Punekar, Yogesh

    2016-03-15

    Background: This meta-analysis assessed the relationship between change from baseline (CFB) in spirometric measurements (trough forced expiratory volume in 1 second [FEV1] and FEV1 area under the curve [AUC]) and patient-reported outcomes (St. George's Respiratory Questionnaire total score [SGRQ] CFB, Transition Dyspnea Index [TDI] and exacerbation rates) after 6-12 months' follow-up, using study treatment-group level data. Methods: A systematic literature search was performed for randomized controlled trials of ≥24 weeks' duration in adults with chronic obstructive pulmonary disease (COPD). Studies reporting ≥1 spirometric measurement and ≥1 patient-reported outcome (PRO) at baseline and at study endpoint were selected. The relationships between PROs and spirometric endpoints were assessed using Pearson correlation coefficients and meta-regression. Results: Fifty-two studies (62,385 patients) were included. Primary weighted analysis conducted at the last assessment showed a large significant negative correlation (r, -0.68 [95% confidence interval (CI); -0.77, -0.57]) between trough FEV1 and SGRQ. Improvement of 100 mL in trough FEV1 corresponded to a 5.9 point reduction in SGRQ. Similarly, a reduction of 4 points on SGRQ corresponded to a 40 mL improvement in trough FEV1 (p < 0.001). The weighted correlation coefficients of trough FEV1 with TDI, exacerbation rate (all) and exacerbation rate (moderate/severe) at the last assessment point were 0.57, -0.69 and -0.57, respectively (all p < 0.05). For the analyses excluding placebo groups, the correlations of FEV1 with SGRQ and TDI were lower but significant. Conclusions: A strong association exists between changes in spirometric measurements and changes in PROs.
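
    A note on why the two conversions quoted above (100 mL corresponding to 5.9 SGRQ points, but 4 SGRQ points corresponding to only 40 mL) are not simple reciprocals: regressing each variable on the other yields slopes whose product is \(r^2\), not 1, so the mapping depends on the direction of the regression. In standard notation, with \(S\) the SGRQ change and \(F\) the trough FEV1 change:

```latex
\[
b_{S\mid F} = r\,\frac{\sigma_S}{\sigma_F}, \qquad
b_{F\mid S} = r\,\frac{\sigma_F}{\sigma_S}, \qquad
b_{S\mid F}\, b_{F\mid S} = r^{2} < 1 .
\]
```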

  13. Relationship Between FEV1 and Patient-Reported Outcomes Changes: Results of a Meta-Analysis of Randomized Trials in Stable COPD

    PubMed Central

    de la Loge, Christine; Tugaut, Béatrice; Fofana, Fatoumata; Lambert, Jérémy; Hennig, Michael; Tschiesner, Uta; Vahdati-Bolouri, Mitra; Segun Ismaila, Afisi; Suresh Punekar, Yogesh

    2016-01-01

    Background: This meta-analysis assessed the relationship between change from baseline (CFB) in spirometric measurements (trough forced expiratory volume in 1 second [FEV1] and FEV1 area under the curve [AUC]) and patient-reported outcomes (St. George’s Respiratory Questionnaire total score [SGRQ] CFB, Transition Dyspnea Index [TDI] and exacerbation rates) after 6-12 months’ follow-up, using study treatment-group level data. Methods: A systematic literature search was performed for randomized controlled trials of ≥24 weeks duration in adults with chronic obstructive pulmonary disease (COPD). Studies reporting ≥1 spirometric measurement and ≥1 patient-reported outcome (PRO) at baseline and at study endpoint were selected. The relationships between PROs and spirometric endpoints were assessed using Pearson correlation coefficient and meta-regression. Results: Fifty-two studies (62,385 patients) were included. Primary weighted analysis conducted at the last assessment showed a large significant negative correlation (r, −0.68 [95% confidence interval (CI); −0.77, −0.57]) between trough FEV1 and SGRQ. Improvement of 100 mL in trough FEV1 corresponded to a 5.9 point reduction in SGRQ. Similarly, a reduction of 4 points on SGRQ corresponded to 40 mL improvement in trough FEV1 (p<0.001). The weighted correlation coefficients of trough FEV1 with TDI, exacerbation rate (all) and exacerbation rate (moderate/severe) at last assessment point were 0.57, -0.69 and -0.57, respectively (all p<0.05). For the analyses excluding placebo groups, the correlations of FEV1 with SGRQ and TDI were lower but significant. Conclusions: A strong association exists between changes in spirometric measurements and changes in PROs. PMID:28848877

  14. A single point acupuncture treatment at large intestine meridian: a randomized controlled trial in acute tonsillitis and pharyngitis.

    PubMed

    Fleckenstein, Johannes; Lill, Christian; Lüdtke, Rainer; Gleditsch, Jochen; Rasp, Gerd; Irnich, Dominik

    2009-09-01

    One out of 4 patients visiting a general practitioner reports a sore throat associated with pain on swallowing. This study was established to examine the immediate pain-alleviating effect of a single point acupuncture treatment applied to the large intestine meridian of patients with sore throat. Sixty patients with acute tonsillitis and pharyngitis were enrolled in this randomized placebo-controlled trial. They received either acupuncture or sham laser acupuncture, directed to the large intestine meridian section between acupuncture points LI 8 and LI 10. The main outcome measure was the change of pain intensity on swallowing a sip of water, evaluated by a visual analog scale 15 minutes after treatment. A credibility assessment regarding the respective treatment was performed. The pain intensity for the acupuncture group before and immediately after therapy was 5.6 ± 2.8 and 3.0 ± 3.0, and for the sham group 5.6 ± 2.5 and 3.8 ± 2.5, respectively. Although the acupuncture group showed a more pronounced improvement, there was no significant difference between groups (Δ = 0.9; confidence interval: −0.2 to 2.0; P = 0.12; analysis of covariance). Patients' satisfaction was high in both treatment groups. The study was prematurely terminated due to a subsequent lack of suitable patients. A single acupuncture treatment applied to a selected area of the large intestine meridian was no more effective in alleviating pain associated with clinical sore throat than sham laser acupuncture applied to the same area. Nonetheless, clinically relevant pain relief was achieved in both groups. Pain alleviation might partly be due to the intense palpation of the large intestine meridian. The benefit of a comprehensive acupuncture treatment protocol in this condition should be subject to further trials.

  15. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.

    PubMed

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-08-27

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.
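
    The final refinement step described above (merging labeled segments that are spatially connected and whose color histograms are highly correlated) can be sketched as a union-find pass; the correlation threshold and data layout are assumptions, not the paper's implementation.

```python
# Sketch of the refinement step: merge spatially connected segments whose
# color histograms are highly correlated, via union-find.
import numpy as np

def hist_correlation(h1, h2):
    """Pearson correlation between two color histograms (1-D arrays)."""
    h1, h2 = h1 - h1.mean(), h2 - h2.mean()
    return float(h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2) + 1e-12))

def merge_segments(histograms, adjacency, threshold=0.9):
    """histograms: {label: array}; adjacency: iterable of (a, b) label pairs
    whose segments touch in space. Returns a label -> merged-label map."""
    parent = {k: k for k in histograms}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for a, b in adjacency:
        if hist_correlation(histograms[a], histograms[b]) > threshold:
            parent[find(a)] = find(b)  # connected + similar: same object
    return {k: find(k) for k in histograms}
```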

  16. Evaluation of the Effectiveness of Chemical Dependency Counseling Course Based on Patrick and Partners

    PubMed Central

    Keshavarz, Yousef; Ghaedi, Sina; Rahimi-Kashani, Mansure

    2012-01-01

    Background The twelve step program is one of the programs that are administered for overcoming abuse of drugs. In this study, the effectiveness of chemical dependency counseling course was investigated using a hybrid model. Methods In a survey with sample size of 243, participants were selected using stratified random sampling method. A questionnaire was used for collecting data and one sample t-test employed for data analysis. Findings Chemical dependency counseling courses was effective from the point of view of graduates, chiefs of rehabilitation centers, rescuers and their families and ultimately managers of rebirth society, but it was not effective from the point of view of professors and lecturers. The last group evaluated the effectiveness of chemical dependency counseling courses only in performance level. Conclusion It seems that the chemical dependency counseling courses had appropriate effectiveness and led to change in attitudes, increase awareness, knowledge and experience combination and ultimately increased the efficiency of counseling. PMID:24494132

  17. Ultrahigh thermoelectric power factor in flexible hybrid inorganic-organic superlattice

    DOE PAGES

    Wan, Chunlei; Tian, Ruoming; Kondou, Mami; ...

    2017-10-18

    Hybrid inorganic–organic superlattice with an electron-transmitting but phonon-blocking structure has emerged as a promising flexible thin film thermoelectric material. However, the substantial challenge in optimizing carrier concentration without disrupting the superlattice structure prevents further improvement of the thermoelectric performance. Here we demonstrate a strategy for carrier optimization in a hybrid inorganic–organic superlattice of TiS2[tetrabutylammonium]x[hexylammonium]y, where the organic layers are composed of a random mixture of tetrabutylammonium and hexylammonium molecules. By vacuum heating the hybrid materials at an intermediate temperature, the hexylammonium molecules with a lower boiling point are selectively de-intercalated, which reduces the electron density due to the requirement of electroneutrality. The tetrabutylammonium molecules with a higher boiling point remain to support and stabilize the superlattice structure. Furthermore, the carrier concentration can thus be effectively reduced, resulting in a remarkably high power factor of 904 µW m−1 K−2 at 300 K for flexible thermoelectrics, approaching the values achieved in conventional inorganic semiconductors.
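
    For context, the power factor quoted above is the standard thermoelectric quantity:

```latex
\[
\mathrm{PF} = S^{2}\sigma \qquad [\mathrm{W\,m^{-1}\,K^{-2}}],
\]
```

    where \(S\) is the Seebeck coefficient (V K⁻¹) and \(\sigma\) the electrical conductivity (S m⁻¹). Lowering the carrier density generally raises \(S\) while lowering \(\sigma\), which is why tuning the carrier concentration, here by selective de-intercalation, can move the material toward an optimum.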

  18. Automated generation of radical species in crystalline carbohydrate using ab initio MD simulations.

    PubMed

    Aalbergsjø, Siv G; Pauwels, Ewald; Van Yperen-De Deyne, Andy; Van Speybroeck, Veronique; Sagstuen, Einar

    2014-08-28

    As the chemical structures of radiation damaged molecules may differ greatly from their undamaged counterparts, investigation and description of radiation damaged structures is commonly biased by the researcher. Radical formation from ionizing radiation in crystalline α-l-rhamnose monohydrate has been investigated using a new method where the selection of radical structures is unbiased by the researcher. The method is based on using ab initio molecular dynamics (MD) studies to investigate how ionization damage can form, change and move. Diversity in the radical production is gained by using different points on the potential energy surface of the intact crystal as starting points for the ionizations and letting the initial velocities of the nuclei after ionization be generated randomly. 160 ab initio MD runs produced 12 unique radical structures for investigation. Out of these, 7 of the potential products have never previously been discussed, and 3 products are found to match with radicals previously observed by electron magnetic resonance experiments.
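
    The "randomly generated initial velocities" step can be illustrated by Maxwell-Boltzmann sampling; the units (amu, angstrom/ps) and the code itself follow standard MD conventions and are not the authors' software.

```python
# Illustration of randomized post-ionization nuclear velocities, drawn from
# a Maxwell-Boltzmann distribution at temperature T.
import numpy as np

KB = 0.831446  # Boltzmann constant in amu * (angstrom/ps)**2 / K

def random_velocities(masses_amu, T=298.0, seed=0):
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(KB * T / np.asarray(masses_amu, dtype=float))
    # One 3-vector per atom; per-axis std depends on the atomic mass.
    return rng.normal(0.0, sigma[:, None], size=(len(masses_amu), 3))

print(random_velocities([12.011, 1.008, 15.999]))  # C, H, O atoms
```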

  19. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-01-01

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications. PMID:26343656

  20. Ultrahigh thermoelectric power factor in flexible hybrid inorganic-organic superlattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Chunlei; Tian, Ruoming; Kondou, Mami

    Hybrid inorganic–organic superlattice with an electron-transmitting but phonon-blocking structure has emerged as a promising flexible thin film thermoelectric material. However, the substantial challenge in optimizing carrier concentration without disrupting the superlattice structure prevents further improvement of the thermoelectric performance. Here we demonstrate a strategy for carrier optimization in a hybrid inorganic–organic superlattice of TiS 2[tetrabutylammonium] x [hexylammonium] y, where the organic layers are composed of a random mixture of tetrabutylammonium and hexylammonium molecules. By vacuum heating the hybrid materials at an intermediate temperature, the hexylammonium molecules with a lower boiling point are selectively de-intercalated, which reduces the electron density duemore » to the requirement of electroneutrality. The tetrabutylammonium molecules with a higher boiling point remain to support and stabilize the superlattice structure. Furthermore, the carrier concentration can thus be effectively reduced, resulting in a remarkably high power factor of 904 µW m –1 K –2 at 300 K for flexible thermoelectrics, approaching the values achieved in conventional inorganic semiconductors.« less

  1. Effect of expanding medicaid for parents on children's health insurance coverage: lessons from the Oregon experiment.

    PubMed

    DeVoe, Jennifer E; Marino, Miguel; Angier, Heather; O'Malley, Jean P; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J; Bailey, Steffani R; Gallia, Charles; Gold, Rachel

    2015-01-01

    In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children. This association persisted even after expansions in child-only public health insurance. Oregon's randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. To estimate the effect on a child's health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. Oregon Experiment randomized natural experiment assessing the results of Oregon's 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent randomly selected to apply for Medicaid on their child's Medicaid or Children's Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children's coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14,409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. Children's Medicaid or CHIP coverage was assessed monthly and in 6-month intervals relative to their parent's selection date. In the immediate period after selection, the number of covered children whose parents were selected to apply increased significantly, from 3,830 (61.4%) to 4,152 (66.6%), compared with a nonsignificant change from 5,049 (61.8%) to 5,044 (61.7%) among children whose parents were not selected to apply. Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after the parent's selection compared with children whose parents were not selected (adjusted odds ratio [AOR]=1.18; 95% CI, 1.10-1.27). The effect remained significant during months 7 to 12 (AOR=1.11; 95% CI, 1.03-1.19); months 13 to 18 showed a positive but not significant effect (AOR=1.07; 95% CI, 0.99-1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR=2.37; 95% CI, 2.14-2.64). Children's odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents' access to Medicaid coverage and their children's coverage.
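
    A minimal sketch of the intent-to-treat GEE described above, with exchangeable within-child correlation; the toy data frame and variable names are hypothetical stand-ins for the study data.

```python
# Shape of the intent-to-treat GEE: repeated monthly coverage indicators
# clustered by child, logistic link, exchangeable working correlation.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "covered":  [1, 1, 0, 1, 0, 0, 1, 0],   # child has Medicaid/CHIP that month
    "selected": [1, 1, 1, 1, 0, 0, 0, 0],   # parent randomly selected to apply
    "month":    [1, 2, 1, 2, 1, 2, 1, 2],
    "child_id": [1, 1, 2, 2, 3, 3, 4, 4],
})
model = smf.gee("covered ~ selected + month", groups="child_id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().params)
```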

  2. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been successfully used in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS with selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection for GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs into two groups. The informative SNP group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from these two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error, while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets demonstrated that the proposed model significantly reduced prediction errors and outperformed most existing state-of-the-art random forests. The top 25 SNPs in the Parkinson data set identified by the proposed model included four interesting genes associated with neurological disorders. The presented approach has been shown to be effective in selecting informative sub-groups of SNPs potentially associated with diseases that traditional statistical approaches might fail to detect. The new RF works well for data where the number of case-control objects is much smaller than the number of SNPs, which is a typical problem in gene data and GWAS. Experimental results demonstrated the effectiveness of the proposed RF model, which outperformed state-of-the-art RFs, including Breiman's RF, GRRF and wsRF.
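
    The two-stage sampling can be sketched as follows: a p-value cut-off separates informative SNPs, the informative group is split into strong and weak subgroups, and each tree's feature subspace is drawn only from those subgroups. The split proportion and the share of strong SNPs per subspace below are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of ts-RF-style quality-based subspace sampling for one tree.
import numpy as np

def ts_subspace(p_values, mtry, alpha=0.05, strong_frac=0.5, seed=0):
    rng = np.random.default_rng(seed)
    p_values = np.asarray(p_values)
    informative = np.where(p_values < alpha)[0]            # stage 1: cut-off
    order = informative[np.argsort(p_values[informative])]
    k = int(np.ceil(strong_frac * len(order)))
    strong, weak = order[:k], order[k:]                    # stage 2: split
    pick = []
    if len(strong):  # guarantee highly informative SNPs in every subspace
        pick += list(rng.choice(strong, size=min(mtry // 2 + 1, len(strong)),
                                replace=False))
    if len(weak):
        pick += list(rng.choice(weak, size=min(mtry - len(pick), len(weak)),
                                replace=False))
    return np.array(pick, dtype=int)
```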

  3. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  4. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  5. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  6. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  7. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
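
    The selection step in the regulation above amounts to drawing one portion uniformly at random; a software stand-in for the random number table (portion labels are hypothetical):

```python
# Select one 100 g portion at random from the prepared portions.
import random

portions = [f"portion_{i:02d}" for i in range(1, 11)]
print(random.choice(portions))
```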

  8. Exploring the parameter space of the coarse-grained UNRES force field by random search: selecting a transferable medium-resolution force field.

    PubMed

    He, Yi; Xiao, Yi; Liwo, Adam; Scheraga, Harold A

    2009-10-01

    We explored the energy-parameter space of our coarse-grained UNRES force field for large-scale ab initio simulations of protein folding, to obtain good initial approximations for hierarchical optimization of the force field with new virtual-bond-angle bending and side-chain-rotamer potentials which we recently introduced to replace the statistical potentials. 100 sets of energy-term weights were generated randomly, and good sets were selected by carrying out replica-exchange molecular dynamics simulations of two peptides with a minimal alpha-helical and a minimal beta-hairpin fold, respectively: the tryptophan cage (PDB code: 1L2Y) and tryptophan zipper (PDB code: 1LE1). Eight sets of parameters produced native-like structures of these two peptides. These eight sets were tested on two larger proteins: the engrailed homeodomain (PDB code: 1ENH) and FBP WW domain (PDB code: 1E0L); two sets were found to produce native-like conformations of these proteins. These two sets were tested further on a larger set of nine proteins with alpha or alpha + beta structure and found to locate native-like structures of most of them. These results demonstrate that, in addition to finding reasonable initial starting points for optimization, an extensive search of parameter space is a powerful method to produce a transferable force field. Copyright 2009 Wiley Periodicals, Inc.
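
    The search strategy reduces to: draw candidate weight vectors at random, score each with a folding simulation, and keep the winners. In the skeleton below, evaluate() is a hypothetical stand-in for the REMD folding test against the two training peptides, and the sampling range is an assumption.

```python
# Skeleton of random search over energy-term weights; evaluate(w) should
# return a score where lower is better (e.g., RMSD to the native structure).
import numpy as np

def random_search(evaluate, n_sets=100, n_terms=10, n_keep=8, seed=0):
    rng = np.random.default_rng(seed)
    candidates = rng.uniform(0.0, 2.0, size=(n_sets, n_terms))  # weight sets
    scores = np.array([evaluate(w) for w in candidates])
    keep = np.argsort(scores)[:n_keep]   # retain the best-performing sets
    return candidates[keep], scores[keep]
```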

  9. The Status, Quality, and Expansion of the NIH Full-Length cDNA Project: The Mammalian Gene Collection (MGC)

    PubMed Central

    2004-01-01

    The National Institutes of Health's Mammalian Gene Collection (MGC) project was designed to generate and sequence a publicly accessible cDNA resource containing a complete open reading frame (ORF) for every human and mouse gene. The project initially used a random strategy to select clones from a large number of cDNA libraries from diverse tissues. Candidate clones were chosen based on 5′-EST sequences, and then fully sequenced to high accuracy and analyzed by algorithms developed for this project. Currently, more than 11,000 human and 10,000 mouse genes are represented in MGC by at least one clone with a full ORF. The random selection approach is now reaching a saturation point, and a transition to protocols targeted at the missing transcripts is now required to complete the mouse and human collections. Comparison of the sequence of the MGC clones to reference genome sequences reveals that most cDNA clones are of very high sequence quality, although it is likely that some cDNAs may carry missense variants as a consequence of experimental artifact, such as PCR, cloning, or reverse transcriptase errors. Recently, a rat cDNA component was added to the project, and ongoing frog (Xenopus) and zebrafish (Danio) cDNA projects were expanded to take advantage of the high-throughput MGC pipeline. PMID:15489334

  10. On the Interplay between the Evolvability and Network Robustness in an Evolutionary Biological Network: A Systems Biology Approach

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2011-01-01

    In the evolutionary process, the random transmission and mutation of genes provide biological diversities for natural selection. In order to preserve functional phenotypes between generations, gene networks need to evolve robustly under the influence of random perturbations. Therefore, the robustness of the phenotype, in the evolutionary process, exerts a selection force on gene networks to maintain network functions. However, gene networks need to adjust, by variations in genetic content, to generate phenotypes for new challenges in the network’s evolution, i.e., the evolvability. Hence, there should be some interplay between the evolvability and network robustness in evolutionary gene networks. In this study, the interplay between the evolvability and network robustness of a gene network and a biochemical network is discussed from a nonlinear stochastic system point of view. It was found that if the genetic robustness plus environmental robustness is less than the network robustness, the phenotype of the biological network is robust in evolution. The tradeoff between the genetic robustness and environmental robustness in evolution is discussed from the stochastic stability robustness and sensitivity of the nonlinear stochastic biological network, which may be relevant to the statistical tradeoff between bias and variance, the so-called bias/variance dilemma. Further, the tradeoff could be considered as an antagonistic pleiotropic action of a gene network and discussed from the systems biology perspective. PMID:22084563
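
    The abstract's central criterion can be written schematically (this is a paraphrase of the stated condition, not the paper's formal derivation):

```latex
\[
R_{\text{genetic}} + R_{\text{environmental}} \;<\; R_{\text{network}}
\;\Longrightarrow\; \text{the network phenotype is robust in evolution,}
\]
```

    so the margin \(R_{\text{network}} - R_{\text{genetic}} - R_{\text{environmental}}\) can loosely be read as the budget the network has left for evolvability.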

  11. Rheumatic Heart Disease in Kerala: A Vanishing Entity? An Echo Doppler Study in 5-15-Years-Old School Children.

    PubMed

    Nair, Bigesh; Viswanathan, Sunitha; Koshy, A George; Gupta, Prabha Nini; Nair, Namita; Thakkar, Ashok

    2015-01-01

    Background. Early detection of subclinical rheumatic heart disease by use of echocardiography warrants timely implementation of secondary antibiotic prophylaxis and thereby prevents or retards its related complications. Objectives. The objective of this epidemiological study was to determine the prevalence of RHD by echocardiography, using World Heart Federation criteria, in randomly selected school children of Trivandrum. Methods. This was a population-based cross-sectional screening study carried out in Trivandrum. A total of 2060 school children, 5-15 years, were randomly selected from five government and two private (aided) schools. All enrolled children were screened for RHD according to standard clinical and WHF criteria of echocardiography. Results. Echocardiographic examinations confirmed RHD in 5 children out of 146 clinically suspected cases. Thus, the clinical prevalence was found to be 2.4 per 1000. According to WHF criteria of echocardiography, 12 children (12/2060) were diagnosed with RHD, corresponding to an echocardiographic prevalence of 5.83 cases per 1000. As per the criteria, 6 children were diagnosed with definite RHD and 6 with borderline RHD. Conclusions. The results of the current study demonstrate that echocardiography is more sensitive and feasible in detecting clinically silent RHD. Our study, the largest school survey of south India to date, points towards a declining prevalence of RHD (5.83 per 1000) using WHF criteria in Kerala.

  12. Statistical 3D shape analysis of gender differences in lateral ventricles

    NASA Astrophysics Data System (ADS)

    He, Qing; Karpman, Dmitriy; Duan, Ye

    2010-03-01

    This paper aims at analyzing gender differences in the 3D shapes of lateral ventricles, which will provide reference for the analysis of brain abnormalities related to neurological disorders. Previous studies mostly focused on volume analysis, and the main challenge in shape analysis is the required step of establishing shape correspondence among individual shapes. We developed a simple and efficient method based on anatomical landmarks. 14 females and 10 males with matching ages participated in this study. 3D ventricle models were segmented from MR images by a semiautomatic method. Six anatomically meaningful landmarks were identified by detecting the maximum curvature point in a small neighborhood of a manually clicked point on the 3D model. Thin-plate spline was used to transform a randomly selected template shape to each of the remaining shape instances, and the point correspondence was established according to Euclidean distance and surface normal. All shapes were spatially aligned by Generalized Procrustes Analysis. The Hotelling T² two-sample metric was used to compare the ventricle shapes between males and females, and False Discovery Rate estimation was used to correct for multiple comparisons. The results revealed significant differences in the anterior horn of the right ventricle.
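
    The per-vertex group comparison uses the two-sample Hotelling T² statistic; a minimal NumPy version over corresponding 3D points follows (inputs are (n_subjects, 3) arrays per group; the F-conversion and the FDR correction used in the paper are omitted).

```python
# Two-sample Hotelling T^2 on corresponding 3D points, one vertex at a time.
import numpy as np

def hotelling_t2(a, b):
    na, nb = len(a), len(b)
    diff = a.mean(axis=0) - b.mean(axis=0)
    S = ((na - 1) * np.cov(a, rowvar=False) +
         (nb - 1) * np.cov(b, rowvar=False)) / (na + nb - 2)  # pooled covariance
    return float(na * nb / (na + nb) * diff @ np.linalg.solve(S, diff))

rng = np.random.default_rng(0)
print(hotelling_t2(rng.normal(0.0, 1.0, (14, 3)),   # e.g., 14 female subjects
                   rng.normal(0.5, 1.0, (10, 3))))  # e.g., 10 male subjects
```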

  13. Internet Addiction in High School Students in Turkey and Multivariate Analyses of the Underlying Factors.

    PubMed

    Kilic, Mahmut; Avci, Dilek; Uzuncakmak, Tugba

    2016-01-01

    The aim of this study is to examine Internet addiction among adolescents in relation to their sociodemographic characteristics, communication skills, and perceived familial social support. This cross-sectional study was conducted in high schools in several city centers in Turkey in 2013. Cluster sampling was used: in each school, a class for each grade level was randomly selected, and all the students in the selected classes were included in the sample. One thousand seven hundred forty-two students aged between 14 and 20 years were included in the sample. The mean Internet Addiction Scale (IAS) score of the students was found to be 27.9 ± 21.2. According to the scores obtained from the IAS, 81.8% of the students were found to display no symptoms (<50 points), 16.9% were found to display borderline symptoms (50-79 points), and 1.3% were found to be Internet addicts (≥80 points). According to the results of the binary logistic regression, male students and the students in single-sex vocational schools were found to report higher levels of borderline Internet addiction. It was also observed that the IAS score increases when the father's educational level increases and when the students' school performance is worse. On the other hand, the IAS score decreases when the student grade level, perceived family social support, and communication skills scores increase. The risk factors for Internet addiction are being male, low academic achievement, inadequate social support and communication skills, and the father's high educational level.
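
    The IAS banding reported above is a direct threshold rule; encoded for clarity (the cut-offs are the ones quoted in the abstract):

```python
# Direct encoding of the IAS bands: <50, 50-79, >=80.
def ias_category(score):
    if score >= 80:
        return "Internet addict"
    if score >= 50:
        return "borderline symptoms"
    return "no symptoms"

print([ias_category(s) for s in (12, 55, 84)])
```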

  14. Combined impact of negative lifestyle factors on cardiovascular risk in children: a randomized prospective study.

    PubMed

    Meyer, Ursina; Schindler, Christian; Bloesch, Tamara; Schmocker, Eliane; Zahner, Lukas; Puder, Jardena J; Kriemler, Susi

    2014-12-01

    Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. Of the 540 randomly selected 6- to 13-year-old children, 502 children participated in a baseline health assessment, and 64% were assessed again after 4 years. Measures included anthropometry, fasting blood samples, and a health assessment questionnaire. Participants scored one point for each negative lifestyle factor at baseline: overweight; physical inactivity; high media consumption; little outdoor time; skipping breakfast; and having a parent who has ever smoked, is inactive, or overweight. A CVR score at follow-up was constructed by averaging sex- and age-related z-scores of waist circumference, blood pressure, glucose, inverted high-density lipoprotein, and triglycerides. The age-, sex-, pubertal stage-, and social class-adjusted probabilities (95% confidence interval) for being in the highest CVR score tertile at follow-up for children who had at most one (n = 48), two (n = 64), three (n = 56), four (n = 41), or five or more (n = 14) risky lifestyle factors were 15.4% (8.9-25.3), 24.3% (17.4-32.8), 36.0% (28.6-44.2), 49.8% (38.6-61.0), and 63.5% (47.2-77.2), respectively. Even in childhood, an accumulation of negative lifestyle factors is associated with higher CVR scores after 4 years. These negative lifestyle factors are easy to assess in clinical practice and allow early detection and prevention of CVR in childhood. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
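
    A simplified version of the follow-up CVR score: the mean of standardized component scores, with HDL inverted so that higher always means worse. The paper standardizes within sex and age strata; the pooled z-scores below are a simplifying assumption.

```python
# Composite CVR score as the mean of component z-scores (HDL inverted).
import numpy as np

def cvr_score(waist, blood_pressure, glucose, hdl, triglycerides):
    z = lambda x: (np.asarray(x) - np.mean(x)) / np.std(x, ddof=1)
    parts = [z(waist), z(blood_pressure), z(glucose), -z(hdl), z(triglycerides)]
    return np.mean(parts, axis=0)  # one score per child
```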

  15. Rationale and design of the Patient Related OuTcomes with Endeavor versus Cypher stenting Trial (PROTECT): randomized controlled trial comparing the incidence of stent thrombosis and clinical events after sirolimus or zotarolimus drug-eluting stent implantation.

    PubMed

    Camenzind, Edoardo; Wijns, William; Mauri, Laura; Boersma, Eric; Parikh, Keyur; Kurowski, Volkhard; Gao, Runlin; Bode, Christoph; Greenwood, John P; Gershlick, Anthony; O'Neill, William; Serruys, Patrick W; Jorissen, Brenda; Steg, P Gabriel

    2009-12-01

    Drug-eluting stents (DES) reduce restenosis rates compared to bare-metal stents. Most trials using DES enrolled selected patient and lesion subtypes, and primary endpoint focused on angiographic metrics or relatively short-term outcomes. When DES are used in broader types of lesions and patients, important differences may emerge in long-term outcomes between stent types, particularly the incidence of late stent thrombosis. PROTECT is a randomized, open-label trial comparing the long-term safety of the zotarolimus-eluting stent and the sirolimus-eluting stent. The trial has enrolled 8,800 patients representative of those seen in routine clinical practice, undergoing elective, unplanned, or emergency procedures in native coronary arteries in 196 centers in 36 countries. Indications for the procedure and selection of target vessel and lesion characteristics were at the operator's discretion. Procedures could be staged, but no more than 4 target lesions could be treated per patient. Duration of dual antiplatelet therapy was prespecified to achieve similar lengths of treatment in both study arms. The shortest predefined duration was 3 months, as per the manufacturer's instructions. The primary outcome measure is the composite rate of definite and probable stent thrombosis at 3 years, centrally adjudicated using Academic Research Consortium definitions. The main secondary end points are 3-year all-cause mortality, cardiac death, large nonfatal myocardial infarction, and all myocardial infarctions. This large, international, randomized, controlled trial will provide important information on comparative rates of stent thrombosis between 2 different DES systems and safety as assessed by patient-relevant long-term clinical outcomes.

  16. A randomized, controlled trial of the family check-up model in public secondary schools: Examining links between parent engagement and substance use progressions from early adolescence to adulthood.

    PubMed

    Véronneau, Marie-Hélène; Dishion, Thomas J; Connell, Arin M; Kavanagh, Kathryn

    2016-06-01

    Substance use in adulthood compromises work, relationships, and health. Prevention strategies in early adolescence are designed to reduce substance use and progressions to problematic use by adulthood. This report examines the long-term effects of offering the Family Check-up (FCU) at multiple time points in secondary education on the progression of substance use from age 11 to 23 years. Participants (N = 998; 472 females) were randomly assigned to intervention or control conditions in Grade 6 and offered a multilevel intervention that included a classroom-based intervention (universal), the FCU (selected), and tailored family management treatment (indicated). Among intervention families, 23% engaged in the selected and indicated levels during middle school. Intention-to-treat analyses revealed that randomization to the FCU was associated with reduced growth in marijuana use (p < .05), but not alcohol or tobacco use. We also examined whether engagement in the voluntary FCU services moderated the effect of the intervention on substance use progressions using complier average causal effect (CACE) modeling, and found that engagement in the FCU services predicted reductions in alcohol, tobacco, and marijuana use by age 23. Comparing engagers with nonengagers: 70% versus 95% showed signs of alcohol abuse or dependence, 28% versus 61% showed signs of tobacco dependence, and 59% versus 84% showed signs of marijuana abuse or dependence. Family interventions that are embedded within public school systems can reach high-risk students and families and prevent progressions from exploration to problematic substance use through early adulthood. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    NASA Astrophysics Data System (ADS)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  18. CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests.

    PubMed

    Ma, Li; Fan, Suohai

    2017-03-14

    The random forests algorithm is a type of classifier with prominent universality, a wide application range, and robustness against overfitting, but it still has some drawbacks. To improve the performance of random forests, this paper therefore addresses imbalanced-data processing, feature selection, and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced-data classification problem. Experiments on imbalanced UCI data reveal that combining Clustering Using Representatives (CURE) with the original synthetic minority oversampling technique (SMOTE) is effective compared with classification results on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm is proposed for feature selection and parameter optimization, using the minimum out-of-bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms (hybrid genetic-random forests, hybrid particle swarm-random forests, and hybrid fish swarm-random forests) achieve the minimum OOB error and show the best generalization ability. The training set produced by the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise, so this feasible and effective algorithm produces better classification results. Moreover, the hybrid algorithms' F-value, G-mean, AUC, and OOB scores surpass those of the original RF algorithm. Hence, the hybrid algorithm provides a new way to perform feature selection and parameter optimization.
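
    A minimal sketch of the two ingredients the abstract combines is given below: SMOTE-style interpolation between minority samples and their nearest neighbours, and a random forest tuned by its out-of-bag (OOB) error. The CURE clustering stage and the evolutionary parameter searches of the paper are not reproduced here; all data and parameter values are illustrative.

    ```python
    # Minimal sketch: SMOTE-style oversampling plus OOB-error-guided
    # random forest tuning. Illustrative only; the paper's CURE-based
    # noise-removal step is not reproduced.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)

    def smote_like(X_min, n_new, k=5):
        """Interpolate between minority samples and their k nearest neighbours."""
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
        _, idx = nn.kneighbors(X_min)            # idx[:, 0] is the point itself
        new = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))
            j = idx[i, rng.integers(1, k + 1)]   # a random true neighbour
            lam = rng.random()
            new.append(X_min[i] + lam * (X_min[j] - X_min[i]))
        return np.asarray(new)

    # Imbalanced toy data (roughly 10% minority class).
    X, y = make_classification(n_samples=1000, weights=[0.9], random_state=0)
    X_new = smote_like(X[y == 1], n_new=(y == 0).sum() - (y == 1).sum())
    X_bal = np.vstack([X, X_new])
    y_bal = np.concatenate([y, np.ones(len(X_new), dtype=int)])

    # Use the out-of-bag error as the objective for a simple parameter sweep.
    best = min(
        ((n, 1.0 - RandomForestClassifier(n_estimators=n, oob_score=True,
                                          random_state=0).fit(X_bal, y_bal).oob_score_)
         for n in (50, 100, 200)),
        key=lambda t: t[1],
    )
    print("best n_estimators:", best[0], "OOB error:", round(best[1], 4))
    ```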

  19. Evaluating Gaze-Based Interface Tools to Facilitate Point-and-Select Tasks with Small Targets

    ERIC Educational Resources Information Center

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. less than 12 x 12 pixels) point-and-select tasks. We conducted two…

  20. Vedolizumab as induction and maintenance therapy for ulcerative colitis.

    PubMed

    Feagan, Brian G; Rutgeerts, Paul; Sands, Bruce E; Hanauer, Stephen; Colombel, Jean-Frédéric; Sandborn, William J; Van Assche, Gert; Axler, Jeffrey; Kim, Hyo-Jong; Danese, Silvio; Fox, Irving; Milch, Catherine; Sankoh, Serap; Wyant, Tim; Xu, Jing; Parikh, Asit

    2013-08-22

    Gut-selective blockade of lymphocyte trafficking by vedolizumab may constitute effective treatment for ulcerative colitis. We conducted two integrated randomized, double-blind, placebo-controlled trials of vedolizumab in patients with active disease. In the trial of induction therapy, 374 patients (cohort 1) received vedolizumab (at a dose of 300 mg) or placebo intravenously at weeks 0 and 2, and 521 patients (cohort 2) received open-label vedolizumab at weeks 0 and 2, with disease evaluation at week 6. In the trial of maintenance therapy, patients in either cohort who had a response to vedolizumab at week 6 were randomly assigned to continue receiving vedolizumab every 8 or 4 weeks or to switch to placebo for up to 52 weeks. A response was defined as a reduction in the Mayo Clinic score (range, 0 to 12, with higher scores indicating more active disease) of at least 3 points and a decrease of at least 30% from baseline, with an accompanying decrease in the rectal bleeding subscore of at least 1 point or an absolute rectal bleeding subscore of 0 or 1. Response rates at week 6 were 47.1% and 25.5% among patients in the vedolizumab group and placebo group, respectively (difference with adjustment for stratification factors, 21.7 percentage points; 95% confidence interval [CI], 11.6 to 31.7; P<0.001). At week 52, 41.8% of patients who continued to receive vedolizumab every 8 weeks and 44.8% of patients who continued to receive vedolizumab every 4 weeks were in clinical remission (Mayo Clinic score ≤2 and no subscore >1), as compared with 15.9% of patients who switched to placebo (adjusted difference, 26.1 percentage points for vedolizumab every 8 weeks vs. placebo [95% CI, 14.9 to 37.2; P<0.001] and 29.1 percentage points for vedolizumab every 4 weeks vs. placebo [95% CI, 17.9 to 40.4; P<0.001]). The frequency of adverse events was similar in the vedolizumab and placebo groups. Vedolizumab was more effective than placebo as induction and maintenance therapy for ulcerative colitis. (Funded by Millennium Pharmaceuticals; GEMINI 1 ClinicalTrials.gov number, NCT00783718.).

  1. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    PubMed

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, thereby utilising more information and reducing the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
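
    For orientation, the sketch below implements the standard univariate DerSimonian-Laird random-effects estimator on illustrative effect sizes. The paper's models generalize this machinery to several correlated effects per study (multiple cut-points or measurement methods), which the sketch does not attempt.

    ```python
    # Minimal univariate DerSimonian-Laird random-effects meta-analysis.
    # The multivariate models of the paper additionally share information
    # across cut-points via within/between-study correlations.
    import numpy as np

    def dersimonian_laird(effects, variances):
        y = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v                                  # fixed-effect weights
        mu_fe = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fe) ** 2)             # Cochran's Q
        df = len(y) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - df) / c)                # between-study variance
        w_re = 1.0 / (v + tau2)
        mu_re = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return mu_re, se, tau2

    # Illustrative log hazard ratios and their variances from 4 studies.
    mu, se, tau2 = dersimonian_laird([0.42, 0.30, 0.55, 0.18],
                                     [0.02, 0.05, 0.03, 0.04])
    print(f"pooled effect {mu:.3f} "
          f"(95% CI {mu - 1.96*se:.3f} to {mu + 1.96*se:.3f}), tau^2 = {tau2:.3f}")
    ```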

  2. Dose-response of an extrafine dry powder inhaler formulation of glycopyrronium bromide: randomized, double-blind, placebo-controlled, dose-ranging study (GlycoNEXT).

    PubMed

    Beeh, Kai M; Emirova, Aida; Prunier, Hélène; Santoro, Debora; Nandeuil, Marie Anna

    2018-01-01

    An extrafine formulation of the long-acting muscarinic antagonist glycopyrronium bromide (GB) has been developed for delivery via the NEXThaler dry powder inhaler (DPI). This study assessed the bronchodilator efficacy and safety of different doses of this formulation in patients with COPD to identify the optimal dose for further development. This was a multicenter, randomized, double-blind, placebo-controlled, incomplete block, three-way crossover study, including three 28-day treatment periods, each separated by a 21-day washout period. Eligible patients had a diagnosis of COPD and post-bronchodilator forced expiratory volume in 1 s (FEV1) 40%-70% predicted. Treatments administered were GB 6.25, 12.5, 25 and 50 μg or matched placebo; all were given twice daily (BID) via DPI, with spirometry assessed on Days 1 and 28 of each treatment period. The primary end point was FEV1 area under the curve from 0 to 12 h (AUC0-12h) on Day 28. A total of 202 patients were randomized (61% male, mean age 62.6 years), with 178 (88%) completing all three treatment periods. For the primary end point, all four GB doses were superior to placebo (p < 0.001), with mean differences (95% CI) of 114 (74, 154), 125 (85, 166), 143 (104, 183) and 187 (147, 228) mL for GB 6.25, 12.5, 25 and 50 μg BID, respectively. All four GB doses were also statistically superior to placebo for all secondary efficacy end points, showing clear dose-response relationships for most of the end points. Accordingly, GB 25 μg BID met the criteria for the minimally acceptable dose. Adverse events were reported by 15.5, 16.2, 10.9 and 14.3% of patients receiving GB 6.25, 12.5, 25 and 50 μg BID, respectively, and 14.8% receiving placebo. This study supports the selection of GB 25 μg BID as the minimal effective dose for patients with COPD when delivered with this extrafine DPI formulation.
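
    As an aside on the primary end point, a time-averaged FEV1 over 0-12 h is commonly obtained by trapezoidal integration of serial spirometry values, normalized by the time span. The sketch below shows that arithmetic on made-up time points and values, not trial data.

    ```python
    # Sketch: time-averaged FEV1 over 0-12 h by trapezoidal integration of
    # serial spirometry. Time points and values are illustrative only.
    import numpy as np

    t = np.array([0.0, 0.5, 1, 2, 4, 8, 12])                      # hours post-dose
    fev1 = np.array([1.40, 1.55, 1.60, 1.58, 1.52, 1.47, 1.43])   # litres

    auc = np.sum((fev1[1:] + fev1[:-1]) / 2 * np.diff(t))         # litre-hours
    mean_fev1 = auc / (t[-1] - t[0])                              # time-averaged FEV1
    print(f"AUC0-12h = {auc:.2f} L*h, time-averaged FEV1 = {1000 * mean_fev1:.0f} mL")
    ```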

  3. Is the internal connection more efficient than external connection in mechanical, biological, and esthetical point of views? A systematic review.

    PubMed

    Goiato, Marcelo Coelho; Pellizzer, Eduardo Piza; da Silva, Emily Vivianne Freitas; Bonatto, Liliane da Rocha; dos Santos, Daniela Micheline

    2015-09-01

    This systematic review aimed to evaluate whether the internal connection is more efficient than the external connection, and to identify the associated influencing factors. A specific question was formulated according to the Population, Intervention, Control, and Outcome (PICO) framework: Is the internal connection more efficient than the external connection from mechanical, biological, and esthetic points of view? An electronic search of the MEDLINE and Web of Knowledge databases was performed by two independent reviewers for relevant studies published in English up to November 2013. The keywords used in the search included a combination of "dental implant" and "internal connection" or "Morse connection" or "external connection." Selected studies were randomized clinical trials, prospective or retrospective studies, and in vitro studies with a clear aim of investigating internal and/or external implant connection use. From an initial screening yield of 674 articles, 64 potentially relevant articles were selected after an evaluation of their titles and abstracts. Full texts of these articles were obtained, with 29 articles fulfilling the inclusion criteria. The Morse taper connection has the best sealing ability. Concerning crestal bone loss, internal connections presented better results than external connections. The limitation of the present study was the absence of randomized clinical trials investigating whether the internal connection is more efficient than the external connection. The external and internal connections have different mechanical, biological, and esthetic characteristics. Although all systems show acceptable success rates and effectiveness, crestal bone levels are better maintained around internal connections than around external connections. The Morse taper connection seems to be more efficient concerning biological aspects, allowing lower bacterial leakage and bone loss in single implants, including in esthetic regions. Additionally, this connection type can be successfully indicated for fixed partial prostheses and overdenture planning, since it exhibits high mechanical stability.

  4. Over- and undersupply in home care: a representative multicenter correlational study.

    PubMed

    Lahmann, Nils A; Suhr, Ralf; Kuntz, Simone; Kottner, Jan

    2015-04-01

    Quality assurance and funding of care are becoming major challenges against the background of demographic changes in western societies. The primary aim of the study was to identify possible misclassification, that is, over- and undersupply of care, by comparing the Barthel Index of clients of home care services with their level of care (Stage 0, I, II, III) according to the statutory German long-term care insurance. In 2012, a multicenter point prevalence study of 878 randomly selected clients of 100 randomly selected home care services across Germany was conducted. According to a standardized study protocol, demographics, the Barthel Index, and the nurses' professional judgment of whether a client requires more nursing care were assessed. Associations between the Barthel items and professional judgment were analyzed using univariate (chi-square) and multivariate (logistic regression and classification-regression-tree models) statistics. In each level of care, the Barthel Index showed large variability, e.g., ranging from 0 to 100 points in level II. Multivariate logistic regression regarding possible under- and oversupply revealed occasional fecal incontinence (odds ratio 2.1; 95% CI 1.2-3.7), occasional urinary incontinence (2.0; 95% CI 1.1-3.6), feeding (1.7; 95% CI 1.0-2.9), immobility (0.2; 95% CI 0.1-0.6), and female sex (1.8; 95% CI 1.2-2.6) to be statistically significantly associated. The variability of the Barthel Index within each level of care found in this study indicates substantial misclassification of home care clients relative to their actual need of care. Professional caregivers identified occasional incontinence, help with eating and drinking, and mobility (especially in female clients) as areas of possible under- and oversupply of care. The statutory German long-term care insurance classification should be modified according to these findings to increase the quality of care for home care clients.

  5. Adaptive Localization of Focus Point Regions via Random Patch Probabilistic Density from Whole-Slide, Ki-67-Stained Brain Tumor Tissue

    PubMed Central

    Alomari, Yazan M.; MdZin, Reena Rahayu

    2015-01-01

    Analysis of whole-slide tissue for digital pathology images has been clinically approved to provide a second opinion to pathologists. Localization of focus points from Ki-67-stained histopathology whole-slide tissue microscopic images is considered the first step in the process of proliferation rate estimation. Pathologists use eye pooling or eagle-view techniques to localize the highly stained, cell-concentrated regions from the whole slide under the microscope; these are called focus-point regions. This procedure leads to high interobserver variability, is time-consuming and tedious, and can cause inaccurate findings. The localization of focus-point regions can be addressed as a clustering problem. This paper aims to automate the localization of focus-point regions from whole-slide images using the random patch probabilistic density (RPPD) method. Unlike other clustering methods, the RPPD method can adaptively localize focus-point regions without predetermining the number of clusters. The proposed method was compared with the k-means and fuzzy c-means clustering methods and achieved good performance when the results were evaluated by three expert pathologists, with an average false-positive rate of 0.84% for the focus-point region localization error. Moreover, when RPPD was used to localize tissue from whole-slide images, 228 whole-slide images were tested and 97.3% localization accuracy was achieved. PMID:25793010
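
    The sketch below is a toy version of the patch-based idea only, not the authors' RPPD algorithm: random patches are scored by mean stain intensity on a synthetic image, and patches in the upper tail of the score distribution are kept as focus-point candidates. The patch size, sample count, and threshold are arbitrary.

    ```python
    # Toy sketch of patch-based localization on a synthetic stain map.
    # Not the authors' exact RPPD method; illustrates the idea only.
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "stained slide": dim background plus one densely stained region.
    img = rng.random((512, 512)) * 0.3
    img[150:250, 300:400] += 0.6

    P = 32                                            # patch size (pixels)
    centers = rng.integers(P // 2, 512 - P // 2, size=(500, 2))
    scores = np.array([img[r - P//2:r + P//2, c - P//2:c + P//2].mean()
                       for r, c in centers])

    # Patches whose mean stain intensity falls in the upper tail of the
    # score distribution become focus-point candidates.
    thr = np.percentile(scores, 90)
    focus = centers[scores > thr]
    print(f"{len(focus)} candidate patches; "
          f"mean candidate center: {focus.mean(axis=0).round(1)}")
    ```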

  6. Evaluation of uncertainty in the adjustment of fundamental constants

    NASA Astrophysics Data System (ADS)

    Bodnar, Olha; Elster, Clemens; Fischer, Joachim; Possolo, Antonio; Toman, Blaza

    2016-02-01

    Combining multiple measurement results for the same quantity is an important task in metrology and in many other areas. Examples include the determination of fundamental constants, the calculation of reference values in interlaboratory comparisons, or the meta-analysis of clinical studies. However, neither the GUM nor its supplements give any guidance for this task. Various approaches are applied such as weighted least-squares in conjunction with the Birge ratio or random effects models. While the former approach, which is based on a location-scale model, is particularly popular in metrology, the latter represents a standard tool used in statistics for meta-analysis. We investigate the reliability and robustness of the location-scale model and the random effects model with particular focus on resulting coverage or credible intervals. The interval estimates are obtained by adopting a Bayesian point of view in conjunction with a non-informative prior that is determined by a currently favored principle for selecting non-informative priors. Both approaches are compared by applying them to simulated data as well as to data for the Planck constant and the Newtonian constant of gravitation. Our results suggest that the proposed Bayesian inference based on the random effects model is more reliable and less sensitive to model misspecifications than the approach based on the location-scale model.

  7. Learning From Past Failures of Oral Insulin Trials.

    PubMed

    Michels, Aaron W; Gottlieb, Peter A

    2018-07-01

    Very recently one of the largest type 1 diabetes prevention trials using daily administration of oral insulin or placebo was completed. After 9 years of study enrollment and follow-up, the randomized controlled trial failed to delay the onset of clinical type 1 diabetes, which was the primary end point. The unfortunate outcome follows the previous large-scale trial, the Diabetes Prevention Trial-Type 1 (DPT-1), which again failed to delay diabetes onset with oral insulin or low-dose subcutaneous insulin injections in a randomized controlled trial with relatives at risk for type 1 diabetes. These sobering results raise the important question, "Where does the type 1 diabetes prevention field move next?" In this Perspective, we advocate for a paradigm shift in which smaller mechanistic trials are conducted to define immune mechanisms and potentially identify treatment responders. The stage is set for these interventions in individuals at risk for type 1 diabetes as Type 1 Diabetes TrialNet has identified thousands of relatives with islet autoantibodies and general population screening for type 1 diabetes risk is under way. Mechanistic trials will allow for better trial design and patient selection based upon molecular markers prior to large randomized controlled trials, moving toward a personalized medicine approach for the prevention of type 1 diabetes. © 2018 by the American Diabetes Association.

  8. THE SELECTION OF A NATIONAL RANDOM SAMPLE OF TEACHERS FOR EXPERIMENTAL CURRICULUM EVALUATION.

    ERIC Educational Resources Information Center

    WELCH, WAYNE W.; AND OTHERS

    MEMBERS OF THE EVALUATION SECTION OF HARVARD PROJECT PHYSICS, DESCRIBING WHAT IS SAID TO BE THE FIRST ATTEMPT TO SELECT A NATIONAL RANDOM SAMPLE OF (HIGH SCHOOL PHYSICS) TEACHERS, LIST THE STEPS AS (1) PURCHASE OF A LIST OF PHYSICS TEACHERS FROM THE NATIONAL SCIENCE TEACHERS ASSOCIATION (MOST COMPLETE AVAILABLE), (2) SELECTION OF 136 NAMES BY A…

  9. Unbiased feature selection in learning random forests for high-dimensional data.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This gives RFs poor accuracy when working with high-dimensional data. Besides that, RFs are biased in the feature selection process, favoring multivalued features. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature-weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in both accuracy and AUC.
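
    The screening stage of such a pipeline can be sketched as follows: univariate p-values flag uninformative features, which are dropped before the forest is grown. The weighted feature-sampling stage of xRF is not reproduced; data and thresholds are illustrative.

    ```python
    # Sketch of the screening stage: drop features whose univariate
    # association p-value is large, then fit a random forest on the rest.
    # xRF additionally weights and partitions the retained features.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import f_classif
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, n_features=500,
                               n_informative=10, random_state=0)

    _, pvals = f_classif(X, y)                  # univariate p-value per feature
    keep = pvals < 0.05                         # screen out uninformative ones
    print(f"kept {keep.sum()} of {X.shape[1]} features")

    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc_all = cross_val_score(rf, X, y, cv=5).mean()
    acc_screened = cross_val_score(rf, X[:, keep], y, cv=5).mean()
    print(f"accuracy: all features {acc_all:.3f}, screened {acc_screened:.3f}")
    ```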

  10. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^{-ψ(k)}, where k ≡ S/ln(L/L0), the large-deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  11. A stochastic model for stationary dynamics of prices in real estate markets. A case of random intensity for Poisson moments of prices changes

    NASA Astrophysics Data System (ADS)

    Rusakov, Oleg; Laskin, Michael

    2017-06-01

    We consider a stochastic model of price changes in real estate markets. We suppose that in a book of prices the changes happen at the jump points of a Poisson process with random intensity, i.e., the moments of change follow a random process of the Cox process type. We calculate cumulative mathematical expectations and variances for the random intensity of this point process. In the case where the random intensity process is a martingale, the cumulative variance grows linearly. We statistically process a number of observations of real estate prices and accept the hypothesis of linear growth for the estimates of both the cumulative average and the cumulative variance, for both input and output prices recorded in the book of prices.
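
    The sketch below simulates the basic object, a Cox process, i.e. a Poisson process whose intensity is itself random. For simplicity the intensity here is a time-constant gamma draw per path (an assumption made for illustration, not taken from the paper), and the script prints how the across-path variance of the counts grows with time.

    ```python
    # Sketch: a Cox process (Poisson process with random intensity).
    # Each path draws its own intensity; the across-path variance of N(t)
    # then grows with t. For a time-constant random intensity Lambda, the
    # law of total variance gives Var N(t) = E[Lambda]*t + Var[Lambda]*t^2.
    import numpy as np

    rng = np.random.default_rng(2)
    n_paths = 20000

    # Random intensity: gamma-distributed, mean 2 events per unit time.
    lam = rng.gamma(shape=4.0, scale=0.5, size=n_paths)

    for t in (2.5, 5.0, 10.0):
        counts = rng.poisson(lam * t)            # N(t) | Lambda ~ Poisson(Lambda*t)
        print(f"t={t:4.1f}  mean={counts.mean():6.2f}  var={counts.var():7.2f}")
    ```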

  12. Inflation with a graceful exit in a random landscape

    NASA Astrophysics Data System (ADS)

    Pedro, F. G.; Westphal, A.

    2017-03-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  13. Sensitivity of landscape resistance estimates based on point selection functions to scale and behavioral state: Pumas as a case study

    Treesearch

    Katherine A. Zeller; Kevin McGarigal; Paul Beier; Samuel A. Cushman; T. Winston Vickers; Walter M. Boyce

    2014-01-01

    Estimating landscape resistance to animal movement is the foundation for connectivity modeling, and resource selection functions based on point data are commonly used to empirically estimate resistance. In this study, we used GPS data points acquired at 5-min intervals from radiocollared pumas in southern California to model context-dependent point selection...

  14. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    NASA Astrophysics Data System (ADS)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  15. The Motor Subsystem as a Predictor of Success in Young Football Talents: A Person-Oriented Study

    PubMed Central

    Zibung, Marc; Zuber, Claudia; Conzelmann, Achim

    2016-01-01

    Motor tests play a key role in talent selection in football. However, individual motor tests only focus on specific areas of a player's complex performance. To evaluate his or her overall performance during a game, the current study takes a holistic perspective and uses a person-oriented approach. In this approach, several factors are viewed together as a system, whose state is analysed longitudinally. Based on this idea, six motor tests were aggregated to form the Motor Function subsystem. 104 young, top-level, male football talents were tested three times (2011, 2012, 2013; mean age at the 2011 measurement = 12.26 years, SD = 0.29), and their overall level of performance was determined one year later (2014). The data were analysed using the LICUR method, a pattern-analytical procedure for person-oriented approaches. At all three measuring points, four patterns could be identified, which remained stable over time. One of the patterns found at the third measuring point identified more subsequently successful players than random selection would. This pattern is characterised by above-average, but not necessarily the best, performance on the tests. Developmental paths along structurally stable patterns that occur more often than predicted by chance indicate that the Motor Function subsystem is a viable means of forecasting in the age range of 12–15 years. Above-average, though not necessarily outstanding, performance on both fitness and technical tests appears to be particularly promising. These findings underscore the view that a holistic perspective may be profitable in talent selection. PMID:27508929

  16. Can Ashi points stimulation have specific effects on shoulder pain? A systematic review of randomized controlled trials.

    PubMed

    Wang, Kang-Feng; Zhang, Li-Juan; Lu, Feng; Lu, Yong-Hui; Yang, Chuan-Hua

    2016-06-01

    To provide an evidence-based overview regarding the efficacy of Ashi points stimulation for the treatment of shoulder pain. A comprehensive search [PubMed, Chinese Biomedical Literature Database, China National Knowledge Infrastructure (CNKI), Chongqing Weipu Database for Chinese Technical Periodicals (VIP) and Wanfang Database] was conducted to identify randomized or quasi-randomized controlled trials that evaluated the effectiveness of Ashi points stimulation for shoulder pain compared with conventional treatment. The methodological quality of the included studies was assessed using the Cochrane risk of bias tool. RevMan 5.0 was used for data synthesis. Nine trials were included. Seven studies assessed the effectiveness of Ashi points stimulation on response rate compared with conventional acupuncture. Their results suggested a significant effect in favour of Ashi points stimulation [odds ratio (OR): 5.89, 95% confidence interval (CI): 2.97 to 11.67, P < 0.01; heterogeneity: χ² = 3.81, P = 0.70, I² = 0%]. One trial compared Ashi points stimulation with drug therapy. The result showed a significantly greater recovery rate in the Ashi points stimulation group (OR: 9.58, 95% CI: 2.69 to 34.12). One trial compared comprehensive treatment of the myofascial trigger points (MTrPs) with no treatment, and the result was in favour of the MTrPs treatment. Ashi points stimulation might be superior to conventional acupuncture, drug therapy and no treatment for shoulder pain. However, due to the low methodological quality of the included studies, a firm conclusion cannot be reached until further high-quality studies are available.

  17. Robust portfolio selection based on asymmetric measures of variability of stock returns

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper introduces a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.
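
    As a baseline for what the paper robustifies, the sketch below solves a plain mean-variance portfolio problem and reports the downside and upside semideviations of the resulting portfolio, the kind of asymmetry the interval random uncertainty set is designed to capture. The asset data and risk-aversion constant are synthetic, and the paper's chance-constrained formulation is not reproduced.

    ```python
    # Sketch: classical mean-variance portfolio selection (the baseline the
    # paper robustifies), plus downside/upside semideviations of the result.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(10)
    returns = rng.normal(0.001, 0.02, size=(500, 4))   # daily returns, 4 assets
    mu, cov = returns.mean(axis=0), np.cov(returns.T)
    risk_aversion = 5.0

    def objective(w):
        # Penalized variance minus expected return.
        return risk_aversion * w @ cov @ w - mu @ w

    res = minimize(objective, np.full(4, 0.25), method="SLSQP",
                   bounds=[(0, 1)] * 4,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
    w = res.x
    port = returns @ w
    down = np.sqrt(np.mean(np.minimum(port - port.mean(), 0) ** 2))
    up = np.sqrt(np.mean(np.maximum(port - port.mean(), 0) ** 2))
    print("weights:", w.round(3), f"downside {down:.4f}, upside {up:.4f}")
    ```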

  18. Positivity, discontinuity, finite resources, and nonzero error for arbitrarily varying quantum channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boche, H., E-mail: boche@tum.de, E-mail: janis.noetzel@tum.de; Nötzel, J., E-mail: boche@tum.de, E-mail: janis.noetzel@tum.de

    2014-12-15

    This work is motivated by a quite general question: Under which circumstances are the capacities of information transmission systems continuous? The research is explicitly carried out on finite arbitrarily varying quantum channels (AVQCs). We give an explicit example that answers, in the affirmative, the recent question of whether the transmission of messages over AVQCs can benefit from assistance by distribution of randomness between the legitimate sender and receiver. The specific class of channels introduced in that example is then extended to show that the unassisted capacity does have discontinuity points, while it is known that the randomness-assisted capacity is always continuous in the channel. We characterize the discontinuity points and prove that the unassisted capacity is always continuous around its positivity points. After having established shared randomness as an important resource, we quantify the interplay between the distribution of finite amounts of randomness between the legitimate sender and receiver, the (nonzero) probability of a decoding error with respect to the average error criterion, and the number of messages that can be sent over a finite number of channel uses. We relate our results to the entanglement transmission capacities of finite AVQCs, where the role of shared randomness is not yet well understood, and give a new sufficient criterion for the entanglement transmission capacity with randomness assistance to vanish.

  19. Dynamic Loads Generation for Multi-Point Vibration Excitation Problems

    NASA Technical Reports Server (NTRS)

    Shen, Lawrence

    2011-01-01

    A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
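
    A generic building block for such methods is synthesizing a Gaussian time history that matches a target power spectral density (PSD) by assigning random phases in the frequency domain and inverse-transforming. The sketch below does exactly that for a flat target PSD; it is a textbook technique, not the NASA tool itself, and the band and level are made up.

    ```python
    # Generic sketch: synthesize a random force time history whose one-sided
    # PSD matches a target, via random phases and an inverse real FFT.
    import numpy as np

    rng = np.random.default_rng(3)
    fs, n = 4096.0, 2 ** 14                      # sample rate (Hz), samples
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    # Target one-sided PSD (N^2/Hz): flat 0.1 between 20 and 1000 Hz.
    psd = np.where((freqs >= 20) & (freqs <= 1000), 0.1, 0.0)

    # Bin amplitude chosen so the synthesized signal reproduces the PSD;
    # each bin gets an independent uniform random phase.
    df = fs / n
    amp = np.sqrt(psd * df / 2.0) * n
    phase = rng.uniform(0, 2 * np.pi, len(freqs))
    force = np.fft.irfft(amp * np.exp(1j * phase), n)

    # Check: RMS should approximate sqrt(integral of PSD) = sqrt(0.1 * 980).
    print(f"target RMS {np.sqrt(0.1 * 980):.2f} N, "
          f"synthesized RMS {force.std():.2f} N")
    ```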

  20. Habitat characteristics of American woodcock nest sites on a managed area in Maine

    USGS Publications Warehouse

    McAuley, D.G.; Longcore, J.R.; Sepik, G.F.; Pendleton, G.W.

    1996-01-01

    We measured characteristics of habitat near 89 nests of American woodcock (Scolopax minor) and 100 randomly selected points on Moosehorn National Wildlife Refuge, Calais, Maine, an area managed for woodcock. At nest sites, basal area was lower (P < 0.05) than at random points; other habitat characteristics did not differ between nest sites and random points (P > 0.05) or between sites of successful nests and nests destroyed by predators, although the large variances of the variables reduced our power to detect differences. Habitat around sites of renests differed from sites of first nests. Sites around first nests had lower basal area of dead trees (P = 0.05) and higher stem densities of aspen (P = 0.03) and cherry saplings (P = 0.001), and viburnum (P = 0.05), while renest sites had taller trees (P = 0.02). The change from nest sites in areas dominated by alders and tree-size gray birch used in 1977-80 to sites dominated by sapling trees, especially aspen, used during 1987-90 suggests that woodcock in the expanding population at the refuge are selecting nest sites created by habitat management since 1979.

  1. Autonomous encoding of irrelevant goals and outcomes by prefrontal cortex neurons.

    PubMed

    Genovesio, Aldo; Tsujimoto, Satoshi; Navarra, Giulia; Falcone, Rossella; Wise, Steven P

    2014-01-29

    Two rhesus monkeys performed a distance discrimination task in which they reported whether a red square or a blue circle had appeared farther from a fixed reference point. Because a new pair of distances was chosen randomly on each trial, and because the monkeys had no opportunity to correct errors, no information from the previous trial was relevant to a current one. Nevertheless, many prefrontal cortex neurons encoded the outcome of the previous trial on current trials. A smaller, intermingled population of cells encoded the spatial goal on the previous trial or the features of the chosen stimuli, such as color or shape. The coding of previous outcomes and goals began at various times during a current trial, and it was selective in that prefrontal cells did not encode other information from the previous trial. The monitoring of previous goals and outcomes often contributes to problem solving, and it can support exploratory behavior. The present results show that such monitoring occurs autonomously and selectively, even when irrelevant to the task at hand.

  2. The influence of the level formants on the perception of synthetic vowel sounds

    NASA Astrophysics Data System (ADS)

    Kubzdela, Henryk; Owsianny, Mariuz

    A computer model of a generator of periodic complex sounds simulating consonants was developed. The system makes possible independent regulation of the level of each of the formants and instant generation of the sound. A trapezoid approximates the curve of the spectrum within the range of the formant. In using this model, each person in a group of six listeners experimentally selected synthesis parameters for six sounds that to him seemed optimal approximations of Polish consonants. From these, another six sounds were selected that were identified by a majority of the six persons and several additional listeners as being best qualified to serve as prototypes of Polish consonants. These prototypes were then used to randomly create sounds with various combinations at the level of the second and third formant and these were presented to seven listeners for identification. The results of the identifications are presented in table form in three variants and are described from the point of view of the requirements of automatic recognition of consonants in continuous speech.

  3. Visually based path-planning by Japanese monkeys.

    PubMed

    Mushiake, H; Saito, N; Sakamoto, K; Sato, Y; Tanji, J

    2001-03-01

    To construct an animal model of strategy formation, we designed a maze path-finding task. First, we asked monkeys to capture a goal in the maze by moving a cursor on the screen. Cursor movement was linked to movements of each wrist. When the animals learned the association between cursor movement and wrist movement, we established a start and a goal in the maze, and asked them to find a path between them. We found that the animals took the shortest pathway, rather than approaching the goal randomly. We further found that the animals adopted a strategy of selecting a fixed intermediate point in the visually presented maze to select one of the shortest pathways, suggesting a visually based path planning. To examine their capacity to use that strategy flexibly, we transformed the task by blocking pathways in the maze, providing a problem to solve. The animals then developed a strategy of solving the problem by planning a novel shortest path from the start to the goal and rerouting the path to bypass the obstacle.

  4. Adaptive consensus of scale-free multi-agent system by randomly selecting links

    NASA Astrophysics Data System (ADS)

    Mou, Jinping; Ge, Huafeng

    2016-06-01

    This paper investigates an adaptive consensus problem for distributed scale-free multi-agent systems (SFMASs) with randomly selected links, where the degree of each node follows a power-law distribution. The random link selection is based on the assumption that every agent decides, with a certain probability, which links to its neighbours to use according to the received data. Accordingly, a novel consensus protocol based on the range of the received data is developed, and each node updates its state according to the protocol. Using an iterative method and the Cauchy inequality, the theoretical analysis shows that all errors among agents converge to zero, and several consensus criteria are obtained in the process. A numerical example demonstrates the reliability of the proposed methods.
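
    A toy version of the setting is sketched below: agents on a Barabasi-Albert (power-law degree) graph repeatedly average toward a randomly selected subset of neighbours, and the disagreement shrinks toward zero. The paper's actual protocol, which uses the range of the received data, is not reproduced; the step size and selection probability are arbitrary.

    ```python
    # Toy sketch: consensus on a scale-free network with randomly selected
    # links at each step. Illustrates the setting, not the paper's protocol.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(4)
    G = nx.barabasi_albert_graph(100, m=2, seed=4)   # power-law degree graph
    x = rng.random(G.number_of_nodes())              # initial agent states

    eps, p_select = 0.3, 0.7                         # step size, link probability
    for step in range(200):
        x_new = x.copy()
        for i in G.nodes:
            nbrs = [j for j in G.neighbors(i) if rng.random() < p_select]
            if nbrs:                                 # move toward selected neighbours
                x_new[i] += eps * np.mean([x[j] - x[i] for j in nbrs])
        x = x_new

    print(f"max disagreement after 200 steps: {x.max() - x.min():.2e}")
    ```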

  5. Automated corresponding point candidate selection for image registration using wavelet transformation neural network with rotation invariant inputs and context information about neighboring candidates

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Suezaki, Masashi; Sueyasu, Hideki; Arai, Kohei

    2003-03-01

    An automated method that can select corresponding point candidates is developed. This method has the following three features: 1) employment of the RIN-net for corresponding point candidate selection; 2) employment of multiresolution analysis with the Haar wavelet transformation to improve selection accuracy and noise tolerance; 3) employment of context information about corresponding point candidates for screening of selected candidates. Here, 'RIN-net' refers to a back-propagation-trained, feed-forward, 3-layer artificial neural network that takes rotation invariants as input data. In our system, pseudo-Zernike moments are employed as the rotation invariants. The RIN-net has an N x N pixel field of view (FOV). Some experiments are conducted to evaluate the corresponding point candidate selection capability of the proposed method using various kinds of remotely sensed images. The experimental results show that the proposed method requires fewer training patterns and less training time, and achieves higher selection accuracy, than the conventional method.

  6. Effect of Expanding Medicaid for Parents on Children’s Health Insurance Coverage

    PubMed Central

    DeVoe, Jennifer E.; Marino, Miguel; Angier, Heather; O’Malley, Jean P.; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J.; Bailey, Steffani R.; Gallia, Charles; Gold, Rachel

    2016-01-01

    IMPORTANCE In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children. This association persisted even after expansions in child-only public health insurance. Oregon's randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. OBJECTIVE To estimate the effect on a child's health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. DESIGN, SETTING, AND PARTICIPANTS Oregon Experiment randomized natural experiment assessing the results of Oregon's 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent randomly selected to apply for Medicaid on their child's Medicaid or Children's Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children's coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14 409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. EXPOSURES For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. MAIN OUTCOMES AND MEASURES Children's Medicaid or CHIP coverage, assessed monthly and in 6-month intervals relative to their parent's selection date. RESULTS In the immediate period after selection, the number of covered children whose parents were selected to apply increased significantly from 3830 (61.4%) to 4152 (66.6%), compared with a nonsignificant change from 5049 (61.8%) to 5044 (61.7%) for children whose parents were not selected to apply. Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after the parent's selection compared with children whose parents were not selected (adjusted odds ratio [AOR] = 1.18; 95% CI, 1.10–1.27). The effect remained significant during months 7 to 12 (AOR = 1.11; 95% CI, 1.03–1.19); months 13 to 18 showed a positive but not significant effect (AOR = 1.07; 95% CI, 0.99–1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR = 2.37; 95% CI, 2.14–2.64). CONCLUSIONS AND RELEVANCE Children's odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents' access to Medicaid coverage and their children's coverage. PMID:25561041
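
    The intent-to-treat analysis shape can be sketched as a logistic GEE with exchangeable correlation within child, regressing monthly coverage on whether the parent was selected to apply. The sketch below uses synthetic data; the variable names and effect sizes are illustrative, not the study's dataset.

    ```python
    # Sketch: logistic GEE (exchangeable within-child correlation) on
    # synthetic monthly coverage data. Names and numbers are made up.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(11)
    n_children, n_months = 400, 6
    child = np.repeat(np.arange(n_children), n_months)
    selected = np.repeat(rng.integers(0, 2, n_children), n_months)
    base = np.repeat(rng.normal(0, 1, n_children), n_months)  # child-level noise
    logit = -0.3 + 0.4 * selected + base
    covered = (rng.random(len(logit)) < 1 / (1 + np.exp(-logit))).astype(int)
    df = pd.DataFrame({"child": child, "selected": selected, "covered": covered})

    res = smf.gee("covered ~ selected", groups="child", data=df,
                  family=sm.families.Binomial(),
                  cov_struct=sm.cov_struct.Exchangeable()).fit()
    print("odds ratio for selection:", np.exp(res.params["selected"]).round(2))
    ```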

  7. Principal component analysis of binding energies for single-point mutants of hT2R16 bound to an agonist correlate with experimental mutant cell response.

    PubMed

    Chen, Derek E; Willick, Darryl L; Ruckel, Joseph B; Floriano, Wely B

    2015-01-01

    Directed evolution is a technique that enables the identification of mutants of a particular protein that carry a desired property by successive rounds of random mutagenesis, screening, and selection. This technique has many applications, including the development of G protein-coupled receptor-based biosensors and designer drugs for personalized medicine. Although effective, directed evolution is not without challenges and can greatly benefit from the development of computational techniques to predict the functional outcome of single-point amino acid substitutions. In this article, we describe a molecular dynamics-based approach to predict the effects of single amino acid substitutions on agonist binding (salicin) to a human bitter taste receptor (hT2R16). An experimentally determined functional map of single-point amino acid substitutions was used to validate the whole-protein molecular dynamics-based predictive functions. Molecular docking was used to construct a wild-type agonist-receptor complex, providing a starting structure for single-point substitution simulations. The effects of each single amino acid substitution in the functional response of the receptor to its agonist were estimated using three binding energy schemes with increasing inclusion of solvation effects. We show that molecular docking combined with molecular mechanics simulations of single-point mutants of the agonist-receptor complex accurately predicts the functional outcome of single amino acid substitutions in a human bitter taste receptor.
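
    The analysis pattern, though not the paper's actual energies, can be sketched as follows: assemble a mutants-by-energy-terms matrix, take the leading principal component, and correlate the component scores with the measured response. All numbers below are synthetic.

    ```python
    # Sketch of the analysis pattern: PCA over per-mutant binding-energy
    # terms, then correlation of the leading component with a response.
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    n_mutants, n_terms = 40, 6                  # mutants x energy components

    latent = rng.normal(size=n_mutants)         # hidden "binding strength"
    energies = (np.outer(latent, rng.normal(size=n_terms))
                + 0.3 * rng.normal(size=(n_mutants, n_terms)))
    response = 2.0 * latent + 0.5 * rng.normal(size=n_mutants)

    pc1 = PCA(n_components=1).fit_transform(energies).ravel()
    r, p = pearsonr(pc1, response)              # sign of PC1 is arbitrary
    print(f"PC1 vs. response: r = {r:+.2f} (p = {p:.1e})")
    ```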

  8. The impact of rehabilitation and counseling services on the labor market activity of Social Security Disability Insurance (SSDI) beneficiaries.

    PubMed

    Weathers, Robert R; Bailey, Michelle Stegman

    2014-01-01

    We use data from a social experiment to estimate the impact of a rehabilitation and counseling program on the labor market activity of newly entitled Social Security Disability Insurance (SSDI) beneficiaries. Our results indicate that the program led to a 4.6 percentage point increase in the receipt of employment services within the first year following random assignment and a 5.1 percentage point increase in participation in the Social Security Administration's Ticket to Work program within the first three years following random assignment. The program led to a 5.3 percentage point increase, or almost 50 percent increase, in employment, and an $831 increase in annual earnings in the second calendar year after the calendar year of random assignment. The employment and earnings impacts are smaller and not statistically significant in the third calendar year following random assignment, and we describe SSDI rules that are consistent with this finding. Our findings indicate that disability reform proposals focusing on restoring the work capacity of people with disabilities can increase the disability employment rate.

  9. Comparability of item quality indices from sparse data matrices with random and non-random missing data patterns.

    PubMed

    Wolfe, Edward W; McGill, Michael T

    2011-01-01

    This article summarizes a simulation study of the performance of five item quality indicators (the weighted and unweighted versions of the mean square and standardized mean square fit indices and the point-measure correlation) under conditions of relatively high and low amounts of missing data under both random and conditional patterns of missing data for testing contexts such as those encountered in operational administrations of a computerized adaptive certification or licensure examination. The results suggest that weighted fit indices, particularly the standardized mean square index, and the point-measure correlation provide the most consistent information between random and conditional missing data patterns and that these indices perform more comparably for items near the passing score than for items with extreme difficulty values.
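
    Under a dichotomous Rasch model, three of the five indices have compact formulas: the outfit mean square is the average squared standardized residual, the infit mean square weights residuals by the model variance, and the point-measure correlation correlates item responses with person measures. The sketch below computes them on a sparse (30% missing) synthetic matrix; it is a minimal illustration, not the study's simulation design.

    ```python
    # Minimal sketch of outfit MS, infit MS, and the point-measure
    # correlation for dichotomous Rasch data with missing responses (NaN).
    import numpy as np

    def item_fit(x, theta, b):
        """x: persons x items matrix (0/1/NaN); theta, b: Rasch measures."""
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))  # P(correct)
        w = p * (1 - p)                               # model variance
        z2 = (x - p) ** 2 / w                         # squared std. residuals
        obs = ~np.isnan(x)
        outfit = np.nanmean(np.where(obs, z2, np.nan), axis=0)
        infit = (np.nansum(np.where(obs, (x - p) ** 2, 0), axis=0)
                 / np.nansum(np.where(obs, w, 0), axis=0))
        pm = np.array([np.corrcoef(x[obs[:, i], i], theta[obs[:, i]])[0, 1]
                       for i in range(x.shape[1])])   # point-measure correlation
        return outfit, infit, pm

    rng = np.random.default_rng(6)
    theta, b = rng.normal(size=500), np.linspace(-1, 1, 10)
    p_true = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    x = (rng.random((500, 10)) < p_true).astype(float)
    x[rng.random((500, 10)) < 0.3] = np.nan           # 30% random missingness

    outfit, infit, pm = item_fit(x, theta, b)
    print("outfit:", outfit.round(2), "\ninfit:", infit.round(2))
    ```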

  10. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: An instrumental variables re-analysis of randomized clinical trials

    PubMed Central

    Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.

    2014-01-01

    Background Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504

  11. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: an instrumental variables re-analysis of randomized clinical trials.

    PubMed

    Humphreys, Keith; Blodgett, Janet C; Wagner, Todd H

    2014-11-01

    Observational studies of Alcoholics Anonymous' (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study, therefore, employed an innovative statistical technique to derive a selection bias-free estimate of AA's impact. Six data sets from 5 National Institutes of Health-funded randomized trials (1 with 2 independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol-dependent individuals in one of the data sets (n = 774) were analyzed separately from the rest of sample (n = 1,582 individuals pooled from 5 data sets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In 5 of the 6 data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = 0.38, p = 0.001) and 15-month (B = 0.42, p = 0.04) follow-up. However, in the remaining data set, in which preexisting AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. For most individuals seeking help for alcohol problems, increasing AA attendance leads to short- and long-term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high preexisting AA involvement, further increases in AA attendance may have little impact. Copyright © 2014 by the Research Society on Alcoholism.
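
    The estimation idea is easy to reproduce in miniature: with randomization as a single binary instrument and no covariates, two-stage least squares reduces to the Wald ratio of outcome and attendance differences between arms. The sketch below applies it to synthetic data in which self-selection biases the naive regression; all effect sizes are made up.

    ```python
    # Minimal 2SLS sketch with randomization as the instrument: assignment
    # Z encourages attendance A, which affects abstinence Y, while an
    # unobserved motivation U confounds the naive estimate.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 5000
    u = rng.normal(size=n)                       # unobserved motivation
    z = rng.integers(0, 2, n)                    # randomized facilitation arm
    a = 2.0 * z + 1.5 * u + rng.normal(size=n)   # AA attendance (self-selected)
    y = 0.4 * a + 2.0 * u + rng.normal(size=n)   # abstinence (true effect 0.4)

    # Naive OLS of Y on A is biased upward by self-selection through U.
    naive = np.polyfit(a, y, 1)[0]

    # With a single binary instrument, 2SLS equals the Wald ratio.
    iv = ((y[z == 1].mean() - y[z == 0].mean())
          / (a[z == 1].mean() - a[z == 0].mean()))
    print(f"naive OLS estimate {naive:.2f}, IV estimate {iv:.2f} (truth 0.4)")
    ```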

  12. Practical guidance on characterizing availability in resource selection functions under a use-availability design

    USGS Publications Warehouse

    Northrup, Joseph M.; Hooten, Mevin B.; Anderson, Charles R.; Wittemyer, George

    2013-01-01

    Habitat selection is a fundamental aspect of animal ecology, the understanding of which is critical to management and conservation. Global positioning system data from animals allow fine-scale assessments of habitat selection and typically are analyzed in a use-availability framework, whereby animal locations are contrasted with random locations (the availability sample). Although most use-availability methods are in fact spatial point process models, they often are fit using logistic regression. This framework offers numerous methodological challenges, for which the literature provides little guidance. Specifically, the size and spatial extent of the availability sample influence coefficient estimates, potentially causing interpretational bias. We examined the influence of availability on statistical inference through simulations and analysis of serially correlated mule deer GPS data. Bias in estimates arose from incorrectly assessing and sampling the spatial extent of availability. Spatial autocorrelation in covariates, which is common for landscape characteristics, exacerbated the error in availability sampling, leading to increased bias. These results have strong implications for habitat selection analyses using GPS data, which are increasingly prevalent in the literature. We recommend researchers assess the sensitivity of their results to their availability sample and, where bias is likely, take care with interpretations and use cross-validation to assess robustness.
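
    A minimal illustration of the recommended sensitivity check follows: the same use-availability logistic regression is refit with availability samples of increasing size, and the selection coefficient is compared across fits. The landscape, covariate and selection strength are simulated placeholders, not the mule deer data.

```python
# Sensitivity of a use-availability logistic regression to the availability
# sample size, on a simulated landscape (all values are illustrative).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def covariate(xy):
    """A smooth covariate surface over the unit square."""
    return xy[:, 1] + 0.3 * np.sin(4 * np.pi * xy[:, 0])

# "Used" locations: animals select high covariate values (true strength = 2.0).
cand = rng.uniform(size=(20000, 2))
w = np.exp(2.0 * covariate(cand))
use = cand[rng.uniform(size=len(cand)) < w / w.max()][:500]

for n_avail in (500, 5000, 50000):
    avail = rng.uniform(size=(n_avail, 2))       # availability sample
    X = covariate(np.vstack([use, avail])).reshape(-1, 1)
    y = np.r_[np.ones(len(use)), np.zeros(n_avail)]
    coef = LogisticRegression(C=1e6, max_iter=1000).fit(X, y).coef_[0, 0]
    print(f"availability n={n_avail:6d}  selection coefficient = {coef:.2f}")
```

    As the availability sample grows (while spanning the full extent of availability), the estimated coefficient stabilizes near the true point-process value; a coefficient that keeps drifting with the availability sample is the warning sign the authors describe.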

  13. Molecular mechanisms of retroviral integration site selection

    PubMed Central

    Kvaratskhelia, Mamuka; Sharma, Amit; Larue, Ross C.; Serrao, Erik; Engelman, Alan

    2014-01-01

    Retroviral replication proceeds through an obligate integrated DNA provirus, making retroviral vectors attractive vehicles for human gene therapy. Though most of the host cell genome is available for integration, the process of integration site selection is not random. Retroviruses differ in their choice of chromatin-associated features and also prefer particular nucleotide sequences at the point of insertion. Lentiviruses including HIV-1 preferentially integrate within the bodies of active genes, whereas the prototypical gammaretrovirus Moloney murine leukemia virus (MoMLV) favors strong enhancers and active gene promoter regions. Integration is catalyzed by the viral integrase protein, and recent research has demonstrated that HIV-1 and MoMLV targeting preferences are in large part guided by integrase-interacting host factors (LEDGF/p75 for HIV-1 and BET proteins for MoMLV) that tether viral intasomes to chromatin. In each case, the selectivity of epigenetic marks on histones recognized by the protein tether helps to determine the integration distribution. In contrast, nucleotide preferences at integration sites seem to be governed by the ability of the integrase protein to locally bend the DNA duplex for pairwise insertion of the viral DNA ends. We discuss approaches to alter integration site selection that could potentially improve the safety of retroviral vectors in the clinic. PMID:25147212

  14. Mutation-selection equilibrium in games with mixed strategies.

    PubMed

    Tarnita, Corina E; Antal, Tibor; Nowak, Martin A

    2009-11-07

    We develop a new method for studying stochastic evolutionary game dynamics of mixed strategies. We consider the general situation: there are n pure strategies whose interactions are described by an n × n payoff matrix. Players can use mixed strategies, which are given by the vector (p_1, ..., p_n). Each entry specifies the probability to use the corresponding pure strategy. The sum over all entries is one. Therefore, a mixed strategy is a point in the simplex S_n. We study evolutionary dynamics in a well-mixed population of finite size. Individuals reproduce in proportion to payoff. We consider the case of weak selection, which means the payoff from the game is only a small contribution to overall fitness. Reproduction can be subject to mutation; a mutant adopts a randomly chosen mixed strategy. We calculate the average abundance of every mixed strategy in the stationary distribution of the mutation-selection process. We find the crucial conditions that specify if a strategy is favored or opposed by selection. One condition holds for low mutation rate, another for high mutation rate. The result for any mutation rate is a linear combination of those two. As a specific example we study the Hawk-Dove game. We prove general statements about the relationship between games with pure and with mixed strategies.
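
    The sketch below illustrates the setup (not the authors' analytical results): mixed strategies as points in the simplex, payoffs from an n × n matrix, and mutants drawing uniformly random mixed strategies. The Hawk-Dove payoff values are illustrative; the printout demonstrates the indifference property at the mixed equilibrium.

```python
# Mixed strategies in the simplex, payoffs from an n x n matrix, and uniformly
# random mutants, as described above. Hawk-Dove payoffs are illustrative.
import numpy as np

rng = np.random.default_rng(2)

# Hawk-Dove payoff matrix with benefit b=4, cost c=6: A[i, j] is the payoff
# to pure strategy i (0=Hawk, 1=Dove) against pure strategy j.
A = np.array([[(4 - 6) / 2, 4.0],
              [0.0,         2.0]])

def payoff(p, q, A):
    """Expected payoff of mixed strategy p against mixed strategy q."""
    return p @ A @ q

# A mutant adopts a uniformly random mixed strategy: Dirichlet(1, ..., 1) is
# the uniform distribution on the simplex S_n.
mutants = rng.dirichlet(np.ones(2), size=100000)

# Payoff of each random mutant against the mixed equilibrium p* = (b/c, 1-b/c).
p_star = np.array([4 / 6, 2 / 6])
pi = mutants @ A @ p_star
print("equilibrium payoff vs itself:  ", payoff(p_star, p_star, A))
print("mean mutant payoff vs p*:      ", pi.mean())
print("max mutant payoff vs p*:       ", pi.max())  # all equal: indifference
```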

  15. Extensively Parameterized Mutation-Selection Models Reliably Capture Site-Specific Selective Constraint.

    PubMed

    Spielman, Stephanie J; Wilke, Claus O

    2016-11-01

    The mutation-selection model of coding sequence evolution has received renewed attention for its use in estimating site-specific amino acid propensities and selection coefficient distributions. Two computationally tractable mutation-selection inference frameworks have been introduced: One framework employs a fixed-effects, highly parameterized maximum likelihood approach, whereas the other employs a random-effects Bayesian Dirichlet Process approach. While both implementations follow the same model, they appear to make distinct predictions about the distribution of selection coefficients. The fixed-effects framework estimates a large proportion of highly deleterious substitutions, whereas the random-effects framework estimates that all substitutions are either nearly neutral or weakly deleterious. It remains unknown, however, how accurately each method infers evolutionary constraints at individual sites. Indeed, selection coefficient distributions pool all site-specific inferences, thereby obscuring a precise assessment of site-specific estimates. Therefore, in this study, we use a simulation-based strategy to determine how accurately each approach recapitulates the selective constraint at individual sites. We find that the fixed-effects approach, despite its extensive parameterization, consistently and accurately estimates site-specific evolutionary constraint. By contrast, the random-effects Bayesian approach systematically underestimates the strength of natural selection, particularly for slowly evolving sites. We also find that, despite the strong differences between their inferred selection coefficient distributions, the fixed- and random-effects approaches yield surprisingly similar inferences of site-specific selective constraint. We conclude that the fixed-effects mutation-selection framework provides the more reliable software platform for model application and future development. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Everolimus-Eluting Stents or Bypass Surgery for Left Main Coronary Artery Disease.

    PubMed

    Stone, Gregg W; Sabik, Joseph F; Serruys, Patrick W; Simonton, Charles A; Généreux, Philippe; Puskas, John; Kandzari, David E; Morice, Marie-Claude; Lembo, Nicholas; Brown, W Morris; Taggart, David P; Banning, Adrian; Merkely, Béla; Horkay, Ferenc; Boonstra, Piet W; van Boven, Ad J; Ungi, Imre; Bogáts, Gabor; Mansour, Samer; Noiseux, Nicolas; Sabaté, Manel; Pomar, José; Hickey, Mark; Gershlick, Anthony; Buszman, Pawel; Bochenek, Andrzej; Schampaert, Erick; Pagé, Pierre; Dressler, Ovidiu; Kosmidou, Ioanna; Mehran, Roxana; Pocock, Stuart J; Kappetein, A Pieter

    2016-12-08

    Patients with obstructive left main coronary artery disease are usually treated with coronary-artery bypass grafting (CABG). Randomized trials have suggested that drug-eluting stents may be an acceptable alternative to CABG in selected patients with left main coronary disease. We randomly assigned 1905 eligible patients with left main coronary artery disease of low or intermediate anatomical complexity to undergo either percutaneous coronary intervention (PCI) with fluoropolymer-based cobalt-chromium everolimus-eluting stents (PCI group, 948 patients) or CABG (CABG group, 957 patients). Anatomic complexity was assessed at the sites and defined by a Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery (SYNTAX) score of 32 or lower (the SYNTAX score reflects a comprehensive angiographic assessment of the coronary vasculature, with 0 as the lowest score and higher scores [no upper limit] indicating more complex coronary anatomy). The primary end point was the rate of a composite of death from any cause, stroke, or myocardial infarction at 3 years, and the trial was powered for noninferiority testing of the primary end point (noninferiority margin, 4.2 percentage points). Major secondary end points included the rate of a composite of death from any cause, stroke, or myocardial infarction at 30 days and the rate of a composite of death, stroke, myocardial infarction, or ischemia-driven revascularization at 3 years. Event rates were based on Kaplan-Meier estimates in time-to-first-event analyses. At 3 years, a primary end-point event had occurred in 15.4% of the patients in the PCI group and in 14.7% of the patients in the CABG group (difference, 0.7 percentage points; upper 97.5% confidence limit, 4.0 percentage points; P=0.02 for noninferiority; hazard ratio, 1.00; 95% confidence interval, 0.79 to 1.26; P=0.98 for superiority). The secondary end-point event of death, stroke, or myocardial infarction at 30 days occurred in 4.9% of the patients in the PCI group and in 7.9% in the CABG group (P<0.001 for noninferiority, P=0.008 for superiority). The secondary end-point event of death, stroke, myocardial infarction, or ischemia-driven revascularization at 3 years occurred in 23.1% of the patients in the PCI group and in 19.1% in the CABG group (P=0.01 for noninferiority, P=0.10 for superiority). In patients with left main coronary artery disease and low or intermediate SYNTAX scores by site assessment, PCI with everolimus-eluting stents was noninferior to CABG with respect to the rate of the composite end point of death, stroke, or myocardial infarction at 3 years. (Funded by Abbott Vascular; EXCEL ClinicalTrials.gov number, NCT01205776.)

  17. Effects of major depression on moment-in-time work performance.

    PubMed

    Wang, Philip S; Beck, Arne L; Berglund, Pat; McKenas, David K; Pronk, Nicolaas P; Simon, Gregory E; Kessler, Ronald C

    2004-10-01

    Although major depression is thought to have substantial negative effects on work performance, the possibility of recall bias limits self-report studies of these effects. The authors used the experience sampling method to address this problem by collecting comparative data on moment-in-time work performance among service workers who were depressed and those who were not depressed. The group studied included 105 airline reservation agents and 181 telephone customer service representatives selected from a larger baseline sample; depressed workers were deliberately oversampled. Respondents were given pagers and experience sampling method diaries for each day of the study. A computerized autodialer paged respondents at random time points. When paged, respondents reported on their work performance in the diary. Moment-in-time work performance was assessed at five random times each day over a 7-day data collection period (35 data points for each respondent). Seven conditions (allergies, arthritis, back pain, headaches, high blood pressure, asthma, and major depression) occurred often enough in this group of respondents to be studied. Major depression was the only condition significantly related to decrements in both of the dimensions of work performance assessed in the diaries: task focus and productivity. These effects were equivalent to approximately 2.3 days absent because of sickness per depressed worker per month of being depressed. Previous studies based on days missed from work significantly underestimate the adverse economic effects associated with depression. Productivity losses related to depression appear to exceed the costs of effective treatment.

  18. A comparison of lower canine retraction and loss of anchorage between conventional and self-ligating brackets: a single-center randomized split-mouth controlled trial.

    PubMed

    da Costa Monini, André; Júnior, Luiz Gonzaga Gandini; Vianna, Alexandre Protásio; Martins, Renato Parsekian

    2017-05-01

    To evaluate the rate of lower canine retraction, anchorage loss, and changes in the axial inclination of lower canines and first molars using self-ligating and conventional brackets. Twenty-five adult patients with a treatment plan involving extractions of four first premolars were selected for this split-mouth trial and had either conventional or self-ligating brackets bonded to lower canines in a block randomization. Retraction was accomplished using 100-g nickel titanium closed-coil springs, which were reactivated every 4 weeks. Oblique radiographs were taken before and after total canine retraction and the cephalograms were superimposed on stable structures of the mandible. Cephalometric points were digitized twice by a single-blinded operator for error control and the average of the points was used to determine the following variables: canine cusp horizontal changes, molar cusp horizontal changes, and angulation changes in canines and molars. Paired t tests were used to analyze the blinded data for group differences. All patients reached the final phase without bracket debonding. No differences were found between the two groups for all variables tested. No serious harm was observed. Both brackets showed the same rate of canine retraction and loss of anteroposterior anchorage of the molars. No changes were found between brackets regarding the inclination of canines and first molars. Using self-ligating brackets to retract lower canines does not increase the velocity of tooth movement, does not increase anchorage, and does not decrease tipping.

  19. Randomized and controlled prospective trials of Ultrasound-guided spinal nerve posterior ramus pulsed radiofrequency treatment for lower back post-herpetic neuralgia.

    PubMed

    Pi, Z B; Lin, H; He, G D; Cai, Z; Xu, X Z

    2015-01-01

    To evaluate the efficacy of ultrasound-guided spinal nerve posterior ramus pulsed radiofrequency treatment for lower back post-herpetic neuralgia. 128 patients with acute post-herpetic neuralgia of the lower back or anterior abdominal wall were selected and randomly divided into two groups. Group A: oral treatment only with gabapentin + celecoxib + amitriptyline. Group B: in addition to these drugs, patients were treated with radiofrequency (RF) pulses delivered under guidance from a portable ultrasound device using the paravertebral puncture technique. In both groups, sudden outbreaks of pain were treated with immediate-release 10 mg morphine tablets. A visual analogue scale (VAS) was used for pain scoring, the Pittsburgh Sleep Quality Index (PSQI) was used to evaluate sleep quality, and morphine consumption was recorded at different time points before and after treatment. Treatment efficiency was calculated and the occurrence of complications was documented. At each time point after treatment, VAS scores were lower in both groups, but scores in the RF group were significantly lower than those of the oral-only group. Sleep quality scores and morphine consumption were also significantly lower in the RF group than in the oral-only group. During the procedure, no needle error involving penetration of the abdominal cavity, chest, organs or blood vessels occurred. Ultrasound-guided spinal nerve posterior ramus pulsed radiofrequency treatment of lower back or anterior abdominal wall post-herpetic neuralgia proved effective, reducing morphine use in patients and leading to fewer adverse reactions.

  20. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    PubMed

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma et al., shows convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.
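
    To make the simulation design concrete, the sketch below generates data for one such scenario under assumed between-study parameters: per-study logit-sensitivity and logit-specificity are drawn from a bivariate normal random-effects distribution and thinned by binomial sampling. The heterogeneity and correlation values are placeholders, not those of the published study.

```python
# Simulating one bivariate diagnostic meta-analysis scenario (Se 90%, Sp 93%,
# one of the pairs listed above); tau and rho are assumed values.
import numpy as np

rng = np.random.default_rng(3)

def logit(p):  return np.log(p / (1 - p))
def expit(x):  return 1 / (1 + np.exp(-x))

n_studies = 30
mu = np.array([logit(0.90), logit(0.93)])       # scenario means on the logit scale
tau = np.array([0.5, 0.5])                      # between-study SDs (assumed)
rho = -0.4                                      # Se/Sp trade-off (assumed)
cov = np.array([[tau[0]**2, rho * tau[0] * tau[1]],
                [rho * tau[0] * tau[1], tau[1]**2]])

theta = rng.multivariate_normal(mu, cov, size=n_studies)
n_dis = rng.integers(30, 200, n_studies)        # diseased subjects per study
n_hea = rng.integers(30, 200, n_studies)        # healthy subjects per study
tp = rng.binomial(n_dis, expit(theta[:, 0]))    # true positives
tn = rng.binomial(n_hea, expit(theta[:, 1]))    # true negatives

# Crude logit-mean summary; the SAS/Stata/R bivariate mixed models being
# compared would instead be fit to (tp, n_dis, tn, n_hea).
print("pooled Se ~", expit(logit((tp + 0.5) / (n_dis + 1)).mean()))
print("pooled Sp ~", expit(logit((tn + 0.5) / (n_hea + 1)).mean()))
```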

  1. The analgesic effect of photobiomodulation therapy (830 nm) on the masticatory muscles: a randomized, double-blind study.

    PubMed

    Costa, Sabrina Araújo Pinho; Florezi, Giovanna Piacenza; Artes, Gisele Ebling; Costa, Jessica Ribeiro da; Gallo, Rosane Tronchin; Freitas, Patricia Moreira de; Witzel, Andrea Lusvarghi

    2017-12-18

    This study assesses the efficacy of photobiomodulation therapy (830 nm) for the treatment of myalgia of the masticatory muscles. Sixty patients with muscular myalgia were selected and randomly allocated into 2 groups (n=30): Group A comprised patients given a placebo (control), and Group B consisted of those undergoing photobiomodulation therapy (PBMT). PBMT and placebo were applied bilaterally to specific points on the masseter and temporal muscles. Referred pain elicited by palpation and maximum mouth opening were measured before (EV1) and after (EV2) the treatments. The data were analyzed using statistical tests, considering a significance level of 5%. No significant differences in range were observed for active or passive mouth opening (p ≥ 0.05). Comparing the final outcomes (EV1-EV2) of both treatments, statistical significance was verified for total pain in the right masseter muscle (p = 0.001) and total pain (p = 0.005). In EV2, significant differences in pain reported with palpation were found between Groups A and B for the following: left posterior temporal muscle (p = 0.025), left superior masseter muscle (p = 0.036), inferior masseter muscle (p = 0.021), total pain (left side) (p = 0.009), total masseter muscle (left side) (p = 0.014), total temporal (left side) (p = 0.024), and total pain (p = 0.035). We concluded that PBMT (830 nm) reduces pain at painful points, but does not influence the extent of mouth opening in patients with myalgia.

  2. Model selection and averaging in the assessment of the drivers of household food waste to reduce the probability of false positives.

    PubMed

    Grainger, Matthew James; Aramyan, Lusine; Piras, Simone; Quested, Thomas Edward; Righi, Simone; Setti, Marco; Vittuari, Matteo; Stewart, Gavin Bruce

    2018-01-01

    Food waste from households contributes the greatest proportion to total food waste in developed countries. Therefore, food waste reduction requires an understanding of the socio-economic (contextual and behavioural) factors that lead to its generation within the household. Addressing such a complex subject calls for sound methodological approaches that until now have been conditioned by the large number of factors involved in waste generation, by the lack of a recognised definition, and by limited available data. This work contributes to food waste generation literature by using one of the largest available datasets that includes data on the objective amount of avoidable household food waste, along with information on a series of socio-economic factors. In order to address one aspect of the complexity of the problem, machine learning algorithms (random forests and boruta) for variable selection integrated with linear modelling, model selection and averaging are implemented. Model selection addresses model structural uncertainty, which is not routinely considered in assessments of food waste in literature. The main drivers of food waste in the home selected in the most parsimonious models include household size, the presence of fussy eaters, employment status, home ownership status, and the local authority. Results, regardless of which variable set the models are run on, point toward large households as being a key target element for food waste reduction interventions.
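
    As a rough sketch of the pipeline described here (machine-learning variable screening followed by linear model selection and Akaike-weight averaging), the following uses a plain random forest in place of Boruta and synthetic stand-in data rather than the household survey dataset; the variable names are placeholders.

```python
# Variable screening with a random forest, then all-subsets linear modelling
# with AIC-based model averaging, on synthetic stand-in data.
from itertools import combinations
import numpy as np
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n, names = 500, ["hh_size", "fussy_eater", "employed", "owns_home", "noise"]
X = rng.normal(size=(n, len(names)))
y = 1.2 * X[:, 0] + 0.6 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(size=n)

# Step 1: screen variables by random-forest importance (Boruta wraps this idea
# in a formal test against shadow features).
imp = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y).feature_importances_
keep = list(np.argsort(imp)[::-1][:4])

# Step 2: fit every subset of the screened variables; weight models by AIC.
fits = []
for k in range(1, len(keep) + 1):
    for subset in combinations(keep, k):
        res = sm.OLS(y, sm.add_constant(X[:, list(subset)])).fit()
        fits.append((res.aic, subset))
aics = np.array([a for a, _ in fits])
weights = np.exp(-(aics - aics.min()) / 2)
weights /= weights.sum()
for (aic, subset), w in sorted(zip(fits, weights), key=lambda t: t[0][0])[:3]:
    print([names[i] for i in subset], f"AIC={aic:.1f}  weight={w:.2f}")
```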

  3. Model selection and averaging in the assessment of the drivers of household food waste to reduce the probability of false positives

    PubMed Central

    Aramyan, Lusine; Piras, Simone; Quested, Thomas Edward; Righi, Simone; Setti, Marco; Vittuari, Matteo; Stewart, Gavin Bruce

    2018-01-01

    Food waste from households contributes the greatest proportion to total food waste in developed countries. Therefore, food waste reduction requires an understanding of the socio-economic (contextual and behavioural) factors that lead to its generation within the household. Addressing such a complex subject calls for sound methodological approaches that until now have been conditioned by the large number of factors involved in waste generation, by the lack of a recognised definition, and by limited available data. This work contributes to food waste generation literature by using one of the largest available datasets that includes data on the objective amount of avoidable household food waste, along with information on a series of socio-economic factors. In order to address one aspect of the complexity of the problem, machine learning algorithms (random forests and boruta) for variable selection integrated with linear modelling, model selection and averaging are implemented. Model selection addresses model structural uncertainty, which is not routinely considered in assessments of food waste in literature. The main drivers of food waste in the home selected in the most parsimonious models include household size, the presence of fussy eaters, employment status, home ownership status, and the local authority. Results, regardless of which variable set the models are run on, point toward large households as being a key target element for food waste reduction interventions. PMID:29389949

  4. The Responsiveness of Biological Motion Processing Areas to Selective Attention Towards Goals

    PubMed Central

    Herrington, John; Nymberg, Charlotte; Faja, Susan; Price, Elinora; Schultz, Robert

    2012-01-01

    A growing literature indicates that visual cortex areas viewed as primarily responsive to exogenous stimuli are susceptible to top-down modulation by selective attention. The present study examines whether brain areas involved in biological motion perception are among these areas – particularly with respect to selective attention towards human movement goals. Fifteen participants completed a point-light biological motion study following a two-by-two factorial design, with one factor representing an exogenous manipulation of human movement goals (goal-directed versus random movement), and the other an endogenous manipulation (a goal identification task versus an ancillary color-change task). Both manipulations yielded increased activation in the human homologue of motion-sensitive area MT+ (hMT+) as well as the extrastriate body area (EBA). The endogenous manipulation was associated with increased right posterior superior temporal sulcus (STS) activation, whereas the exogenous manipulation was associated with increased activation in left posterior STS. Selective attention towards goals activated a portion of left hMT+/EBA only during the perception of purposeful movement, consistent with emerging theories associating this area with the matching of visual motion input to known goal-directed actions. The overall pattern of results indicates that attention towards the goals of human movement activates biological motion areas. Ultimately, selective attention may explain why some studies examining biological motion show activation in hMT+ and EBA, even when using control stimuli with comparable motion properties. PMID:22796987

  5. Chaos and complexity by design

    DOE PAGES

    Roberts, Daniel A.; Yoshida, Beni

    2017-04-20

    We study the relationship between quantum chaos and pseudorandomness by developing probes of unitary design. A natural probe of randomness is the “frame potential,” which is minimized by unitary k-designs and measures the 2-norm distance between the Haar random unitary ensemble and another ensemble. A natural probe of quantum chaos is the out-of-time-order (OTO) four-point correlation function. We show that the norm squared of a generalization of out-of-time-order 2k-point correlators is proportional to the kth frame potential, providing a quantitative connection between chaos and pseudorandomness. In addition, we prove that these 2k-point correlators for Pauli operators completely determine the k-fold channel of an ensemble of unitary operators. Finally, we use a counting argument to obtain a lower bound on the quantum circuit complexity in terms of the frame potential. This provides a direct link between chaos, complexity, and randomness.
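
    A Monte Carlo estimate of the frame potential F_k = E|Tr(U†V)|^{2k} is straightforward to sketch for the Haar ensemble, where F_k = k! whenever the dimension d >= k; the sampler, dimension and pair count below are illustrative choices.

```python
# Monte Carlo estimate of the frame potential F_k = E |Tr(U^dag V)|^(2k)
# for the Haar ensemble on U(d); for d >= k the exact value is k!.
import numpy as np
from math import factorial

rng = np.random.default_rng(5)

def haar_unitary(d):
    """Sample from the Haar measure on U(d) via QR with phase correction."""
    z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # rescale column phases

def frame_potential(sampler, d, k, n_pairs=2000):
    """Average |Tr(U^dag V)|^(2k) over independently sampled pairs (U, V)."""
    vals = []
    for _ in range(n_pairs):
        u, v = sampler(d), sampler(d)
        vals.append(np.abs(np.trace(u.conj().T @ v)) ** (2 * k))
    return np.mean(vals)

for k in (1, 2):
    print(f"k={k}: estimated F_k = {frame_potential(haar_unitary, 16, k):.2f}"
          f"  (Haar value {factorial(k)})")
```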

  6. Derivatives of random matrix characteristic polynomials with applications to elliptic curves

    NASA Astrophysics Data System (ADS)

    Snaith, N. C.

    2005-12-01

    The value distribution of derivatives of characteristic polynomials of matrices from SO(N) is calculated at the point 1, the symmetry point on the unit circle of the eigenvalues of these matrices. We consider subsets of matrices from SO(N) that are constrained to have at least n eigenvalues equal to 1 and investigate the first non-zero derivative of the characteristic polynomial at that point. The connection between the values of random matrix characteristic polynomials and values of L-functions in families has been well established. The motivation for this work is the expectation that through this connection with L-functions derived from families of elliptic curves, and using the Birch and Swinnerton-Dyer conjecture to relate values of the L-functions to the rank of elliptic curves, random matrix theory will be useful in probing important questions concerning these ranks.
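
    The quantity under study can be explored numerically: sample from SO(N), build the characteristic polynomial, and evaluate its derivative at the symmetry point 1. The sketch below handles only the unconditioned case (no eigenvalues forced to equal 1), so it illustrates the objects involved rather than the paper's constrained ensembles.

```python
# Sample SO(N) matrices and evaluate the derivative of the characteristic
# polynomial det(sI - M) at the symmetry point s = 1 (unconditioned version).
import numpy as np

rng = np.random.default_rng(6)

def so_n(n):
    """Sample from SO(N): QR of a real Gaussian matrix, determinant fixed to +1."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)))
    q = q * np.sign(np.diag(r))          # correct column signs (Haar on O(N))
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]               # flip one column to land in SO(N)
    return q

N, samples = 20, 5000
vals = []
for _ in range(samples):
    coeffs = np.poly(so_n(N))            # characteristic polynomial coefficients
    vals.append(np.polyval(np.polyder(coeffs), 1.0))
print("mean |P'(1)|:  ", np.mean(np.abs(vals)))
print("median |P'(1)|:", np.median(np.abs(vals)))
```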

  7. A stochastic-geometric model of soil variation in Pleistocene patterned ground

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc

    2013-04-01

    In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular, the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.
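
    For comparison purposes, a standard Gaussian random field of the kind the CLT model is tested against can be simulated directly; the sketch below uses a dense Cholesky factor of an assumed exponential covariance on a small grid and summarizes the spatial coherence of the low-valued region by counting connected patches (the CLT model itself is not reproduced here).

```python
# Simulate a Gaussian random field on a small grid (Cholesky of an exponential
# covariance) and summarize connectivity of the low-valued region.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(7)

m = 40                                    # m x m grid; kept small for dense Cholesky
xy = np.stack(np.meshgrid(np.arange(m), np.arange(m)), -1).reshape(-1, 2)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
cov = np.exp(-d / 8.0)                    # exponential covariance, range ~8 cells
L = np.linalg.cholesky(cov + 1e-8 * np.eye(m * m))

field = (L @ rng.normal(size=m * m)).reshape(m, m)
low = field < np.quantile(field, 0.2)     # e.g. "coarse-textured" low-ECa region

components, n = label(low)                # connected components of the low region
sizes = np.bincount(components.ravel())[1:]
print(f"{n} low-value patches; largest spans {sizes.max()} of {low.sum()} cells")
```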

  8. Do pharmacy staff recommend evidenced-based smoking cessation products? A pseudo patron study.

    PubMed

    Chiang, P P C; Chapman, S

    2006-06-01

    To determine whether pharmacy staff recommend evidence-based smoking cessation aids. Pseudo patron visits to 50 randomly selected Sydney pharmacies where the pseudo patron enquired about the 'best' way to quit smoking and about the efficacy of a non-evidence-based cessation product, NicoBloc. Nicotine replacement therapy was universally stocked and the first product recommended by 90% of pharmacies. After prompting, 60% of pharmacies either also recommended NicoBloc or deferred to 'customer choice'. About 34% disparaged the product. Evidence-based smoking cessation advice in Sydney pharmacies is fragile and may be compromised by commercial concerns. Smokers should be provided with independent point-of-sale summaries of evidence of cessation product effectiveness and warned about unsubstantiated claims.

  9. Comparing the Effect of 3 Kinds of Different Materials on the Hemostasis of the Central Venous Catheter

    NASA Astrophysics Data System (ADS)

    Li, Yan-Ming; Liang, Zhen-Zhen; Song, Chun-Lei

    2016-05-01

    To compare the effect of 3 different materials on hemostasis of the puncture site after central venous catheterization. Method: 120 patients receiving chemotherapy through a peripherally inserted central catheter (PICC) at the affiliated hospital of our university between January 2014 and April 2015 were randomly divided into 3 groups. The puncture point was compressed with one of 3 materials of the same specification (3.5 cm × 2 cm): alginate dressing, gelatin sponge, or gauze dressing. Local bleeding at the puncture point within 24 h of puncture and catheter maintenance costs within 72 h of insertion were recorded for the 3 groups. Result: (1) Local bleeding at the puncture point within 24 h: the hemostatic effect of the alginate dressing and the gelatin sponge was better than that of compression gauze, and the difference was statistically significant (P < 0.05); the difference between the gelatin sponge and the alginate dressing was not statistically significant. (2) Catheter maintenance cost: local maintenance costs of the catheter within 72 h of insertion were lowest when the gelatin sponge was used at the puncture point; the difference compared with the alginate dressing and gauze was significant (P < 0.05). Conclusion: Among compression hemostasis materials for the puncture site after PICC insertion, the gelatin sponge is the more effective and economical choice.

  10. Tobacco point-of-sale advertising in Guatemala City, Guatemala and Buenos Aires, Argentina

    PubMed Central

    Mejia, Raul; Szeinman, Debora; Kummerfeldt, Carlos E

    2010-01-01

    Objectives To determine tobacco point of sale advertising prevalence in Guatemala City, Guatemala and Buenos Aires, Argentina. Methods Convenience stores (120 per city) were chosen from randomly selected blocks in low, middle and high socioeconomic neighbourhoods. To assess tobacco point of sale advertising we used a checklist developed in Canada that was translated into Spanish and validated in both countries studied. Analysis was conducted by neighbourhood and store type. Results All stores sold cigarettes and most had tobacco products in close proximity to confectionery. In Guatemala, 60% of stores had cigarette ads. High and middle socioeconomic status neighbourhood stores had more indoor cigarette ads, but these differences were determined by store type: gas stations and supermarkets were more prevalent in high socioeconomic status neighbourhoods and had more indoor cigarette ads. In poorer areas, however, more ads could be seen from outside the stores, more stores were located within 100 metres of schools and fewer stores had ‘No smoking’ or ‘No sales to minors’ signs. In Argentina, 80% of stores had cigarette ads and few differences were observed by neighbourhood socioeconomic status. Compared to Guatemala, ‘No sales to minors’ signs were more prevalent in Argentina. Conclusions Tobacco point of sale advertising is highly prevalent in these two cities of Guatemala and Argentina. An advertising ban should also include this type of advertising. PMID:20530136

  11. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
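
    The first-order second-moment step can be summarized in a few lines: propagate the input mean through the response function and approximate the output variance by the gradient-covariance-gradient product, checking against Monte Carlo as the article does. The response function and input statistics below are toy assumptions, not the shell model.

```python
# First-order second-moment (FOSM) propagation of input mean/covariance through
# a toy response function, with a Monte Carlo check.
import numpy as np

rng = np.random.default_rng(8)

def g(x):
    """Toy structural response (stand-in for the finite element model)."""
    return x[0] ** 2 / (1.0 + x[1]) + 0.5 * x[2]

mu = np.array([2.0, 1.0, 3.0])            # mean input properties (assumed)
cov = np.diag([0.04, 0.01, 0.09])         # input covariance (assumed independent)

# FOSM: mean ~ g(mu); variance ~ grad(g)^T Cov grad(g), gradient by differences.
eps = 1e-6
grad = np.array([(g(mu + eps * e) - g(mu - eps * e)) / (2 * eps) for e in np.eye(3)])
var_fosm = grad @ cov @ grad

# Monte Carlo check, mirroring the article's verification step.
samples = mu + rng.normal(size=(100000, 3)) * np.sqrt(np.diag(cov))
mc = np.apply_along_axis(g, 1, samples)
print(f"FOSM:        mean={g(mu):.3f}  var={var_fosm:.4f}")
print(f"Monte Carlo: mean={mc.mean():.3f}  var={mc.var():.4f}")
```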

  12. Design issues in a randomized controlled trial of a pre-emptive versus empiric antifungal strategy for invasive aspergillosis in patients with high-risk hematologic malignancies.

    PubMed

    Morrissey, C Orla; Chen, Sharon C-A; Sorrell, Tania C; Bradstock, Kenneth F; Szer, Jeffrey; Halliday, Catriona L; Gilroy, Nicole M; Schwarer, Anthony P; Slavin, Monica A

    2011-02-01

    Invasive aspergillosis (IA) is a major cause of mortality in patients with hematological malignancies, due largely to the inability of traditional culture and biopsy methods to make an early or accurate diagnosis. Diagnostic accuracy studies suggest that Aspergillus galactomannan (GM) enzyme immunoassay (ELISA) and Aspergillus PCR-based methods may overcome these limitations, but their impact on patient outcomes should be evaluated in a diagnostic randomized controlled trial (D-RCT). This article describes the methodology of a D-RCT which compares a new pre-emptive strategy (GM-ELISA- and Aspergillus PCR-driven antifungal therapy) with the standard fever-driven empiric antifungal treatment strategy. Issues including primary end-point and patient selection, duration of screening, choice of tests for the pre-emptive strategy, antifungal prophylaxis and bias control, which were considered in the design of the trial, are discussed. We suggest that the template presented herein is considered by researchers when evaluating the utility of new diagnostic tests (ClinicalTrials.gov number, NCT00163722).

  13. Chance performance and floor effects: threats to the validity of the Wechsler Memory Scale--fourth edition designs subtest.

    PubMed

    Martin, Phillip K; Schroeder, Ryan W

    2014-06-01

    The Designs subtest allows for accumulation of raw score points by chance alone, creating the potential for artificially inflated performances, especially in older patients. A random number generator was used to simulate the random selection and placement of cards by 100 test-naive participants, resulting in a mean raw score of 36.26 (SD = 3.86). This resulted in relatively high scaled scores in the 45-54, 55-64, and 65-69 age groups on Designs II. In the latter age group, in particular, the mean simulated performance resulted in a scaled score of 7, with scores 1 SD below and above the performance mean translating to scaled scores of 5 and 8, respectively. The findings indicate that clinicians should use caution when interpreting Designs II performance in these age groups, as our simulations demonstrated that low average to average range scores occur frequently when patients are relying solely on chance performance. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows one to perform substantial model checking, model diagnostics and model selection. Statistical computations are implemented in the public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.

  15. Double versus single stenting for coronary bifurcation lesions: a meta-analysis.

    PubMed

    Katritsis, Demosthenes G; Siontis, George C M; Ioannidis, John P A

    2009-10-01

    Several trials have addressed whether bifurcation lesions require stenting of both the main vessel and side branch, but uncertainty remains on the benefits of such double versus single stenting of the main vessel only. We have conducted a meta-analysis of randomized trials including patients with coronary bifurcation lesions who were randomly selected to undergo percutaneous coronary intervention by either double or single stenting. Six studies (n=1642 patients) were eligible. There was increased risk of myocardial infarction with double stenting (risk ratio, 1.78; P=0.001 by fixed effects; risk ratio, 1.49 with Bayesian meta-analysis). The summary point estimate suggested also an increased risk of stent thrombosis with double stenting, but the difference was not nominally significant given the sparse data (risk ratio, 1.85; P=0.19). No obvious difference was seen for death (risk ratio, 0.81; P=0.66) and target lesion revascularization (risk ratio, 1.09; P=0.67). Stenting of both the main vessel and side branch in bifurcation lesions may increase myocardial infarction and stent thrombosis risk compared with stenting of the main vessel only.
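
    For illustration, the fixed-effects pooling used for such risk ratios is inverse-variance weighting on the log scale; the sketch below applies it to fabricated event counts, not the six trials' data.

```python
# Fixed-effects (inverse-variance) pooling of log risk ratios across trials,
# on fabricated counts for illustration only.
import numpy as np

# (events_double, n_double, events_single, n_single) per hypothetical trial
trials = np.array([[12, 150, 7, 150],
                   [ 9, 120, 5, 118],
                   [15, 200, 8, 195]])

e1, n1, e0, n0 = trials.T
log_rr = np.log((e1 / n1) / (e0 / n0))
var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0        # delta-method variance of log RR
w = 1 / var

pooled = np.sum(w * log_rr) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```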

  16. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Noise effect in metabolic networks

    NASA Astrophysics Data System (ADS)

    Li, Zheng-Yan; Xie, Zheng-Wei; Chen, Tong; Ouyang, Qi

    2009-12-01

    Constraint-based models such as flux balance analysis (FBA) are a powerful tool to study biological metabolic networks. Under the hypothesis that cells operate at an optimal growth rate as the result of evolution and natural selection, this model successfully predicts most cellular behaviours in growth rate. However, the model ignores the fact that cells can change their cellular metabolic states during evolution, leaving optimal metabolic states unstable. Here, we subsume all the cellular processes that change metabolic states under a single term, 'noise', and assume that cells change metabolic states by randomly walking in the feasible solution space. By simulating the state of a cell randomly walking in the constrained solution space of metabolic networks, we found that in a noisy environment cells in optimal states tend to travel away from these points. On considering the competition between the noise effect and the growth effect in cell evolution, we found that there exists a trade-off between these two effects. As a result, the population of cells contains different cellular metabolic states, and the population growth rate remains suboptimal.
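
    The "random walk in feasible solution space" can be mimicked with a hit-and-run sampler over a flux polytope {v : Av <= b}; the two-dimensional toy polytope and growth objective below stand in for a genome-scale network.

```python
# Hit-and-run random walk over a toy flux polytope {v : A v <= b}, illustrating
# how noise keeps states away from the optimal vertex.
import numpy as np

rng = np.random.default_rng(9)

# Constraints: 0 <= v1, v2 <= 1 and v1 + v2 <= 1.5 (a truncated square).
A = np.array([[-1, 0], [0, -1], [1, 0], [0, 1], [1, 1]], float)
b = np.array([0, 0, 1, 1, 1.5], float)

def hit_and_run(x, steps):
    """Sample the polytope by moving to a random point on a random chord."""
    out = []
    for _ in range(steps):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)
        ad, slack = A @ d, b - A @ x           # chord: A(x + t d) <= b
        t_hi = np.min(slack[ad > 0] / ad[ad > 0])
        t_lo = np.max(slack[ad < 0] / ad[ad < 0])
        x = x + rng.uniform(t_lo, t_hi) * d
        out.append(x)
    return np.array(out)

walk = hit_and_run(np.array([0.2, 0.2]), 5000)
growth = walk.sum(axis=1)                      # toy "growth rate" objective
print(f"mean growth {growth.mean():.2f}; share within 1% of the optimum 1.5: "
      f"{(growth > 1.485).mean():.3f}")
```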

  17. Dynamics of habitat selection in birds: adaptive response to nest predation depends on multiple factors.

    PubMed

    Devries, J H; Clark, R G; Armstrong, L M

    2018-05-01

    According to theory, habitat selection by organisms should reflect underlying habitat-specific fitness consequences and, in birds, reproductive success has a strong impact on population growth in many species. Understanding processes affecting habitat selection also is critically important for guiding conservation initiatives. Northern pintails (Anas acuta) are migratory, temperate-nesting birds that breed in greatest concentrations in the prairies of North America and their population remains below conservation goals. Habitat loss and changing land use practices may have decoupled formerly reliable fitness cues with respect to nest habitat choices. We used data from 62 waterfowl nesting study sites across prairie Canada (1997-2009) to examine nest survival, a primary fitness metric, at multiple scales, in combination with estimates of habitat selection (i.e., nests versus random points), to test for evidence of adaptive habitat choices. We used the same habitat covariates in both analyses. Pintail nest survival varied with nest initiation date, nest habitat, pintail breeding pair density, landscape composition and annual moisture. Selection of nesting habitat reflected patterns in nest survival in some cases, indicating adaptive selection, but strength of habitat selection varied seasonally and depended on population density and landscape composition. Adaptive selection was most evident late in the breeding season, at low breeding densities and in cropland-dominated landscapes. Strikingly, at high breeding density, habitat choice appears to become maladaptive relative to nest predation. At larger spatial scales, the relative availability of habitats with low versus high nest survival, and changing land use practices, may limit the reproductive potential of pintails.

  18. Random genetic drift, natural selection, and noise in human cranial evolution.

    PubMed

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  19. Barriers to management of cardiovascular risk in a low-resource setting using hypertension as an entry point.

    PubMed

    Mendis, Shanthi; Abegunde, Dele; Oladapo, Olulola; Celletti, Francesca; Nordet, Porfirio

    2004-01-01

    Assess capacity of health-care facilities in a low-resource setting to implement the absolute risk approach for assessment of cardiovascular risk in hypertensive patients and effective management of hypertension. A descriptive cross-sectional study in Egbeda and Oluyole local government areas of Oyo State in Nigeria in 56 randomly selected primary- (n = 42) and secondary-level (n = 2) health-care and private health-care (n = 12) facilities. One thousand consecutive, known hypertensives attending the selected facilities for follow-up, and health-care providers working in the above randomly selected facilities, were interviewed. About two-thirds of hypertensives utilized primary-care centers both for diagnosis and for follow-up. Laboratory and other investigations to exclude secondary hypertension or to assess target organ damage were not available in the majority of facilities, particularly in primary care. A considerable knowledge and awareness gap related to hypertension and its complications was found, both among patients and health-care providers. Blood pressure control rates were poor (28% with systolic blood pressure (SBP) < 140 mmHg and diastolic blood pressure (DBP) < 90 mmHg) and drug prescription patterns were neither evidence-based nor cost-effective. The majority of patients (73%) in this low socio-economic group (mean monthly income 73 US dollars) had to pay fully, out of their own pocket, for consultations and medications. If the absolute risk approach for assessment of risk and effective management of hypertension is to be implemented in low-resource settings, appropriate policy measures need to be taken to improve the competency of health-care providers, to provide basic laboratory facilities and to develop affordable financing mechanisms.

  20. The selective glycine uptake inhibitor org 25935 as an adjunctive treatment to atypical antipsychotics in predominant persistent negative symptoms of schizophrenia: results from the GIANT trial.

    PubMed

    Schoemaker, Joep H; Jansen, Wim T; Schipper, Jacques; Szegedi, Armin

    2014-04-01

    Using a selective glycine uptake inhibitor as an adjunct to a second-generation antipsychotic (SGA) was hypothesized to ameliorate negative and/or cognitive symptoms in subjects with schizophrenia. Subjects with predominant persistent negative symptoms (previously stabilized ≥3 months on an SGA) were enrolled in a randomized, placebo-controlled trial to investigate adjunctive treatment with Org 25935, a selective inhibitor of the type 1 glycine transporter, over 12 weeks in a flexible dose design. Org 25935 was tested at 4 to 8 mg twice daily and 12 to 16 mg twice daily versus placebo. Primary efficacy outcome was mean change from baseline in Scale for Assessment of Negative Symptoms composite score. Secondary efficacy end points were Positive and Negative Syndrome Scale total and subscale scores, depressive symptoms (Calgary Depression Scale for Schizophrenia), global functioning (Global Assessment of Functioning scale), and cognitive measures using a computerized battery (Central Nervous System Vital Signs). Responder rates were assessed post hoc. A total of 215 subjects were randomized, of which 187 (87%) completed the trial. Neither dose group of Org 25935 differed significantly from placebo on the Scale for Assessment of Negative Symptoms, Positive and Negative Syndrome Scale (total or subscale scores), Global Assessment of Functioning, or the majority of tested cognitive domains. Org 25935 was generally well tolerated within the tested dose range, with no meaningful effects on extrapyramidal symptoms and some reports of reversible visual adverse effects. Org 25935 did not differ significantly from placebo in reducing negative symptoms or improving cognitive functioning when administered as adjunctive treatment to an SGA. In our study population, Org 25935 appeared to be well tolerated in the tested dose ranges.

  1. A systematic review of financial incentives for dietary behavior change.

    PubMed

    Purnell, Jason Q; Gernes, Rebecca; Stein, Rick; Sherraden, Margaret S; Knoblock-Hahn, Amy

    2014-07-01

    In light of the obesity epidemic, there is growing interest in the use of financial incentives for dietary behavior change. Previous reviews of the literature have focused on randomized controlled trials and found mixed results. The purpose of this systematic review is to update and expand on previous reviews by considering a broader range of study designs, including randomized controlled trials, quasi-experimental, observational, and simulation studies testing the use of financial incentives to change dietary behavior and to inform both dietetic practice and research. The review was guided by theoretical consideration of the type of incentive used based on the principles of operant conditioning. There was further examination of whether studies were carried out with an institutional focus. Studies published between 2006 and 2012 were selected for review, and data were extracted regarding study population, intervention design, outcome measures, study duration and follow-up, and key findings. Twelve studies meeting selection criteria were reviewed, with 11 finding a positive association between incentives and dietary behavior change in the short term. All studies pointed to more specific information on the type, timing, and magnitude of incentives needed to motivate individuals to change behavior, the types of incentives and disincentives most likely to affect the behavior of various socioeconomic groups, and promising approaches for potential policy and practice innovations. Limitations of the studies are noted, including the lack of theoretical guidance in the selection of incentive structures and the absence of basic experimental data. Future research should consider these factors, even as policy makers and practitioners continue to experiment with this potentially useful approach to addressing obesity. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  2. 76 FR 51038 - Draft Guidance for Industry: Cell Selection Devices for Point of Care Production of Minimally...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...; formerly Docket No. 2007D-0290] Draft Guidance for Industry: Cell Selection Devices for Point of Care Production of Minimally Manipulated Autologous Peripheral Blood Stem Cells; Withdrawal of Draft Guidance...: Cell Selection Devices for Point of Care Production of Minimally Manipulated Autologous Peripheral...

  3. Hierarchical Solution of the Traveling Salesman Problem with Random Dyadic Tilings

    NASA Astrophysics Data System (ADS)

    Kalmár-Nagy, Tamás; Bak, Bendegúz Dezső

    We propose a hierarchical heuristic approach for solving the Traveling Salesman Problem (TSP) in the unit square. The points are partitioned with a random dyadic tiling and clusters are formed by the points located in the same tile. Each cluster is represented by its geometrical barycenter and a “coarse” TSP solution is calculated for these barycenters. Midpoints are placed at the middle of each edge in the coarse solution. Near-optimal (or optimal) minimum tours are computed for each cluster. The tours are concatenated using the midpoints yielding a solution for the original TSP. The method is tested on random TSPs (independent, identically distributed points in the unit square) up to 10,000 points as well as on a popular benchmark problem (att532 — coordinates of 532 American cities). Our solutions are 8-13% longer than the optimal ones. We also present an optimization algorithm for the partitioning to improve our solutions. This algorithm further reduces the solution errors (by several percent using 1000 iteration steps). The numerical experiments demonstrate the viability of the approach.
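
    A simplified version of this hierarchy is easy to sketch: a regular grid replaces the random dyadic tiling, greedy nearest-neighbour tours replace the near-optimal cluster solutions, and the midpoint concatenation step is collapsed to simple splicing, so tour quality will be worse than the 8-13% reported above.

```python
# Simplified hierarchical TSP heuristic: grid clustering, a coarse tour over
# tile barycenters, and local tours spliced together (not the paper's method).
import numpy as np

rng = np.random.default_rng(10)

def nn_tour(pts):
    """Greedy nearest-neighbour tour order over a point set."""
    left = list(range(len(pts)))
    tour = [left.pop(0)]
    while left:
        last = pts[tour[-1]]
        j = min(left, key=lambda i: np.linalg.norm(pts[i] - last))
        left.remove(j)
        tour.append(j)
    return tour

pts = rng.uniform(size=(2000, 2))
g = 8                                          # g x g grid of tiles
tiles = {}
for i, p in enumerate(pts):
    tiles.setdefault((int(p[0] * g), int(p[1] * g)), []).append(i)

# Coarse tour over tile barycenters, then a local tour inside each tile.
keys = list(tiles)
bary = np.array([pts[tiles[k]].mean(axis=0) for k in keys])
route = []
for ti in nn_tour(bary):
    idx = tiles[keys[ti]]
    route += [idx[j] for j in nn_tour(pts[idx])]

tour_len = sum(np.linalg.norm(pts[route[i]] - pts[route[i - 1]])
               for i in range(len(route)))     # closed tour (wraps around)
print(f"hierarchical tour length: {tour_len:.2f} over {len(pts)} points")
```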

  4. Prevalence of wounds in a community care setting in Ireland.

    PubMed

    McDermott-Scales, L; Cowman, S; Gethin, G

    2009-10-01

    To establish the prevalence of wounds and their management in a community care setting. A multi-site, census point prevalence wound survey was conducted in the following areas: intellectual disability, psychiatry, GP practices, prisons, long-term care private nursing homes, long-term care public nursing homes and the community/public health (district) nursing services on one randomly selected day. Acute services were excluded. Formal ethical approval was obtained. Data were collected using a pre-piloted questionnaire. Education was provided to the nurses completing the tool (n=148). Descriptive statistical analysis was performed. A 97.2% response rate yielded a crude prevalence rate of 15.6% for wounds across nursing disciplines (290/1,854 total census) and 0.2% for the community area (290/133,562 population statistics for the study area). Crude point prevalence ranged from 2.7% in the prison services (7/262 total prison population surveyed) to 33.5% in the intellectual disability services (72/215 total intellectual disability population surveyed). The most frequent wounds recorded were pressure ulcers (crude point prevalence 4%, 76/1,854 total census; excluding category I crude point prevalence was 2.6%, 49/1,854 total census), leg ulcers (crude point prevalence 2.9%, 55/1,854 total census), self-inflicted superficial abrasions (crude point prevalence 2.2%, 41/1,854 total census) and surgical wounds (crude point prevalence 1.7%, 32/1,854 total census). These results support previous international research in that they identify a high prevalence of wounds in the community. The true community prevalence of wounds is arguably much higher, as this study identified only wounds known to the nursing services and excluded acute settings and was conducted on one day.

  5. Reference Conditions for Streams in the Grand Prairie Natural Division of Illinois

    NASA Astrophysics Data System (ADS)

    Sangunett, B.; Dewalt, R.

    2005-05-01

    As part of the Critical Trends Assessment Program (CTAP) of the Illinois Department of Natural Resources (IDNR), 12 potential reference-quality stream sites in the Grand Prairie Natural Division were evaluated in May 2004. This agriculturally dominated region, located in east central Illinois, is the most highly modified in the state. The quality of these sites was assessed using a modified Hilsenhoff Biotic Index, species richness of the Ephemeroptera, Plecoptera, and Trichoptera (EPT) insect orders, and a 12-parameter Habitat Quality Index (HQI). Illinois EPA high-quality fish stations, Illinois Natural History Survey insect collection data, and best professional knowledge were used to choose which streams to evaluate. For analysis, the reference-quality streams were compared to 37 randomly selected meandering streams and 26 randomly selected channelized streams assessed by CTAP between 1997 and 2001. The results showed that the reference streams exceeded the randomly selected streams in the region in both taxa richness and habitat quality. Both random meandering sites and reference-quality sites increased in taxa richness and HQI as stream width increased. Randomly selected channelized streams had about the same taxa richness and HQI regardless of width.

  6. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    PubMed Central

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748

  7. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
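
    The article realizes these steps in SAS; the same logic reads naturally as a short Python sketch (illustrative, not the authors' code): random sampling draws subjects from the population without replacement, and complete randomization shuffles the sample before splitting it into equal groups.

        import random

        random.seed(20110701)   # arbitrary seed, for reproducibility
        population = [f"subject_{i:03d}" for i in range(1, 501)]

        # Random sampling: draw 60 subjects from the population without replacement.
        sample = random.sample(population, 60)

        # Complete randomization: shuffle the sample, then split into equal groups.
        random.shuffle(sample)
        groups = {"A": sample[:30], "B": sample[30:]}
        print({name: len(members) for name, members in groups.items()})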

  8. ⁹⁰Y-PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlier, Thomas, E-mail: thomas.carlier@chu-nantes.fr; Willowson, Kathy P.; Fourkal, Eugene

    Purpose: ⁹⁰Y positron emission tomography (PET) imaging is becoming a recognized modality for postinfusion quantitative assessment following radioembolization therapy. However, the extremely low counts and high random fraction associated with ⁹⁰Y-PET may significantly impair both qualitative and quantitative results. The aim of this work was to study image quality and noise level in relation to the quantification and bias performance of two types of Siemens PET scanners when imaging ⁹⁰Y and to compare experimental results with clinical data from two types of commercially available ⁹⁰Y microspheres. Methods: Data were acquired on both Siemens Biograph TruePoint [non-time-of-flight (TOF)] and Biograph microcomputed tomography (mCT) (TOF) PET/CT scanners. The study was conducted in three phases. The first aimed to assess quantification and bias for different reconstruction methods according to random fraction and number of true counts in the scan. The NEMA 1994 PET phantom was filled with water with one cylindrical insert left empty (air) and the other filled with a solution of ⁹⁰Y. The phantom was scanned for 60 min in the PET/CT scanner every one or two days. The second phase used the NEMA 2001 PET phantom to derive noise and image quality metrics. The spheres and the background were filled with a ⁹⁰Y solution in an 8:1 contrast ratio and four 30 min acquisitions were performed over a one week period. Finally, data from 32 patients (8 treated with Therasphere® and 24 with SIR-Spheres®) were retrospectively reconstructed and activity in the whole field of view and the liver was compared to theoretical injected activity. Results: The contribution of both bremsstrahlung and LSO trues was found to be negligible, allowing data to be decay corrected to obtain correct quantification. In general, the recovered activity for all reconstruction methods was stable over the range studied, with a small bias appearing at extremely high random fraction and low counts for iterative algorithms. Point spread function (PSF) correction and TOF reconstruction in general reduce background variability and noise and increase recovered concentration. Results for patient data indicated a good correlation between the expected and PET-reconstructed activities. A linear relationship between the expected and the measured activities in the organ of interest was observed for all reconstruction methods used: a linearity coefficient of 0.89 ± 0.05 for the Biograph mCT and 0.81 ± 0.05 for the Biograph TruePoint. Conclusions: Due to the low counts and high random fraction, accurate image quantification of ⁹⁰Y during selective internal radionuclide therapy is affected by random coincidence estimation, scatter correction, and any positivity constraint of the algorithm. Nevertheless, phantom and patient studies showed that the impact of the number of true and random coincidences on quantitative results was limited as long as ordinary Poisson ordered-subsets expectation maximization reconstruction algorithms with random smoothing are used. Adding PSF correction and TOF information to the reconstruction greatly improves the image quality in terms of bias, variability, noise reduction, and detectability. In the patient studies, the total activity in the field of view is in general accurately measured by the Biograph mCT and slightly overestimated by the Biograph TruePoint.

  9. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    PubMed

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
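
    As a rough illustration of the frequency-matching step, the sketch below draws controls from a hypothetical population register so that their age/sex distribution matches that of the cases; all field names and sizes are invented. Sampling uniformly within strata of a complete register also yields, in expectation, a geographical distribution in proportion to population density.

        import random
        from collections import Counter, defaultdict

        random.seed(1)
        sexes, ages = "MF", ["15-24", "25-44", "45+"]

        # Hypothetical population register and case series.
        register = [{"id": i, "sex": random.choice(sexes),
                     "age_group": random.choice(ages)} for i in range(5000)]
        cases = [{"sex": random.choice(sexes), "age_group": random.choice(ages)}
                 for _ in range(100)]

        # Target: one control per case, frequency-matched on age group and sex.
        target = Counter((c["sex"], c["age_group"]) for c in cases)
        by_stratum = defaultdict(list)
        for person in register:
            by_stratum[(person["sex"], person["age_group"])].append(person)

        controls = []
        for stratum, n in target.items():
            controls.extend(random.sample(by_stratum[stratum], n))
        print(len(controls), "controls selected")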

  10. Prevalence and magnitude of acidosis sequelae to rice-based feeding regimen followed in Tamil Nadu, India.

    PubMed

    Murugeswari, Rathinam; Valli, Chinnamani; Karunakaran, Raman; Leela, Venkatasubramanian; Pandian, Amaresan Serma Saravana

    2018-04-01

    In Tamil Nadu, a southern state of India, rice is readily available at a low cost; hence, it is cooked (akin to cooking for human consumption) and fed irrationally to cross-bred dairy cattle with poor productivity. A study was therefore carried out to examine the prevalence of acidosis sequelae to a rice-based feeding regimen and to assess its magnitude. A survey was conducted in all 32 districts of Tamil Nadu by randomly selecting two blocks per district, and from each block five villages were randomly selected. From each selected village, 10 dairy farmers belonging to the unorganized sector, owning one or two cross-bred dairy cows in early or mid-lactation, were randomly selected, so that a sample size of 100 farmers per district was maintained. The feeding regimen and milk yield were recorded, and the occurrence of acidosis and incidence of laminitis were ascertained by a veterinarian with confirmative tests to determine the impact of feeding cooked rice to cows. It was observed that 71.5% of farmers in the unorganized sector feed cooked rice to their cattle. The incidence of acidosis increased progressively and significantly (p<0.05), from 29.00% in cows fed 0.5 kg of cooked rice to 69.23% in cows fed more than 2.5 kg of cooked rice. However, the incidence of acidosis remained significantly (p<0.05) lower, at 9.9%, in cows fed a regimen without cooked rice, which suggests a correlation between excessive feeding of cooked rice and the onset of acidosis. Further, the noticeable difference in the incidence of acidosis between cows fed cooked rice and those fed without rice and with limited intake of oil cake indicates a mismatch between energy and protein supply to these cattle. Among cooked rice-based diets, the incidence of laminitis increased progressively (p<0.05) from 9.2% to 37.9% with the quantity of cooked rice in the diet. The study points out the importance of protein supplementation in rice-based feeding regimens to correct the mismatched supply of nitrogen and fermentable organic matter in the rumen. This research has practical implications for animal health, welfare, nutrition, and management.

  11. Prevalence and magnitude of acidosis sequelae to rice-based feeding regimen followed in Tamil Nadu, India

    PubMed Central

    Murugeswari, Rathinam; Valli, Chinnamani; Karunakaran, Raman; Leela, Venkatasubramanian; Pandian, Amaresan Serma Saravana

    2018-01-01

    Background and Aim: In Tamil Nadu, a southern state of India, rice is readily available at a low cost; hence, it is cooked (akin to cooking for human consumption) and fed irrationally to cross-bred dairy cattle with poor productivity. A study was therefore carried out to examine the prevalence of acidosis sequelae to a rice-based feeding regimen and to assess its magnitude. Materials and Methods: A survey was conducted in all 32 districts of Tamil Nadu by randomly selecting two blocks per district, and from each block five villages were randomly selected. From each selected village, 10 dairy farmers belonging to the unorganized sector, owning one or two cross-bred dairy cows in early or mid-lactation, were randomly selected, so that a sample size of 100 farmers per district was maintained. The feeding regimen and milk yield were recorded, and the occurrence of acidosis and incidence of laminitis were ascertained by a veterinarian with confirmative tests to determine the impact of feeding cooked rice to cows. Results: It was observed that 71.5% of farmers in the unorganized sector feed cooked rice to their cattle. The incidence of acidosis increased progressively and significantly (p<0.05), from 29.00% in cows fed 0.5 kg of cooked rice to 69.23% in cows fed more than 2.5 kg of cooked rice. However, the incidence of acidosis remained significantly (p<0.05) lower, at 9.9%, in cows fed a regimen without cooked rice, which suggests a correlation between excessive feeding of cooked rice and the onset of acidosis. Further, the noticeable difference in the incidence of acidosis between cows fed cooked rice and those fed without rice and with limited intake of oil cake indicates a mismatch between energy and protein supply to these cattle. Among cooked rice-based diets, the incidence of laminitis increased progressively (p<0.05) from 9.2% to 37.9% with the quantity of cooked rice in the diet. Conclusion: The study points out the importance of protein supplementation in rice-based feeding regimens to correct the mismatched supply of nitrogen and fermentable organic matter in the rumen. This research has practical implications for animal health, welfare, nutrition, and management. PMID:29805211

  12. A Prospective, Randomized Trial of Routine Duplex Ultrasound Surveillance on Arteriovenous Fistula Maturation.

    PubMed

    Han, Ahram; Min, Seung-Kee; Kim, Mi-Sook; Joo, Kwon Wook; Kim, Jungsun; Ha, Jongwon; Lee, Joongyub; Min, Sang-Il

    2016-10-07

    Use of arteriovenous fistulas, the most preferred type of access for hemodialysis, is limited by their high maturation failure rate. The aim of this study was to assess whether aggressive surveillance with routine duplex ultrasound and intervention can decrease the maturation failure rate of arteriovenous fistulas. We conducted a single-center, parallel-group, randomized, controlled trial of patients undergoing autogenous arteriovenous fistula creation. Patients were randomly assigned (1:1) to either the routine duplex or the selective duplex group. In the routine duplex group, duplex ultrasound and physical examination were performed 2, 4, and 8 weeks postoperatively. In the selective duplex group, duplex examination was performed only when physical examination detected an abnormality. The primary end point was the maturation failure rate 8 weeks after fistula creation. Maturation failure was defined as the inability to achieve clinical maturation (i.e., a successful first use) and failure to achieve sonographic maturation (fistula flow >500 ml/min and diameter >6 mm) within 8 weeks. Between June 14, 2012, and June 25, 2014, 150 patients were enrolled (75 patients in each group), and 118 of those were included in the final analysis. The maturation failure rate was lower in the routine duplex group (8 of 59; 13.6%) than in the selective duplex group (15 of 59; 25.4%), but the difference was not statistically significant (odds ratio, 0.46; 95% confidence interval, 0.18 to 1.19; P=0.10). Factors associated with maturation failure were female sex (odds ratio, 3.84; 95% confidence interval, 1.05 to 14.06; P=0.04), coronary artery disease (odds ratio, 6.36; 95% confidence interval, 1.62 to 24.95; P<0.01), diabetes (odds ratio, 6.10; 95% confidence interval, 1.76 to 21.19; P<0.01), and the preoperative cephalic vein diameter (odds ratio, 0.30; 95% confidence interval, 0.13 to 0.71; P<0.01). Postoperative routine duplex surveillance failed to prove superior to selective duplex after physical examination for reducing arteriovenous fistula maturation failure. However, the wide 95% confidence interval for the effect of the intervention precludes a firm conclusion that routine duplex surveillance was not beneficial. Copyright © 2016 by the American Society of Nephrology.

  13. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  14. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    PubMed

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model.
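
    A minimal sketch of the feature-selection half of such a framework is shown below, using a standard binary PSO (not the RA-PSO variant) with cross-validated k-NN accuracy as the fitness; the joint estimation of classifier parameters described in the abstract is omitted, and scikit-learn plus synthetic data are assumed.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        X, y = make_classification(n_samples=200, n_features=20, random_state=0)
        n_particles, n_features, n_iters = 10, X.shape[1], 20

        def fitness(mask):
            """Cross-validated k-NN accuracy on the selected feature subset."""
            if not mask.any():
                return 0.0
            clf = KNeighborsClassifier(n_neighbors=3)
            return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

        pos = (rng.random((n_particles, n_features)) < 0.5).astype(int)
        vel = rng.normal(0.0, 1.0, (n_particles, n_features))
        pbest = pos.copy()
        pbest_fit = np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_fit.argmax()].copy()

        for _ in range(n_iters):
            r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            # Binary position update via a sigmoid of the velocity.
            pos = (rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
            fit = np.array([fitness(p) for p in pos])
            improved = fit > pbest_fit
            pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
            gbest = pbest[pbest_fit.argmax()].copy()

        print("selected features:", np.flatnonzero(gbest), "fitness:", pbest_fit.max())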

  15. T-DNA-genome junctions form early after infection and are influenced by the chromatin state of the host genome

    PubMed Central

    Tripathi, Pooja; Muth, Theodore R.

    2017-01-01

    Agrobacterium tumefaciens mediated T-DNA integration is a common tool for plant genome manipulation. However, there is controversy regarding whether T-DNA integration is biased towards genes or randomly distributed throughout the genome. In order to address this question, we performed high-throughput mapping of T-DNA-genome junctions obtained in the absence of selection at several time points after infection. T-DNA-genome junctions were detected as early as 6 hours post-infection. T-DNA distribution was apparently uniform throughout the chromosomes, yet local biases toward AT-rich motifs and T-DNA border sequence micro-homology were detected. Analysis of the epigenetic landscape of previously isolated sites of T-DNA integration in Kanamycin-selected transgenic plants showed an association with extremely low methylation and nucleosome occupancy. Conversely, non-selected junctions from this study showed no correlation with methylation and had chromatin marks, such as high nucleosome occupancy and high H3K27me3, that correspond to three-dimensional-interacting heterochromatin islands embedded within euchromatin. Such structures may play a role in capturing and silencing invading T-DNA. PMID:28742090

  16. Genetic improvement in mastitis resistance: comparison of selection criteria from cross-sectional and random regression sire models for somatic cell score.

    PubMed

    Odegård, J; Klemetsdal, G; Heringstad, B

    2005-04-01

    Several selection criteria for reducing incidence of mastitis were developed from a random regression sire model for test-day somatic cell score (SCS). For comparison, sire transmitting abilities were also predicted based on a cross-sectional model for lactation mean SCS. Only first-crop daughters were used in genetic evaluation of SCS, and the different selection criteria were compared based on their correlation with incidence of clinical mastitis in second-crop daughters (measured as mean daughter deviations). Selection criteria were predicted based on both complete and reduced first-crop daughter groups (261 or 65 daughters per sire, respectively). For complete daughter groups, predicted transmitting abilities at around 30 d in milk showed the best predictive ability for incidence of clinical mastitis, closely followed by average predicted transmitting abilities over the entire lactation. Both of these criteria were derived from the random regression model. These selection criteria improved accuracy of selection by approximately 2% relative to a cross-sectional model. However, for reduced daughter groups, the cross-sectional model yielded increased predictive ability compared with the selection criteria based on the random regression model. This result may be explained by the cross-sectional model being more robust, i.e., less sensitive to precision of (co)variance components estimates and effects of data structure.

  17. Hydrogenation and interesterification effects on the oxidative stability and melting point of soybean oil.

    PubMed

    Daniels, Roger L; Kim, Hyun Jung; Min, David B

    2006-08-09

    Soybean oil with an iodine value of 136 was hydrogenated to have iodine values of 126 and 117. The soybean oils with iodine values of 136, 126, and 117 were randomly interesterified using sodium methoxide. The oxidative stabilities of the hydrogenated and/or interesterified soybean oils were evaluated by measuring the headspace oxygen content by gas chromatography, and the induction time was measured using Rancimat. The melting points of the oils were evaluated by differential scanning calorimetry. Duncan's multiple range test of the headspace oxygen and induction time showed that hydrogenation increased the headspace oxygen content and induction time at alpha = 0.05. Interesterification decreased the headspace oxygen and the induction time for the soybean oils with iodine values of 136, 126, and 117 at alpha = 0.05. Hydrogenation increased the melting points as the iodine value decreased from 136 and 126 to 117 at alpha = 0.05. The random interesterification increased the melting points of soybean oils with iodine values of 136, 126, and 117 at alpha = 0.05. The combined effects of hydrogenation and interesterification increased the oxidative stability of soybean oil at alpha = 0.05 and the melting point at alpha = 0.01. The optimum combination of hydrogenation and random interesterification can improve the oxidative stability and increase the melting point to expand the application of soybean oil in foods.

  18. Chinese Massage Combined with Herbal Ointment for Athletes with Nonspecific Low Back Pain: A Randomized Controlled Trial

    PubMed Central

    Kong, Ling Jun; Fang, Min; Zhan, Hong Sheng; Yuan, Wei An; Tao, Ji Ming; Qi, Gao Wei; Cheng, Ying Wu

    2012-01-01

    Non-specific low back pain (NLBP) is an increasing health problem for athletes. This randomized controlled trial was designed to investigate the effects of Chinese massage combined with herbal ointment for NLBP. 110 athletes with NLBP were randomly assigned to an experimental group receiving Chinese massage combined with herbal ointment or a control group receiving simple massage therapy. The primary outcome was pain assessed by the Chinese Short Form McGill Pain Questionnaire (C-SFMPQ). The secondary outcome was local muscle stiffness measured by Myotonometer. After 4 weeks, the experimental group experienced significant improvements in C-SFMPQ and in local muscle stiffness compared with the control group (between-group difference in mean change from baseline: −1.24 points, P = 0.005 in sensory scores; −3.14 points, P < 0.001 in affective scores; −4.39 points, P < 0.001 in total scores; −0.64 points, P = 0.002 in VAS; −1.04 points, P = 0.005 in local muscle stiffness during the relaxation state). The difference remained at the one-month follow-up, but at the three-month follow-up it was significant only in affective scores (−2.83 points, P < 0.001). No adverse events were observed. These findings suggest that Chinese massage combined with herbal ointment may be a beneficial complementary and alternative therapy for athletes with NLBP. PMID:23258996

  19. An analytical approach to gravitational lensing by an ensemble of axisymmetric lenses

    NASA Technical Reports Server (NTRS)

    Lee, Man Hoi; Spergel, David N.

    1990-01-01

    The problem of gravitational lensing by an ensemble of identical axisymmetric lenses randomly distributed on a single lens plane is considered and a formal expression is derived for the joint probability density of finding shear and convergence at a random point on the plane. The amplification probability for a source can be accurately estimated from the distribution in shear and convergence. This method is applied to two cases: lensing by an ensemble of point masses and by an ensemble of objects with Gaussian surface mass density. There is no convergence for point masses whereas shear is negligible for wide Gaussian lenses.

  20. The Effects of Total Physical Response by Storytelling and the Traditional Teaching Styles of a Foreign Language in a Selected High School

    ERIC Educational Resources Information Center

    Kariuki, Patrick N. K.; Bush, Elizabeth Danielle

    2008-01-01

    The purpose of this study was to examine the effects of Total Physical Response by Storytelling and the traditional teaching method on a foreign language in a selected high school. The sample consisted of 30 students who were randomly selected and randomly assigned to experimental and control group. The experimental group was taught using Total…

  1. Most Undirected Random Graphs Are Amplifiers of Selection for Birth-Death Dynamics, but Suppressors of Selection for Death-Birth Dynamics.

    PubMed

    Hindersin, Laura; Traulsen, Arne

    2015-11-01

    We analyze evolutionary dynamics on graphs, where the nodes represent individuals of a population. The links of a node describe which other individuals can be displaced by the offspring of the individual on that node. Amplifiers of selection are graphs for which the fixation probability is increased for advantageous mutants and decreased for disadvantageous mutants. A few examples of such amplifiers have been developed, but so far it is unclear how many such structures exist and how to construct them. Here, we show that almost any undirected random graph is an amplifier of selection for Birth-death updating, where an individual is selected to reproduce with probability proportional to its fitness and one of its neighbors is replaced by that offspring at random. If we instead focus on death-Birth updating, in which a random individual is removed and its neighbors compete for the empty spot, then the same ensemble of graphs consists of almost only suppressors of selection for which the fixation probability is decreased for advantageous mutants and increased for disadvantageous mutants. Thus, the impact of population structure on evolutionary dynamics is a subtle issue that will depend on seemingly minor details of the underlying evolutionary process.
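
    The Birth-death update rule is simple to simulate directly. The sketch below (assuming the networkx library; the fitness value, graph size, and run count are arbitrary choices) estimates the fixation probability of a single advantageous mutant on a random graph and compares it against the well-mixed Moran baseline, which an amplifier should exceed:

        import random
        import networkx as nx

        def fixation_fraction(G, r=1.1, runs=1000):
            """Fraction of runs in which one random mutant of fitness r fixes
            under Birth-death updating."""
            nodes = list(G.nodes)
            fixed = 0
            for _ in range(runs):
                mutants = {random.choice(nodes)}
                while 0 < len(mutants) < len(nodes):
                    # Birth: choose a reproducer proportional to fitness.
                    weights = [r if v in mutants else 1.0 for v in nodes]
                    parent = random.choices(nodes, weights=weights)[0]
                    # death: a uniformly random neighbour gets the offspring.
                    child = random.choice(list(G.neighbors(parent)))
                    if parent in mutants:
                        mutants.add(child)
                    else:
                        mutants.discard(child)
                fixed += len(mutants) == len(nodes)
            return fixed / runs

        random.seed(1)
        G = nx.gnp_random_graph(20, 0.3, seed=1)
        while not nx.is_connected(G):              # every node needs a neighbour
            G = nx.gnp_random_graph(20, 0.3)
        r, n = 1.1, G.number_of_nodes()
        moran = (1 - 1 / r) / (1 - r ** -n)        # complete-graph baseline
        print(f"random graph: {fixation_fraction(G, r):.3f}  vs  Moran: {moran:.3f}")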

  2. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth and exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped, and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households. This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
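
    A minimal sketch of the selection step, assuming the digitized homes are available as (id, lat, lon) records; the coordinates below are invented stand-ins for the 537 mapped homes, and the GPX file written at the end stands in for loading waypoints onto the handheld GPS units.

        import random

        random.seed(2012)

        # Hypothetical stand-in for the homes digitized in Google Earth/ArcMap.
        homes = [{"id": f"home_{i:03d}",
                  "lat": 18.84 + random.uniform(-0.04, 0.04),
                  "lon": -72.51 + random.uniform(-0.04, 0.04)}
                 for i in range(537)]

        subset = random.sample(homes, 96)          # the study drew 96 of 537

        # Write GPX waypoints for transfer to a handheld GPS unit.
        with open("survey_waypoints.gpx", "w") as f:
            f.write('<gpx version="1.1" creator="sketch">\n')
            for h in subset:
                f.write(f'  <wpt lat="{h["lat"]:.6f}" lon="{h["lon"]:.6f}">'
                        f'<name>{h["id"]}</name></wpt>\n')
            f.write('</gpx>\n')
        print("wrote", len(subset), "waypoints")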

  3. Toward an evolutionary-predictive foundation for creativity: Commentary on "Human creativity, evolutionary algorithms, and predictive representations: The mechanics of thought trials" by Arne Dietrich and Hilde Haider, 2014 (Accepted pending minor revisions for publication in Psychonomic Bulletin & Review).

    PubMed

    Gabora, Liane; Kauffman, Stuart

    2016-04-01

    Dietrich and Haider (Psychonomic Bulletin & Review, 21 (5), 897-915, 2014) justify their integrative framework for creativity founded on evolutionary theory and prediction research on the grounds that "theories and approaches guiding empirical research on creativity have not been supported by the neuroimaging evidence." Although this justification is controversial, the general direction holds promise. This commentary clarifies points of disagreement and unresolved issues, and addresses mis-applications of evolutionary theory that lead the authors to adopt a Darwinian (versus Lamarckian) approach. To say that creativity is Darwinian is not to say that it consists of variation plus selection - in the everyday sense of the term - as the authors imply; it is to say that evolution is occurring because selection is affecting the distribution of randomly generated heritable variation across generations. In creative thought the distribution of variants is not key, i.e., one is not inclined toward idea A because 60 % of one's candidate ideas are variants of A while only 40 % are variants of B; one is inclined toward whichever seems best. The authors concede that creative variation is partly directed; however, the greater the extent to which variants are generated non-randomly, the greater the extent to which the distribution of variants can reflect not selection but the initial generation bias. Since each thought in a creative process can alter the selective criteria against which the next is evaluated, there is no demarcation into generations as assumed in a Darwinian model. We address the authors' claim that reduced variability and individuality are more characteristic of Lamarckism than Darwinian evolution, and note that a Lamarckian approach to creativity has addressed the challenge of modeling the emergent features associated with insight.

  4. Electric Propulsion Pointing Mechanism for BepiColombo

    NASA Astrophysics Data System (ADS)

    Janu, Paul; Neugebauer, Christian; Schermann, Rudolf; Supper, Ludwig

    2013-09-01

    For 17 years, the development of Electric Propulsion Pointing Mechanisms for commercial and scientific satellite applications has been a key product activity for RUAG Space in Vienna. This paper presents the Electric Propulsion Pointing Mechanism for the ESA BepiColombo mission, one of the most innovative EP mechanisms presently under development in Vienna. RUAG Space delivers the mechanism assembly, consisting of the mechanisms and the control electronics. The design-driving requirements are:

    - The pointing capability around the stowed configuration under resistive torque from the thruster supply harness, the thruster supply piping, and the mechanism harness. This pointing capability is realized via a central release nut together with a spring-loaded knuckle-lever system, which in essence forms a "frangible pipe" that is stiff during launch and collapses upon release. The resistive torques are minimized by a helical arrangement of the supply pipes and of the mechanism harness, and a guided low-stiffness routing of the thruster supply harness. A high-detent-torque actuator is used to maintain the pointing direction in the unpowered condition. The direct measurement of the torque on the actuator shaft during random vibration is also presented in the paper.

    - The specified maximum input loads to the thruster. The mechanism has not only to point the thruster but also to protect it against high launch loads. A very low eigenfrequency of the mechanism/thruster subassembly, around 65 Hz, was selected to minimize coupling with the thruster's modes and thus minimize the load input to the thruster. An elastomer damping system is implemented which minimizes amplification in this frequency range, so that the sine input can be sustained by the mechanism and the thruster. The measured amplification of 3.1 turned out to successfully protect the thruster from the launch vibrations.

    - The thermal load on the mechanism from the dissipation of the thruster and from the solar radiation. A staged temperature zone concept was selected, separating different temperature zones and keeping the thermally sensitive elements in their operating temperature ranges.

    This paper outlines the design solutions for these design-driving requirements, presents the test results, and compares the predictions with the values measured in the qualification tests. It also points out the lessons learnt during this development process.

  5. On the interpolation of volumetric water content in research catchments

    NASA Astrophysics Data System (ADS)

    Dlamini, Phesheya; Chaplot, Vincent

    Digital Soil Mapping (DSM) is widely used in the environmental sciences because of its accuracy and efficiency in producing soil maps compared to traditional soil mapping. Numerous studies have investigated how the sampling density and the interpolation of data points affect prediction quality. While the interpolation process is straightforward for primary attributes such as soil gravimetric water content (θg) and soil bulk density (ρb), the DSM of volumetric water content (θv), the product of θg and ρb, may either involve direct interpolation of θv (approach 1) or independent interpolation of ρb and θg data points and subsequent multiplication of the ρb and θg maps (approach 2). The main objective of this study was to compare the accuracy of these two mapping approaches for θv. A 23 ha grassland catchment in KwaZulu-Natal, South Africa, was selected for this study. A total of 317 data points were randomly selected and sampled during the dry season in the topsoil (0-0.05 m) for estimation of θg and ρb. Data points were interpolated following approaches 1 and 2, using inverse distance weighting with 3 or 12 neighboring points (IDW3; IDW12), regular spline with tension (RST) and ordinary kriging (OK). Based on an independent validation set of 70 data points, OK was the best interpolator for ρb (mean absolute error, MAE, of 0.081 g cm⁻³), while θg was best estimated using IDW12 (MAE = 1.697%) and θv by IDW3 (MAE = 1.814%). Approach 1 was found to underestimate θv. Approach 2 tended to overestimate θv, but reduced the prediction bias by an average of 37% and improved the prediction accuracy by only 1.3% compared to approach 1. Such a benefit of approach 2 (i.e., the subsequent multiplication of interpolated maps of primary variables) was unexpected, considering that a higher sampling density (~14 data points ha⁻¹ in the present study) tends to minimize the differences between interpolation techniques and approaches. In the context of much lower sampling densities, as generally encountered in environmental studies, one can thus expect approach 2 to yield significantly greater accuracy than approach 1. Approach 2 seems promising and can be further tested for DSM of other secondary variables.
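
    The two approaches differ only in when the multiplication happens, as the numpy sketch below illustrates for inverse distance weighting with 3 neighbours (IDW3); the arrays are random placeholders for the catchment data.

        import numpy as np

        rng = np.random.default_rng(0)
        xy = rng.random((317, 2))              # sampled locations
        rho_b = rng.normal(1.1, 0.1, 317)      # bulk density, g cm^-3
        theta_g = rng.normal(15.0, 2.0, 317)   # gravimetric water content, %
        theta_v = rho_b * theta_g              # volumetric water content

        def idw(xy_obs, z_obs, xy_new, k=3, power=2.0):
            """IDW estimate at xy_new from the k nearest observations."""
            d = np.linalg.norm(xy_obs - xy_new, axis=1)
            nearest = np.argsort(d)[:k]
            w = 1.0 / np.maximum(d[nearest], 1e-9) ** power
            return np.sum(w * z_obs[nearest]) / np.sum(w)

        target = np.array([0.5, 0.5])
        approach1 = idw(xy, theta_v, target)   # interpolate the product directly
        approach2 = idw(xy, rho_b, target) * idw(xy, theta_g, target)  # multiply maps
        print(f"approach 1: {approach1:.2f}, approach 2: {approach2:.2f}")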

  6. Financial incentives increase fruit and vegetable intake among Supplemental Nutrition Assistance Program participants: a randomized controlled trial of the USDA Healthy Incentives Pilot.

    PubMed

    Olsho, Lauren Ew; Klerman, Jacob A; Wilde, Parke E; Bartlett, Susan

    2016-08-01

    US fruit and vegetable (FV) intake remains below recommendations, particularly for low-income populations. Evidence on effectiveness of rebates in addressing this shortfall is limited. This study evaluated the USDA Healthy Incentives Pilot (HIP), which offered rebates to Supplemental Nutrition Assistance Program (SNAP) participants for purchasing targeted FVs (TFVs). As part of a randomized controlled trial in Hampden County, Massachusetts, 7500 randomly selected SNAP households received a 30% rebate on TFVs purchased with SNAP benefits. The remaining 47,595 SNAP households in the county received usual benefits. Adults in 5076 HIP and non-HIP households were randomly sampled for telephone surveys, including 24-h dietary recall interviews. Surveys were conducted at baseline (1-3 mo before implementation) and in 2 follow-up rounds (4-6 mo and 9-11 mo after implementation). 2784 adults (1388 HIP, 1396 non-HIP) completed baseline interviews; data were analyzed for 2009 adults (72%) who also completed ≥1 follow-up interview. Regression-adjusted mean TFV intake at follow-up was 0.24 cup-equivalents/d (95% CI: 0.13, 0.34 cup-equivalents/d) higher among HIP participants. Across all fruit and vegetables (AFVs), regression-adjusted mean intake was 0.32 cup-equivalents/d (95% CI: 0.17, 0.48 cup-equivalents/d) higher among HIP participants. The AFV-TFV difference was explained by greater intake of 100% fruit juice (0.10 cup-equivalents/d; 95% CI: 0.02, 0.17 cup-equivalents/d); juice purchases did not earn the HIP rebate. Refined grain intake was 0.43 ounce-equivalents/d lower (95% CI: -0.69, -0.16 ounce-equivalents/d) among HIP participants, possibly indicating substitution effects. Increased AFV intake and decreased refined grain intake contributed to higher Healthy Eating Index-2010 scores among HIP participants (4.7 points; 95% CI: 2.4, 7.1 points). The HIP significantly increased FV intake among SNAP participants, closing ∼20% of the gap relative to recommendations and increasing dietary quality. More research on mechanisms of action is warranted. The HIP trial was registered at clinicaltrials.gov as NCT02651064. © 2016 American Society for Nutrition.

  7. Selection of floating-point or fixed-point for adaptive noise canceller in somatosensory evoked potential measurement.

    PubMed

    Shen, Chongfei; Liu, Hongtao; Xie, Xb; Luk, Keith Dk; Hu, Yong

    2007-01-01

    An adaptive noise canceller (ANC) has been used to improve the signal-to-noise ratio (SNR) of somatosensory evoked potentials (SEP). For efficient application of the ANC in a hardware system, a fixed-point ANC allows fast, cost-efficient construction and low power consumption in an FPGA design. However, it remains questionable whether the SNR improvement achieved by the fixed-point algorithm is as good as that of the floating-point algorithm. This study compares the outputs of floating-point and fixed-point ANC algorithms applied to SEP signals. The selection of the step-size parameter (μ) was found to differ between the fixed-point and floating-point algorithms. In this simulation study, the outputs of the fixed-point ANC showed higher distortion from the real SEP signals than those of the floating-point ANC. However, the difference decreased with increasing μ. With an optimal selection of μ, the fixed-point ANC can achieve results as good as the floating-point algorithm.
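
    The comparison can be emulated by running one LMS-based noise canceller twice: once with floating-point weights and once with weights rounded to a fixed-point grid after every update. The synthetic signals and the 16-bit grid below are assumptions for illustration, not the authors' setup.

        import numpy as np

        rng = np.random.default_rng(0)
        n, taps, mu = 5000, 8, 0.01
        noise = rng.normal(0.0, 1.0, n)                        # reference noise input
        signal = np.sin(2 * np.pi * np.arange(n) / 50.0)       # stand-in "SEP" component
        primary = signal + np.convolve(noise, [0.6, 0.3, 0.1], mode="same")

        def lms_anc(primary, reference, quantize=None):
            """LMS noise canceller; optionally quantize the weights each step."""
            w = np.zeros(taps)
            out = np.zeros(len(primary))
            for i in range(taps, len(primary)):
                x = reference[i - taps:i][::-1]
                e = primary[i] - w @ x                         # error = cleaned output
                w = w + mu * e * x
                if quantize:                                   # emulate fixed-point weights
                    w = np.round(w * quantize) / quantize
                out[i] = e
            return out

        float_out = lms_anc(primary, noise)
        fixed_out = lms_anc(primary, noise, quantize=2 ** 15)  # 16-bit grid
        print("max float/fixed output difference:", np.max(np.abs(float_out - fixed_out)))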

  8. The role of color and attention-to-color in mirror-symmetry perception.

    PubMed

    Gheorghiu, Elena; Kingdom, Frederick A A; Remkes, Aaron; Li, Hyung-Chul O; Rainville, Stéphane

    2016-07-11

    The role of color in the visual perception of mirror-symmetry is controversial. Some reports support the existence of color-selective mirror-symmetry channels, others that mirror-symmetry perception is merely sensitive to color-correlations across the symmetry axis. Here we test between the two ideas. Stimuli consisted of colored Gaussian-blobs arranged either mirror-symmetrically or quasi-randomly. We used four arrangements: (1) 'segregated' - symmetric blobs were of one color, random blobs of the other color(s); (2) 'random-segregated' - as above but with the symmetric color randomly selected on each trial; (3) 'non-segregated' - symmetric blobs were of all colors in equal proportions, as were the random blobs; (4) 'anti-symmetric' - symmetric blobs were of opposite-color across the symmetry axis. We found: (a) near-chance levels for the anti-symmetric condition, suggesting that symmetry perception is sensitive to color-correlations across the symmetry axis; (b) similar performance for random-segregated and non-segregated conditions, giving no support to the idea that mirror-symmetry is color selective; (c) highest performance for the color-segregated condition, but only when the observer knew beforehand the symmetry color, suggesting that symmetry detection benefits from color-based attention. We conclude that mirror-symmetry detection mechanisms, while sensitive to color-correlations across the symmetry axis and subject to the benefits of attention-to-color, are not color selective.

  9. The role of color and attention-to-color in mirror-symmetry perception

    PubMed Central

    Gheorghiu, Elena; Kingdom, Frederick A. A.; Remkes, Aaron; Li, Hyung-Chul O.; Rainville, Stéphane

    2016-01-01

    The role of color in the visual perception of mirror-symmetry is controversial. Some reports support the existence of color-selective mirror-symmetry channels, others that mirror-symmetry perception is merely sensitive to color-correlations across the symmetry axis. Here we test between the two ideas. Stimuli consisted of colored Gaussian-blobs arranged either mirror-symmetrically or quasi-randomly. We used four arrangements: (1) ‘segregated’ – symmetric blobs were of one color, random blobs of the other color(s); (2) ‘random-segregated’ – as above but with the symmetric color randomly selected on each trial; (3) ‘non-segregated’ – symmetric blobs were of all colors in equal proportions, as were the random blobs; (4) ‘anti-symmetric’ – symmetric blobs were of opposite-color across the symmetry axis. We found: (a) near-chance levels for the anti-symmetric condition, suggesting that symmetry perception is sensitive to color-correlations across the symmetry axis; (b) similar performance for random-segregated and non-segregated conditions, giving no support to the idea that mirror-symmetry is color selective; (c) highest performance for the color-segregated condition, but only when the observer knew beforehand the symmetry color, suggesting that symmetry detection benefits from color-based attention. We conclude that mirror-symmetry detection mechanisms, while sensitive to color-correlations across the symmetry axis and subject to the benefits of attention-to-color, are not color selective. PMID:27404804

  10. Decompressive Surgery for the Treatment of Malignant Infarction of the Middle Cerebral Artery (DESTINY): a randomized, controlled trial.

    PubMed

    Jüttler, Eric; Schwab, Stefan; Schmiedek, Peter; Unterberg, Andreas; Hennerici, Michael; Woitzik, Johannes; Witte, Steffen; Jenetzky, Ekkehart; Hacke, Werner

    2007-09-01

    Decompressive surgery (hemicraniectomy) for life-threatening massive cerebral infarction represents a controversial issue in neurocritical care medicine. We report here the 30-day mortality and 6- and 12-month functional outcomes from the DESTINY trial. DESTINY (ISRCTN01258591) is a prospective, multicenter, randomized, controlled, clinical trial based on a sequential design that used mortality after 30 days as the first end point. When this end point was reached, patient enrollment was interrupted as per protocol until recalculation of the projected sample size was performed on the basis of the 6-month outcome (primary end point=modified Rankin Scale score, dichotomized to 0 to 3 versus 4 to 6). All analyses were based on intention to treat. A statistically significant reduction in mortality was reached after 32 patients had been included: 15 of 17 (88%) patients randomized to hemicraniectomy versus 7 of 15 (47%) patients randomized to conservative therapy survived after 30 days (P=0.02). After 6 and 12 months, 47% of patients in the surgical arm versus 27% of patients in the conservative treatment arm had a modified Rankin Scale score of 0 to 3 (P=0.23). DESTINY showed that hemicraniectomy reduces mortality in large hemispheric stroke. With 32 patients included, the primary end point failed to demonstrate statistical superiority of hemicraniectomy, and the projected sample size was calculated to 188 patients. Despite this failure to meet the primary end point, the steering committee decided to terminate the trial in light of the results of the joint analysis of the 3 European hemicraniectomy trials.

  11. Ensemble Feature Learning of Genomic Data Using Support Vector Machine

    PubMed Central

    Anaissi, Ali; Goyal, Madhu; Catchpoole, Daniel R.; Braytee, Ali; Kennedy, Paul J.

    2016-01-01

    The identification of a subset of genes having the ability to capture the necessary information to distinguish classes of patients is crucial in bioinformatics applications. Ensemble and bagging methods have been shown to work effectively in the process of gene selection and classification. Testament to that is random forest, which combines random decision trees with bagging to improve overall feature selection and classification accuracy. Surprisingly, the adoption of these methods in support vector machines has only recently received attention, and mostly for classification, not gene selection. This paper introduces an ensemble SVM-Recursive Feature Elimination (ESVM-RFE) method for gene selection that follows the concepts of ensemble and bagging used in random forest but adopts the backward elimination strategy that is the rationale of the RFE algorithm. The rationale is that building ensemble SVM models from randomly drawn bootstrap samples of the training set will produce different feature rankings, which are subsequently aggregated into one feature ranking. As a result, the decision to eliminate features is based upon the ranking of multiple SVM models instead of one particular model. Moreover, this approach addresses the problem of imbalanced datasets by constructing nearly balanced bootstrap samples. Our experiments show that ESVM-RFE for gene selection substantially increased the classification performance on five microarray datasets compared to state-of-the-art methods. Experiments on the childhood leukaemia dataset show that an average 9% better accuracy is achieved by ESVM-RFE over SVM-RFE, and 5% over a random forest-based approach. The genes selected by the ESVM-RFE algorithm were further explored with Singular Value Decomposition (SVD), which reveals significant clusters within the selected data. PMID:27304923
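
    A toy reconstruction of the ensemble idea, assuming scikit-learn: SVM-RFE is run on repeated bootstrap samples and the per-model rankings are averaged into a single ranking (the class-balancing of the bootstrap samples described above is omitted here).

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import RFE
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X, y = make_classification(n_samples=120, n_features=50, n_informative=5,
                                   random_state=0)

        rankings = []
        for _ in range(25):                          # 25 bootstrap SVM-RFE models
            idx = rng.integers(0, len(y), len(y))    # bootstrap sample, with replacement
            rfe = RFE(LinearSVC(dual=False), n_features_to_select=1)
            rfe.fit(X[idx], y[idx])
            rankings.append(rfe.ranking_)            # rank 1 = eliminated last

        mean_rank = np.mean(rankings, axis=0)        # aggregate the rankings
        print("top 5 features:", np.argsort(mean_rank)[:5])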

  12. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  13. Middle Level Practices in European International and Department of Defense Schools.

    ERIC Educational Resources Information Center

    Waggoner, V. Christine; McEwin, C. Kenneth

    1993-01-01

    Discusses results of a 1989-90 survey of 70 randomly selected international schools and 70 randomly selected Department of Defense Schools in Europe. Programs and practices surveyed included enrollments, grade organization, curriculum and instructional plans, core subjects, grouping patterns, exploratory courses, advisory programs, and scheduling.…

  14. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  15. Dental fear and caries in 6-12 year old children in Greece. Determination of dental fear cut-off points.

    PubMed

    Boka, V; Arapostathis, K; Karagiannis, V; Kotsanos, N; van Loveren, C; Veerkamp, J

    2017-03-01

    To present normative data on dental fear and caries status, and the dental fear cut-off points, of young children in the city of Thessaloniki, Greece. Study Design: This is a cross-sectional study with two independent study groups. The first, representative, sample consisted of 1484 children from 15 primary public schools of Thessaloniki. The second sample consisted of 195 randomly selected age-matched children, all patients of the Postgraduate Paediatric Dental Clinic of Aristotle University of Thessaloniki. First sample: in order to collect data on dental fear and caries, dental examination took place in the classroom with disposable mirrors and a penlight. All the children completed the Dental Subscale of the Children's Fear Survey Schedule (CFSS-DS). Second sample: in order to define the cut-off points of the CFSS-DS, dental treatment of the 195 children was performed at the University Clinic. Children's dental fear was assessed using the CFSS-DS and their behaviour during dental treatment was observed by one calibrated examiner using the Venham scale. Statistical analysis of the data was performed with IBM SPSS Statistics 20 at a statistical significance level of <0.05. First sample: the mean CFSS-DS score was 27.1±10.8. Age was significantly (p<0.05) related to dental fear. Mean differences between boys and girls were not significant. Caries was not correlated with dental fear. Second sample: CFSS-DS scores < 33 were defined as 'no dental fear', scores 33-37 as 'borderline' and scores > 37 as 'dental fear'. In the first sample, 84.6% of the children did not suffer from dental fear (CFSS-DS < 33). Dental fear was correlated with age and not with caries or gender. The dental fear cut-off point for the CFSS-DS was estimated at 37 for 6-12 year old children (with scores of 33-37 borderline).

  16. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex.

    PubMed

    Lindsay, Grace W; Rigotti, Mattia; Warden, Melissa R; Miller, Earl K; Fusi, Stefano

    2017-11-08

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear "mixed" selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli-and in particular, to combinations of stimuli ("mixed selectivity")-is a topic of interest. Even though models with random feedforward connectivity are capable of creating computationally relevant mixed selectivity, such a model does not match the levels of mixed selectivity seen in the data analyzed in this study. Adding simple Hebbian learning to the model increases mixed selectivity to the correct level and makes the model match the data on several other relevant measures. This study thus offers predictions on how mixed selectivity and other properties evolve with training. Copyright © 2017 the authors 0270-6474/17/3711021-16$15.00/0.
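
    A minimal sketch of the circuit ingredients only (not the authors' model or analyses): a layer of rectified units with random feedforward weights, trained with a simple Hebbian update plus row normalization to keep the weights bounded, on inputs coding combinations of two task variables.

        import numpy as np

        rng = np.random.default_rng(0)
        n_in, n_out, eta = 40, 100, 0.05
        W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))  # random connectivity

        def response(x, W):
            return np.maximum(W @ x, 0.0)                        # rectified responses

        # Four task conditions: combinations of two binary variables, one-hot coded.
        a0, a1 = np.eye(20)[0], np.eye(20)[1]
        conditions = [np.concatenate([a, b]) for a in (a0, a1) for b in (a0, a1)]

        for _ in range(100):                                     # Hebbian training
            x = conditions[rng.integers(4)]
            r = response(x, W)
            W = W + eta * np.outer(r, x)                         # Hebb: dW ~ eta * r x^T
            W = W / np.linalg.norm(W, axis=1, keepdims=True)     # bound the weights

        responses = np.array([response(c, W) for c in conditions])
        print("response matrix (conditions x cells):", responses.shape)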

  17. Sirenum Fossae Trough

    NASA Technical Reports Server (NTRS)

    2000-01-01

    [figure removed for brevity, see original site]

    The Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) orbits the red planet twelve times each day. The number of pictures that MOC can take varies from orbit to orbit, depending upon whether the data are being stored in MGS's onboard tape recorder for playback at a later time, or whether the data are being sent directly back to Earth via a real-time radio link. More data can be acquired during orbits with real-time downlink.

    During real-time orbits, the MOC team often will take a few random or semi-random pictures in between the carefully-selected, hand-targeted images. On rare occasions, one of these random pictures will surprise the MOC team. The picture shown here is an excellent example, because the high resolution view (top) is centered so nicely on a trough and an adjacent, shallow crater that it is as if someone very carefully selected the target for MOC. The high-resolution view covers an area only 1.1 km (0.7 mi) wide by 2.3 km (1.4 mi) long. Hitting a target such as this with such a small image is very difficult to do, on purpose, because there are small uncertainties in the predicted orbit, the maps used to select targets, and the minor adjustments of spacecraft pointing at any given moment. Nevertheless, a very impressive image was received.

    The high resolution view crosses one of the troughs of the Sirenum Fossae near 31.2°S, 152.3°W. The context image (above) was acquired at the same time as the high resolution view on July 23, 2000. The small white box shows the location of the high resolution picture. The lines running diagonally across the context image from upper right toward lower left are the Sirenum Fossae troughs, formed by faults that are radial to the volcanic region of Tharsis. Both pictures are illuminated from the upper left. The scene shows part of the martian southern hemisphere as it was nearing autumn.

  18. Automatic Recognition of Indoor Navigation Elements from Kinect Point Clouds

    NASA Astrophysics Data System (ADS)

    Zeng, L.; Kang, Z.

    2017-09-01

    This paper automatically recognizes the navigation elements defined by the IndoorGML data standard: door, stairway and wall. The data used are indoor 3D point clouds collected with the Kinect v2 by means of ORB-SLAM. Compared with lidar this is cheaper and more convenient, but the point clouds also suffer from noise, registration error and large data volume. Hence, we adopt a shape descriptor - the histogram of distances between two randomly chosen points, proposed by Osada - merged with other descriptors, in conjunction with a random forest classifier, to recognize the navigation elements (door, stairway and wall) from Kinect point clouds. This research acquires navigation elements and their 3-D location information from each data frame through segmentation of point clouds, boundary extraction, feature calculation and classification. Finally, this paper utilizes the acquired navigation elements and their information to automatically generate the state data of the indoor navigation module. The experimental results demonstrate a high recognition accuracy for the proposed method.
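
    A minimal sketch of the descriptor-plus-classifier stage, assuming scikit-learn: an Osada-style D2 descriptor (a histogram of distances between randomly chosen point pairs) is computed per segment and fed to a random forest. The synthetic "wall" and "door" point clouds are placeholders for real segmented Kinect data.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def d2_descriptor(cloud, n_pairs=2000, bins=32, max_dist=3.0):
            """Normalized histogram of distances between random point pairs."""
            i = rng.integers(0, len(cloud), n_pairs)
            j = rng.integers(0, len(cloud), n_pairs)
            d = np.linalg.norm(cloud[i] - cloud[j], axis=1)
            hist, _ = np.histogram(d, bins=bins, range=(0.0, max_dist), density=True)
            return hist

        def fake_segment(kind):
            """Placeholder clouds: flat slab for 'wall', tall narrow box for 'door'."""
            size = [2.0, 2.0, 0.05] if kind == "wall" else [0.9, 0.05, 2.0]
            return rng.random((500, 3)) * size

        X = np.array([d2_descriptor(fake_segment(k)) for k in ["wall", "door"] * 50])
        y = np.array([0, 1] * 50)

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))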

  19. Validation of psoriatic arthritis diagnoses in electronic medical records using natural language processing

    PubMed Central

    Cai, Tianxi; Karlson, Elizabeth W.

    2013-01-01

    Objectives To test whether data extracted from full text patient visit notes from an electronic medical record (EMR) would improve the classification of psoriatic arthritis (PsA) compared to an algorithm based on codified data. Methods From the > 1,350,000 adults in a large academic EMR, all 2318 patients with a billing code for PsA were extracted and 550 were randomly selected for chart review and algorithm training. Using codified data and phrases extracted from narrative data using natural language processing (NLP), 31 predictors were extracted and three random forest algorithms were trained using coded, narrative, and combined predictors. The receiver operating characteristic (ROC) curve was used to identify the optimal algorithm, and a cut point was chosen to achieve the maximum sensitivity possible at a 90% positive predictive value (PPV). The algorithm was then used to classify the remaining 1768 charts and finally validated in a random sample of 300 cases predicted to have PsA. Results The PPV of a single PsA code was 57% (95%CI 55%–58%). Using a combination of coded data and NLP, the random forest algorithm reached a PPV of 90% (95%CI 86%–93%) at a sensitivity of 87% (95%CI 83%–91%) in the training data. The PPV was 93% (95%CI 89%–96%) in the validation set. Adding NLP predictors to codified data increased the area under the ROC curve (p < 0.001). Conclusions Using NLP with text notes from electronic medical records improved the performance of the prediction algorithm significantly. Random forests were a useful tool to accurately classify psoriatic arthritis cases to enable epidemiological research. PMID:20701955
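
    The cut-point selection described above can be sketched as follows. This Python fragment is an assumed illustration, not the authors' code: the synthetic features and labels stand in for the 31 coded/NLP predictors and the chart-review gold standard, and out-of-bag probabilities are an assumed stand-in for however training-set performance was estimated. It trains a random forest and then picks the probability threshold that maximizes sensitivity subject to precision (PPV) of at least 90%.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import precision_recall_curve

        rng = np.random.default_rng(0)
        n, p = 550, 31                              # 550 reviewed charts, 31 predictors
        X = rng.normal(size=(n, p))                 # stand-in for coded + NLP features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)  # stand-in labels

        clf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                     random_state=0).fit(X, y)
        scores = clf.oob_decision_function_[:, 1]   # out-of-bag probabilities reduce optimism

        precision, recall, thresholds = precision_recall_curve(y, scores)
        ok = precision[:-1] >= 0.90                 # thresholds is one shorter than precision
        if ok.any():
            # among thresholds meeting the PPV floor, take the one with max sensitivity
            best = int(np.argmax(np.where(ok, recall[:-1], -1.0)))
            print(f"cut point {thresholds[best]:.3f}: "
                  f"PPV {precision[best]:.2f}, sensitivity {recall[best]:.2f}")
        else:
            print("no threshold reaches 90% PPV on these stand-in data")

    The same threshold would then be frozen and applied to the remaining unreviewed charts, with the held-out validation sample used to confirm that the PPV holds up outside the training data.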

  20. Clinical trials in crisis: four simple methodologic fixes

    PubMed Central

    Vickers, Andrew J.

    2014-01-01

    There is growing consensus that the US clinical trials system is broken, with trial costs and complexity increasing exponentially and many trials failing to accrue. Yet concerns about the expense and failure rate of randomized trials are only the tip of the iceberg; perhaps what should worry us most is the number of trials that are never even considered because of projected costs and poor accrual. Several initiatives, including the Clinical Trials Transformation Initiative and the "Sensible Guidelines Group," seek to push back against current trends in clinical trials, arguing that all aspects of trials (including design, approval, conduct, monitoring, analysis, and dissemination) should be based on evidence rather than contemporary norms. Proposed here are four methodologic fixes for current clinical trials. The first two aim to simplify trials, reducing costs and increasing patient acceptability: first, by dramatically reducing eligibility criteria, often to the single criterion that the consenting physician is uncertain which of the two randomized arms is optimal; and second, by clinical integration, that is, investment in data infrastructure to bring routinely collected data up to research grade for use as endpoints in trials. The second two methodologic fixes aim to remove barriers to accrual, either by cluster randomization of clinicians (in the case of modifications to existing treatment) or by early consent, where patients are offered the chance of being randomly selected to be offered a novel intervention if disease progresses at a subsequent point. Such solutions may be partial, or may result in a new set of problems of their own. Yet the current crisis in clinical trials mandates innovative approaches: randomized trials have resulted in enormous benefits for patients, and we need to ensure that they continue to do so. PMID:25278228
