Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, Martijn W; van Buuren, Stef; Knol, Dirk L; van Mechelen, Willem; de Vet, Henrica CW
2007-01-01
Background Missing data are a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and thereby allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection. Method In our prospective cohort study we merged data from three different randomized controlled trials (RCTs) to assess prognostic variables for chronicity of low back pain. Among the outcome and prognostic variables, the proportion of missing data ranged from 0 to 48.1%. We used four methods to investigate the influence of sampling and imputation variation, respectively: MI only, bootstrap only, and two methods that combine MI and bootstrapping. Variables were selected based on the inclusion frequency of each prognostic variable, i.e. the proportion of times that the variable appeared in the model. The discriminative and calibrative abilities of prognostic models developed by the four methods were assessed at different inclusion levels. Results We found that the effect of imputation variation on the inclusion frequency was larger than the effect of sampling variation. When MI and bootstrapping were combined, at inclusion levels ranging from 0% (full model) to 90%, bootstrap-corrected c-index values of 0.70 to 0.71 and slope values of 0.64 to 0.86 were found. Conclusion We recommend accounting for both imputation and sampling variation in data sets with missing values. The new procedure of combining MI with bootstrapping for variable selection results in multivariable prognostic models with good performance and is therefore attractive to apply to data sets with missing values. PMID:17629912
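A minimal sketch of the combined MI-plus-bootstrap selection idea described above, assuming scikit-learn's IterativeImputer as a stand-in for MICE-style imputation and an L1-penalized logistic model as the selector (the study's actual models and data differ); the inclusion frequency is the fraction of bootstrap/imputation rounds in which each variable enters the model:

```python
# Sketch: resample rows (sampling variation), impute each resample several
# times (imputation variation), fit a sparse model, and tally how often each
# variable is selected. Data, imputer, and selector are illustrative stand-ins.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
X[rng.random(X.shape) < 0.2] = np.nan            # ~20% of values missing

B, M = 20, 3                                     # bootstrap draws, imputations
counts = np.zeros(p)
for b in range(B):
    idx = rng.integers(0, n, n)                  # bootstrap resample of rows
    Xb, yb = X[idx], y[idx]
    for m in range(M):
        imp = IterativeImputer(sample_posterior=True, random_state=b * M + m)
        Xi = imp.fit_transform(Xb)
        fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(Xi, yb)
        counts += (fit.coef_.ravel() != 0)

inclusion = counts / (B * M)                     # inclusion frequency per variable
print(np.round(inclusion, 2))                    # select variables above a threshold
```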
ERIC Educational Resources Information Center
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
Comulada, W. Scott
2015-01-01
Stata's mi commands provide powerful tools for conducting multiple imputation in the presence of ignorable missing data. In this article, I present Stata code that extends the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, the mi commands are restricted to covariate selection; I show how to address model fit in order to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors; I show how standard errors can instead be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts of the number of times alcohol was consumed, in data with missing observations from a behavioral intervention. PMID:26973439
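The aggregation the article refers to follows Rubin's rules; a minimal numpy sketch of that pooling step (not Stata code, and independent of the mi implementation), with hypothetical per-imputation estimates:

```python
# Minimal sketch of Rubin's rules: pool an estimate and its variance across
# m imputed data sets. Input arrays here are made-up illustration values.
import numpy as np

def pool_rubin(estimates, variances):
    """estimates, variances: length-m arrays from m imputed data sets."""
    m = len(estimates)
    qbar = np.mean(estimates)              # pooled point estimate
    w = np.mean(variances)                 # within-imputation variance
    b = np.var(estimates, ddof=1)          # between-imputation variance
    t = w + (1 + 1 / m) * b                # total variance
    return qbar, np.sqrt(t)

est, se = pool_rubin(np.array([0.42, 0.45, 0.40, 0.47, 0.44]),
                     np.array([0.010, 0.011, 0.009, 0.012, 0.010]))
print(f"pooled estimate {est:.3f}, pooled SE {se:.3f}")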
Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng
2013-12-01
In this paper, a bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH4+-N) and dissolved oxygen (DO) in the Harbin region, northeast China. The Morlet wavelet basis function (WBF) was employed as the nonlinear activation function of a traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PIs) were constructed according to the uncertainties calculated from the model structure and from data noise. The performance of the BWNN model was also compared with four other models: a traditional ANN, a WNN, a bootstrapped ANN, and an autoregressive integrated moving average model. The results showed that the BWNN could handle the severely fluctuating and non-seasonal water quality time series, and it performed better than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH4+-N; conversely, the uncertainty from data noise was larger for the DO series. Total uncertainties in the low-flow period were the largest, due to complicated processes during the freeze-up period of the Songhua River. Further, a missing-data refilling scheme was designed, and BWNNs performed better for structural data missing (SD) than for incidental data missing (ID). For both ID and SD, the temporal method was satisfactory for filling the NH4+-N series, whereas spatial imputation was better suited to the DO series. This refill-and-forecast BWNN method was applied to other areas suffering "real" data missing, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers make informed decisions.
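A rough sketch of the generic bootstrap prediction-interval idea the paper builds on, using a plain scikit-learn multilayer perceptron instead of a wavelet network; the ensemble spread carries model-structure uncertainty and resampled residuals carry data noise. All data and settings below are synthetic stand-ins:

```python
# Generic bootstrap PIs for a regressor: fit an ensemble on resampled data,
# then widen the ensemble spread with resampled training residuals.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
x = np.linspace(0, 6, 120)[:, None]
y = np.sin(x).ravel() + rng.normal(0, 0.2, 120)

preds, resid = [], []
for b in range(30):
    idx = rng.integers(0, len(x), len(x))            # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=b)
    net.fit(x[idx], y[idx])
    preds.append(net.predict(x))
    resid.append(y[idx] - net.predict(x[idx]))       # data-noise estimate

preds = np.array(preds)
noise = np.concatenate(resid)
sims = preds + rng.choice(noise, preds.shape)        # model spread + noise
lo, hi = np.percentile(sims, [2.5, 97.5], axis=0)    # 95% prediction band
print(float(np.mean((y >= lo) & (y <= hi))))         # empirical coverage
```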
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
NASA Astrophysics Data System (ADS)
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTφ). A large bootstrap fraction (fBS ~ 80%) has been obtained at high βp. ITBs have been shown to be compatible with steady state operation. Because of the unusually large ITB radius, the normalized pressure is not limited to low βN values by internal ITB-driven modes; βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows that first-principles NEO is in good agreement with experiments for the bootstrap current calculation, and that ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport. Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC02-09CH11466, and the NMCFP of China under 2015GB110000 and 2015GB102000.
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
ERIC Educational Resources Information Center
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Effects of magnetic islands on bootstrap current in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, G.; Lin, Z.
2016-12-19
The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change of the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, a small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
Reduced ion bootstrap current drive on NTM instability
NASA Astrophysics Data System (ADS)
Qu, Hongpeng; Wang, Feng; Wang, Aike; Peng, Xiaodong; Li, Jiquan
2018-05-01
The loss of bootstrap current inside a magnetic island plays a dominant role in driving the neoclassical tearing mode (NTM) instability in tokamak plasmas. In this work, we investigate the finite-banana-width (FBW) effect on the profile of the ion bootstrap current in the island vicinity via an analytical approach. The results show that even if the pressure gradient vanishes inside the island, the ion bootstrap current can partly survive due to the FBW effect. The FBW effect is more efficient when the island width is smaller. Nevertheless, even when the island width is comparable to the ion FBW, the unperturbed ion bootstrap current inside the island cannot be largely recovered by the FBW effect, and thus the current loss still exists. This suggests that the FBW effect alone cannot dramatically reduce the ion bootstrap current drive of NTMs.
NASA Astrophysics Data System (ADS)
Monticello, D. A.; Reiman, A. H.; Watanabe, K. Y.; Nakajima, N.; Okamoto, M.
1997-11-01
The existence of bootstrap currents in both tokamaks and stellarators was confirmed experimentally more than ten years ago. Such currents can have significant effects on the equilibrium and stability of these MHD devices. In addition, stellarators, with the notable exception of W7-X, are predicted to have such large bootstrap currents that reliable equilibrium calculations require the self-consistent evaluation of the bootstrap current. Modeling of discharges which contain islands requires an algorithm that does not assume good surfaces. Only one of the two existing 3-D equilibrium codes, PIES (Reiman, A. H., Greenside, H. S., Comput. Phys. Commun. 43 (1986)), can easily be modified to handle bootstrap current. Here we report on the coupling of the PIES 3-D equilibrium code and the NIFS bootstrap code (Watanabe, K., et al., Nuclear Fusion 35 (1995) 335).
Test of bootstrap current models using high-βp EAST-demonstration plasmas on DIII-D
Ren, Qilong; Lao, Lang L.; Garofalo, Andrea M.; ...
2015-01-12
Magnetic measurements together with kinetic profile and motional Stark effect measurements are used in full kinetic equilibrium reconstructions to test the Sauter and NEO bootstrap current models in a DIII-D high-βp EAST-demonstration experiment. This experiment aims at developing on DIII-D a high bootstrap current scenario to be extended on EAST for a demonstration of true steady state at high performance, and uses EAST-similar operational conditions: plasma shape, plasma current, toroidal magnetic field, total heating power, and current ramp-up rate. It is found that the large edge bootstrap current in these high-βp plasmas allows the use of magnetic measurements to clearly distinguish the two bootstrap current models. In these high-collisionality, high-βp plasmas, the Sauter model overpredicts the peak of the edge current density by about 30%, while the first-principles kinetic NEO model is in close agreement with the edge current density of the reconstructed equilibrium. Furthermore, these results are consistent with recent work showing that the Sauter model largely overestimates the edge bootstrap current at high collisionality.
Bootstrap and fast wave current drive for tokamak reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehst, D.A.
1991-09-01
Using the multi-species neoclassical treatment of Hirshman and Sigmar, we study steady-state bootstrap equilibria with seed currents provided by low-frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (< 20 MW). However, for larger total currents considerable driving power is required (for ITER: I_o = 18 MA needs P_FW = 15 MW, P_LH = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs.
Bootstrap current in a tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessel, C.E.
1994-03-01
The bootstrap current in a tokamak is examined by implementing the Hirshman-Sigmar model and comparing the predicted current profiles with those from two popular approximations. The dependences of the bootstrap current profile on the plasma properties are illustrated. The implications for steady-state tokamaks are presented through two constraints: the pressure profile must be peaked, and βp must be kept below a critical value.
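For orientation, a commonly quoted large-aspect-ratio, collisionless estimate of the bootstrap current density (textbook form, e.g. in Wesson's Tokamaks) is shown below as a simplified scaling, not the Hirshman-Sigmar result the paper implements; it indicates how the current scales with the inverse aspect ratio ε = r/R and the density and temperature gradients:

```latex
% Leading-order, large-aspect-ratio, collisionless estimate; the
% Hirshman-Sigmar treatment generalizes this to arbitrary aspect ratio
% and collisionality.
j_b \simeq -\sqrt{\epsilon}\,\frac{1}{B_\theta}
      \left[\,2.44\,(T_e + T_i)\,\frac{dn}{dr}
            + 0.69\,n\,\frac{dT_e}{dr}
            - 0.42\,n\,\frac{dT_i}{dr}\right]
```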
NASA Astrophysics Data System (ADS)
Chatziantonaki, Ioanna; Tsironis, Christos; Isliker, Heinz; Vlahos, Loukas
2013-11-01
The most promising technique for the control of neoclassical tearing modes (NTMs) in tokamak experiments is the compensation of the missing bootstrap current with electron-cyclotron current drive (ECCD). In this frame, the dynamics of magnetic islands has been studied extensively in terms of the modified Rutherford equation (MRE), including the presence of a current drive that is either analytically described or computed by numerical methods. In this article, a self-consistent model for the dynamic evolution of the magnetic island and the driven current is derived, which takes into account the island's magnetic topology and its effect on the current drive. The model combines the MRE with a ray-tracing approach to electron-cyclotron wave propagation and absorption. Numerical results exhibit a decrease in the time required for complete stabilization with respect to the conventional computation (which does not take the island geometry into account); this decrease grows with the initial island size and with the radial misalignment of the deposition.
Control of bootstrap current in the pedestal region of tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Lai, A. L.
2013-12-15
The high-confinement-mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_p,m that consists of the poloidal components of the E×B flow and of the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit ‖U_p,m‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_p,m and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_p,m‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join the results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Weak percolation on multiplex networks
NASA Astrophysics Data System (ADS)
Baxter, Gareth J.; Dorogovtsev, Sergey N.; Mendes, José F. F.; Cellai, Davide
2014-04-01
Bootstrap percolation is a simple but nontrivial model. It has applications in many areas of science and has been explored on random networks for several decades. In single-layer (simplex) networks, it has been recently observed that bootstrap percolation, which is defined as an incremental process, can be seen as the opposite of pruning percolation, where nodes are removed according to a connectivity rule. Here we propose models of both bootstrap and pruning percolation for multiplex networks. We collectively refer to these two models with the concept of "weak" percolation, to distinguish them from the somewhat classical concept of ordinary ("strong") percolation. While the two models coincide in simplex networks, we show that they decouple when considering multiplexes, giving rise to a wealth of critical phenomena. Our bootstrap model constitutes the simplest example of a contagion process on a multiplex network and has potential applications in critical infrastructure recovery and information security. Moreover, we show that our pruning percolation model may provide a way to diagnose missing layers in a multiplex network. Finally, our analytical approach allows us to calculate critical behavior and characterize critical clusters.
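For readers unfamiliar with the activation rule, here is a toy simulation of single-layer bootstrap percolation; the threshold rule and all parameters are illustrative assumptions, not the paper's multiplex model:

```python
# Toy bootstrap percolation: seed a fraction f of nodes, then repeatedly
# activate any node with at least k active neighbours until nothing changes.
import random
import networkx as nx

def bootstrap_percolation(g, f=0.1, k=2, seed=0):
    random.seed(seed)
    active = {v for v in g if random.random() < f}   # initial seeds
    changed = True
    while changed:                                   # incremental activation
        changed = False
        for v in g:
            if v not in active and sum(u in active for u in g[v]) >= k:
                active.add(v)
                changed = True
    return active

g = nx.erdos_renyi_graph(1000, 0.005, seed=1)
print(len(bootstrap_percolation(g)) / g.number_of_nodes())  # final active fraction
```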
Electron transport fluxes in potato plateau regime
NASA Astrophysics Data System (ADS)
Shaing, K. C.; Hazeltine, R. D.
1997-12-01
Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current.
Variable Selection in the Presence of Missing Data: Imputation-based Methods.
Zhao, Yize; Long, Qi
2017-01-01
Variable selection plays an essential role in regression analysis: it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and for the statistical techniques used to handle missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets (see the sketch following this abstract). The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
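A minimal sketch of the first strategy, assuming scikit-learn's IterativeImputer for the imputations and the lasso as the selector; the 50% agreement threshold and the data are illustrative choices, not prescriptions from the article:

```python
# Strategy 1 sketch: run a selector (here the lasso) on each imputed data set
# separately, then keep variables chosen in a majority of the imputations.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
n, p = 150, 10
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] - X[:, 3] + rng.normal(size=n)
X[rng.random(X.shape) < 0.15] = np.nan           # ~15% of values missing

M = 10                                           # number of imputations
selected = np.zeros(p)
for m in range(M):
    Xi = IterativeImputer(sample_posterior=True, random_state=m).fit_transform(X)
    selected += (LassoCV(cv=5).fit(Xi, y).coef_ != 0)

print(np.where(selected / M >= 0.5)[0])          # kept in >= 50% of imputations
```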
NASA Astrophysics Data System (ADS)
Al-Mudhafar, W. J.
2013-12-01
Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships used to estimate the properties of non-cored intervals. It also helps to accurately identify the spatial facies distribution, so as to build an accurate reservoir model for optimal future reservoir performance. In this paper, facies estimation has been carried out through multinomial logistic regression (MLR) with respect to well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. Firstly, a robust sequential imputation algorithm was used to impute the missing data. This algorithm starts from a complete subset of the dataset and sequentially estimates the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix; the observation is then added to the complete data matrix and the algorithm continues with the next observation with missing values. MLR was chosen to estimate the maximum likelihood and minimize the standard error for the nonlinear relationships between facies and the core and log data. MLR predicts the probabilities of the different possible facies given the independent variables by constructing a linear predictor function with a set of weights that are linearly combined with the independent variables using a dot product. A beta distribution of facies was taken as prior knowledge, and the predicted (posterior) probability was estimated from the MLR based on Bayes' theorem, which relates the posterior probability to the conditional probability and the prior knowledge. To assess the statistical accuracy of the model, the bootstrap is carried out to estimate the extra-sample prediction error by randomly drawing datasets with replacement from the training data. Each sample has the same size as the original training set, and the procedure can be repeated N times to produce N bootstrap datasets with which to re-fit the model, decreasing the squared difference between the estimated and observed categorical variable (facies) and thereby reducing the degree of uncertainty.
Wahl, Simone; Boulesteix, Anne-Laure; Zierer, Astrid; Thorand, Barbara; van de Wiel, Mark A
2016-10-26
Missing values are a frequent issue in human studies. In many situations, multiple imputation (MI) is an appropriate missing data handling strategy, whereby missing values are imputed multiple times, the analysis is performed in every imputed data set, and the obtained estimates are pooled. If the aim is to estimate (added) predictive performance measures, such as (change in) the area under the receiver-operating characteristic curve (AUC), internal validation strategies become desirable in order to correct for optimism. It is not fully understood how internal validation should be combined with multiple imputation. In a comprehensive simulation study and in a real data set based on blood markers as predictors for mortality, we compare three combination strategies: Val-MI, internal validation followed by MI on the training and test parts separately; MI-Val, MI on the full data set followed by internal validation; and MI(-y)-Val, MI on the full data set omitting the outcome, followed by internal validation. Different validation strategies, including bootstrap and cross-validation, different (added) performance measures, and various data characteristics are considered, and the strategies are evaluated with regard to bias and mean squared error of the obtained performance estimates. In addition, we elaborate on the number of resamples and imputations to be used, and adapt a strategy for confidence interval construction to incomplete data. Internal validation is essential in order to avoid optimism, with the bootstrap 0.632+ estimate representing a reliable method to correct for optimism. While estimates obtained by MI-Val are optimistically biased, those obtained by MI(-y)-Val tend to be pessimistic in the presence of a true underlying effect. Val-MI provides largely unbiased estimates, with a slight pessimistic bias with increasing true effect size, number of covariates and decreasing sample size. In Val-MI, accuracy of the estimate is more strongly improved by increasing the number of bootstrap draws rather than the number of imputations. With a simple integrated approach, valid confidence intervals for performance estimates can be obtained. When prognostic models are developed on incomplete data, Val-MI represents a valid strategy to obtain estimates of predictive performance measures.
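A compact sketch of the Val-MI ordering, under the simplifying assumption that the test part is imputed with the imputation model fitted on the training part (one leakage-free variant; the study imputes training and test parts separately). Data and model are synthetic stand-ins:

```python
# Val-MI sketch: split (or resample) FIRST, then impute, so no information
# leaks from the test part into the imputation or prediction models.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 300, 6
X = rng.normal(size=(n, p))
y = (X[:, 0] - X[:, 1] + rng.normal(size=n) > 0).astype(int)
X[rng.random(X.shape) < 0.1] = np.nan

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
aucs = []
for m in range(5):                           # multiple imputations after the split
    imp = IterativeImputer(sample_posterior=True, random_state=m).fit(Xtr)
    model = LogisticRegression().fit(imp.transform(Xtr), ytr)
    aucs.append(roc_auc_score(yte, model.predict_proba(imp.transform(Xte))[:, 1]))
print(np.mean(aucs))                         # pooled out-of-sample AUC
```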
NASA Astrophysics Data System (ADS)
Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Sanchez, R.; Tribaldos, V.; Geiger, J.
2018-02-01
The recently developed free-plasma-boundary version of the SIESTA MHD equilibrium code (Hirshman et al 2011 Phys. Plasmas 18 062504; Peraza-Rodriguez et al 2017 Phys. Plasmas 24 082516) is used for the first time to study scenarios with considerable bootstrap currents for the Wendelstein 7-X (W7-X) stellarator. Bootstrap currents in the range of tens of kA can lead to the formation of unwanted magnetic island chains or stochastic regions within the plasma and can alter the boundary rotational transform due to the small shear in W7-X. The latter issue is of relevance since the island divertor operation of W7-X relies on proper positioning of magnetic island chains at the plasma edge to control the particle and energy exhaust towards the divertor plates. Two scenarios are examined with the new free-plasma-boundary capabilities of SIESTA: one with a freely evolving bootstrap current, which illustrates the difficulties arising from the dislocation of the boundary islands, and a second in which off-axis electron cyclotron current drive (ECCD) is applied to compensate the effects of the bootstrap current and keep the island divertor configuration intact. SIESTA finds that off-axis ECCD is indeed able to keep the location and phase of the edge magnetic island chain unchanged, but it may also lead to an undesired stochastization of parts of the confined plasma if the EC deposition radial profile becomes too narrow.
ERIC Educational Resources Information Center
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.
Chung, SungWon; Lu, Ying; Henry, Roland G
2006-11-01
Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method (repetition bootstrap) used for DTI analysis performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, wild bootstrap was proposed that can be applied without multiple acquisitions. In this paper, two new approaches are introduced called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and, therefore, better estimates the standard errors. Like wild bootstrap, residual bootstrap is applicable to single acquisition scheme, and both are based on regression residuals (called model-based resampling). Residual bootstrap is based on the assumption that non-constant variance of measured diffusion-attenuated signals can be modeled, which is actually the assumption behind the widely used weighted least squares solution of diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that residual bootstrap has smaller biases and overall errors, which enables estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help us to choose the optimal approach for estimating uncertainties that can benefit hypothesis testing based on DTI parameters, probabilistic fiber tracking, and optimizing DTI methods.
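A minimal sketch of the residual (model-based) resampling mechanics described above, applied to ordinary least squares rather than the diffusion-tensor fit: resample fitted residuals, rebuild synthetic responses, refit, and read the spread as a standard error. All data are synthetic:

```python
# Residual bootstrap for an OLS slope: the model, not the raw rows, is
# resampled, matching the "model-based resampling" idea in the abstract.
import numpy as np

rng = np.random.default_rng(4)
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.5, n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat                           # fitted residuals

boots = []
for b in range(2000):
    y_star = X @ beta_hat + rng.choice(resid, n, replace=True)
    b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    boots.append(b_star[1])
print(np.std(boots, ddof=1))                       # bootstrap SE of the slope
```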
A Critical Meta-Analysis of Lens Model Studies in Human Judgment and Decision-Making
Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W.
2013-01-01
Achieving accurate judgment ('judgmental achievement') is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regards to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or the level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regards to the success of bootstrapping with psychometrically-corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for effects of publication bias. Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to (a) an increase in the values of the lens model components, (b) reduced heterogeneity between studies, and (c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for demonstrating the success of bootstrapping. PMID:24391781
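A toy illustration of what "bootstrapping" means in the lens-model sense: fit a linear model to a judge's own judgments and let the model judge instead. Cue weights, the judge's execution error, and all data below are hypothetical:

```python
# Lens-model bootstrapping sketch: the regression model of the judge removes
# the judge's random inconsistency, so its achievement (correlation with the
# criterion) typically exceeds the judge's own.
import numpy as np

rng = np.random.default_rng(5)
n, k = 100, 3
cues = rng.normal(size=(n, k))
criterion = cues @ np.array([0.6, 0.3, 0.1]) + rng.normal(0, 0.5, n)
# an inconsistent judge: roughly right weights plus random execution error
judgment = cues @ np.array([0.6, 0.3, 0.1]) + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), cues])
w, *_ = np.linalg.lstsq(X, judgment, rcond=None)   # linear model of the judge
model_judgment = X @ w

print(np.corrcoef(judgment, criterion)[0, 1])        # judge's achievement
print(np.corrcoef(model_judgment, criterion)[0, 1])  # bootstrapped model's achievement
```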
1984-06-01
[Garbled report-documentation fragment from 1984 conference proceedings; only the session schedule and keyword list are recoverable. Keywords: sequential testing; truncated sequential probability ratio test; optical data; operational testing; reliability; random numbers; bootstrap methods; missing data; fire support; computer models; carcinogenesis studies.]
A neural network based reputation bootstrapping approach for service selection
NASA Astrophysics Data System (ADS)
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied as a way to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, since no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; with this kind of method, however, either newcomers or existing services are favoured. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and the performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping when available. The proposed approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented.
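A minimal sketch of the core idea, assuming a small scikit-learn MLP as the ANN; the features and reputation scores are synthetic stand-ins, not the paper's data or architecture:

```python
# Reputation bootstrapping sketch: learn feature -> reputation from services
# that have a track record, then give a newcomer a tentative reputation
# predicted from its features alone.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
features = rng.random((200, 4))      # e.g. price, latency, provider age, ...
reputation = 0.5 * features[:, 0] + 0.3 * features[:, 2] + rng.normal(0, 0.05, 200)

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(features, reputation)

newcomer = rng.random((1, 4))        # service with no interaction history
print(ann.predict(newcomer))         # tentative bootstrap reputation
```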
Impact of bootstrap current and Landau-fluid closure on ELM crashes and transport
NASA Astrophysics Data System (ADS)
Chen, J. G.; Xu, X. Q.; Ma, C. H.; Lei, Y. A.
2018-05-01
Results presented here are from 6-field Landau-fluid simulations using shifted-circular-cross-section tokamak equilibria within the BOUT++ framework. Linear benchmark results imply that the collisional and collisionless Landau resonance closures make little difference to the linear growth rate spectra, which are quite close to the results with the flux-limited Spitzer-Härm parallel flux. Both linear and nonlinear simulations show that the plasma current profile plays dual roles for the peeling-ballooning modes: it can drive the low-n peeling modes and stabilize the high-n ballooning modes. For fixed total pressure and current, as the pedestal current decreases, due to the bootstrap current becoming smaller when the density (collisionality) increases, the operational point is shifted downwards vertically in the Jped-α diagram, resulting in threshold changes of the different modes. The bootstrap current can slightly increase the radial turbulence spreading range and enhance the energy and particle transport by increasing the perturbed amplitude and broadening the cross-phase frequency distribution.
Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.
Wade, M R; Murakami, M; Politzer, P A
2004-06-11
Analysis of the parallel electric field E∥ evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of a noninductive edge current. Modeling indicates that the observed E∥ evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E∥ evolution and that expected from neoclassical theory predictions of the bootstrap current.
Shavit Grievink, Liat; Penny, David; Holland, Barbara R.
2013-01-01
Phylogenetic studies based on molecular sequence alignments are expected to become more accurate as the number of sites in the alignments increases. With the advent of genomic-scale data, where alignments have very large numbers of sites, bootstrap values close to 100% and posterior probabilities close to 1 are the norm, suggesting that the number of sites is now seldom a limiting factor on phylogenetic accuracy. This provokes the question, should we be fussy about the sites we choose to include in a genomic-scale phylogenetic analysis? If some sites contain missing data, ambiguous character states, or gaps, then why not just throw them away before conducting the phylogenetic analysis? Indeed, this is exactly the approach taken in many phylogenetic studies. Here, we present an example where the decision on how to treat sites with missing data is of equal importance to decisions on taxon sampling and model choice, and we introduce a graphical method for illustrating this. PMID:23471508
Regge vertex for quark production in the central rapidity region in the next-to-leading order
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozlov, M. G., E-mail: M.G.Kozlov@inp.nsk.su; Reznichenko, A. V., E-mail: A.V.Reznichenko@inp.nsk.su
2016-03-15
The effective vertex for quark production in the interaction of a Reggeized quark and a Reggeized gluon is calculated in the next-to-leading order (NLO). The resulting vertex is the missing component of the NLO multi-Regge amplitude featuring quark and gluon exchanges in the t channels. This calculation will make it possible to develop in future the bootstrap approach to proving quark Reggeization in the next-to-leading logarithmic approximation.
Multi-baseline bootstrapping at the Navy precision optical interferometer
NASA Astrophysics Data System (ADS)
Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.
2014-07-01
The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally-spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
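A toy numeric check of why chaining baselines works: station-based delays difference out, so the delay on a long baseline is exactly the sum of the delays tracked on the short, high-visibility baselines that compose it. The per-station delay values below are made up:

```python
# Baseline bootstrapping arithmetic: for stations A, B, C in a line,
# delay(A,C) = delay(A,B) + delay(B,C), because each baseline delay is a
# difference of per-station delays. Fringes tracked on the short baselines
# therefore locate the fringe on the hard-to-track long baseline.
tau = {"A": 1.7e-6, "B": -0.4e-6, "C": 2.1e-6}    # per-station delays (s), made up

d_ab = tau["A"] - tau["B"]
d_bc = tau["B"] - tau["C"]
d_ac = tau["A"] - tau["C"]
assert abs((d_ab + d_bc) - d_ac) < 1e-18          # chaining recovers the long baseline
print(d_ab + d_bc, d_ac)
```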
Bootstrap Current for the Edge Pedestal Plasma in a Diverted Tokamak Geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, S.; Chang, C. S.; Ku, S.
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with the ion contribution and from a reasonable separation of the trapped-passing dynamics without strong collisional mixing. However, when the pedestal electrons are in the plateau-collisional regime, there is significant deviation of the numerical results from the existing analytic formulas, mainly due to the large effective collisionality of the passing and boundary-layer trapped particles in the edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high. On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high-field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, as a simple modification to the Sauter formula, is obtained to bring the analytic expression into better agreement with the edge kinetic simulation results.
Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan
2013-01-01
In this article we use a latent class model (LCM), with prevalence modeled as a function of covariates, to assess diagnostic test accuracy in situations where the true disease status is not observed but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate the parameters of interest, namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of the estimated parameters, the missing information principle is applied to adjust the information matrix estimates. We compare the adjusted-information-matrix-based standard error estimates with the bootstrap standard error estimates, both obtained using the fast MCEM algorithm, through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach estimates the standard error similarly to the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test, and a human papillomavirus (HPV) DNA test. PMID:24163493
Jones, Adam G
2015-11-01
Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
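A minimal sketch of the three Bateman variables defined above, with a percentile-bootstrap interval for the gradient; the count data are hypothetical, and BATEMANATER's maximum-likelihood correction for incompletely sampled populations is not reproduced here:

```python
# Bateman metrics sketch: opportunity for sexual selection I_s, opportunity
# for selection I, and the Bateman gradient beta_ss with a bootstrap 95% CI.
import numpy as np

rng = np.random.default_rng(7)
mates = rng.poisson(2.0, 80)                     # mating success per individual
offspring = 3 * mates + rng.poisson(2.0, 80)     # reproductive success

I_s = np.var(mates) / np.mean(mates) ** 2        # variance in relative mating success
I = np.var(offspring) / np.mean(offspring) ** 2  # variance in relative reproductive success
beta_ss = np.polyfit(mates, offspring, 1)[0]     # least-squares Bateman gradient

grads = []
for b in range(2000):                            # nonparametric bootstrap over individuals
    idx = rng.integers(0, len(mates), len(mates))
    grads.append(np.polyfit(mates[idx], offspring[idx], 1)[0])
print(I_s, I, beta_ss, np.percentile(grads, [2.5, 97.5]))
```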
Valid statistical inference methods for a case-control study with missing data.
Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun
2018-04-01
The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independency under the two existing sampling distributions could be completely different (even contradictory) from the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most frequently used statistical methods in the climate sciences. Various methods are used to estimate a confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), whose main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique that basically performs a second bootstrap loop, or resamples from the bootstrap resamples. Like the non-calibrated bootstrap confidence intervals, it offers robustness against the data distribution. A pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with the performance of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, for which the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20. One form of climate time series is output from numerical models which simulate the climate system. The method is applied to model data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables when there is a 10 year lag between them, which is more or less the time it takes the Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
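A minimal sketch of the pairwise moving block bootstrap step (without the second, calibrating loop that PearsonT3 adds), on synthetic serially dependent series; the block length and data are illustrative choices:

```python
# Moving block bootstrap for a correlation coefficient: resampling blocks of
# consecutive (x, y) pairs preserves serial dependence within each series.
import numpy as np

rng = np.random.default_rng(8)
n, block = 200, 10
x = np.cumsum(rng.normal(size=n)) * 0.1 + rng.normal(size=n)  # serially dependent
y = 0.5 * x + rng.normal(size=n)

r_hat = np.corrcoef(x, y)[0, 1]
starts = np.arange(n - block + 1)                 # admissible block start points
reps = []
for b in range(2000):
    s = rng.choice(starts, n // block)            # draw blocks with replacement
    idx = np.concatenate([np.arange(i, i + block) for i in s])
    reps.append(np.corrcoef(x[idx], y[idx])[0, 1])
print(r_hat, np.percentile(reps, [2.5, 97.5]))    # point estimate and 95% CI
```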
Non-inductive current generation in fusion plasmas with turbulence
NASA Astrophysics Data System (ADS)
Wang, Weixing; Ethier, S.; Startsev, E.; Chen, J.; Hahm, T. S.; Yoo, M. G.
2017-10-01
It is found that plasma turbulence may strongly influence non-inductive current generation, which may have a radical impact on various aspects of tokamak physics. Our simulation study employs a global gyrokinetic model coupling self-consistent neoclassical and turbulent dynamics, with a focus on the electron current. Distinct phases in electron current generation are illustrated in the initial-value simulation. In the early phase, before turbulence develops, the electron bootstrap current is established on a time scale of a few electron collision times, which closely agrees with the neoclassical prediction. The second phase follows when turbulence begins to saturate, during which turbulent fluctuations are found to strongly affect the electron current. The profile structure, amplitude, and phase-space structure of the electron current density are all significantly modified relative to the neoclassical bootstrap current by the presence of turbulence. Both electron parallel acceleration and the parallel residual stress drive are shown to play important roles in turbulence-induced current generation. The current density profile is modified in a way that correlates with the fluctuation intensity gradient through its effect on k∥-symmetry breaking in the fluctuation spectrum. Turbulence is shown to deduct (enhance) plasma self-generated current in the low (high) collisionality regime, and the reduction of the total electron current relative to the neoclassical bootstrap current increases as collisionality decreases. The implication of this result for fully non-inductive current operation in the steady-state burning plasma regime should be investigated. Finally, significant non-inductive current is observed in the flat-pressure region, which is a nonlocal effect resulting from turbulence-spreading-induced current diffusion. Work supported by U.S. DOE Contract DE-AC02-09-CH11466.
Hager, Robert; Chang, C. S.
2016-04-08
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator (one that conserves mass, momentum, and energy) is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
Investigation of the n = 1 resistive wall modes in the ITER high-mode confinement
NASA Astrophysics Data System (ADS)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2017-06-01
The n = 1 resistive wall mode (RWM) stability of the ITER high-confinement mode is investigated with the bootstrap current included in the equilibrium, together with rotation and diamagnetic drift effects for stability. Here, n is the toroidal mode number. We use the CORSICA code to compute the free-boundary equilibrium and the AEGIS code for stability. We find that the inclusion of the bootstrap current in the equilibrium is critical: it can reduce the local magnetic shear in the pedestal, so that infernal mode branches can develop. Consequently, the n = 1 modes become unstable without a stabilizing wall at a considerably lower beta limit, driven by the steep pressure gradient in the pedestal. Typical values of the wall position stabilize the ideal mode, but give rise to 'pedestal' resistive wall modes. We find that rotation can contribute a stabilizing effect on RWMs and that diamagnetic drift effects can further improve stability in the co-current rotation case. Generally speaking, however, the rotation stabilization effects are not as effective as in the case without the bootstrap current effects on the equilibrium. We also find that the diamagnetic drift effects are actually destabilizing when there is counter-current rotation.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
A condition for small bootstrap current in three-dimensional toroidal configurations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru; Nührenberg, J.; Zille, R.
2016-11-15
It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller compared to that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna-Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.
NASA Astrophysics Data System (ADS)
Cesario, R. C.; Castaldo, C.; Fonseca, A.; De Angelis, R.; Parail, V.; Smeulders, P.; Beurskens, M.; Brix, M.; Calabrò, G.; De Vries, P.; Mailloux, J.; Pericoli, V.; Ravera, G.; Zagorski, R.
2007-09-01
LHCD has been used in JET experiments aimed at producing internal transport barriers (ITBs) in highly triangular plasmas (δ ≈ 0.4) at high βN (up to 3) for steady-state application. LHCD is a potentially valuable tool for (i) modifying the target q-profile, which can help avoid deleterious MHD modes and favour the formation of ITBs, and (ii) contributing to the non-inductive current drive required to prolong such plasma regimes. The q-profile evolution has been simulated during the current ramp-up phase for such a discharge (B0 = 2.3 T, IP = 1.5 MA) in which 2 MW of LHCD was coupled. The JETTO code was used, taking measured plasma profiles and the LHCD profile modelled by the LHstar code. The results are in agreement with MSE measurements and indicate the importance of the elevated electron temperature due to LHCD, as well as of the driven current. During main heating with 18 MW of NBI and 3 MW of ICRH, the bootstrap current density at the edge also becomes large, consistent with the observed reduction of the local turbulence and of the MHD activity. JETTO modelling suggests that the bootstrap current can reduce the magnetic shear (sh) at large radius, potentially affecting the MHD stability and turbulence behaviour in this region. Keywords: lower hybrid current drive (LHCD), bootstrap current, q (safety factor) and shear (sh) profile evolutions.
Carving out the end of the world or (superconformal bootstrap in six dimensions)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Chi-Ming; Lin, Ying-Hsuan
2017-08-29
We bootstrap N = (1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E_8 flavor group, we present universal bounds on the central charge C_T and the flavor central charge C_J. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on C_J, and we numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS_7 × S^4/Z_2.
Innovation cascades: artefacts, organization and attributions
2016-01-01
Innovation cascades inextricably link the introduction of new artefacts, transformations in social organization, and the emergence of new functionalities and new needs. This paper describes a positive feedback dynamic, exaptive bootstrapping, through which these cascades proceed, and the characteristics of the relationships in which the new attributions that drive this dynamic are generated. It concludes by arguing that the exaptive bootstrapping dynamic is the principal driver of our current Innovation Society. PMID:26926284
Zhang, Zhaoyang; Fang, Hua; Wang, Honggang
2016-06-01
Web-delivered trials are an important component of eHealth services. These trials, mostly behavior-based, generate big heterogeneous data that are longitudinal and high dimensional, with missing values. Unsupervised learning methods have been widely applied in this area; however, validating the optimal number of clusters has been challenging. Building on our multiple imputation (MI) based fuzzy clustering method, MIfuzzy, we propose a new multiple imputation based validation (MIV) framework and corresponding MIV algorithms for clustering big longitudinal eHealth data with missing values, and more generally for fuzzy-logic based clustering methods. Specifically, we detect the optimal number of clusters by auto-searching and synthesizing a suite of MI-based validation methods and indices, including conventional (bootstrap or cross-validation based) and emerging (modularity-based) validation indices for general clustering methods, as well as the Xie and Beni index specific to fuzzy clustering. The MIV performance was demonstrated on a big longitudinal dataset from a real web-delivered trial and using simulation. The results indicate that the MI-based Xie and Beni index is more appropriate for detecting the optimal number of clusters for such complex data. The MIV concept and algorithms could be easily adapted to different types of clustering that could process big incomplete longitudinal trial data in eHealth services.
Boyle, Peter [Edinburgh, GB; Christ, Norman [Irvington, NY; Gara, Alan [Yorktown Heights, NY; Kim,; Changhoan, [San Jose, CA; Mawhinney, Robert [New York, NY; Ohmacht, Martin [Yorktown Heights, NY; Sugavanam, Krishnan [Yorktown Heights, NY
2012-08-28
A list prefetch engine improves a performance of a parallel computing system. The list prefetch engine receives a current cache miss address. The list prefetch engine evaluates whether the current cache miss address is valid. If the current cache miss address is valid, the list prefetch engine compares the current cache miss address and a list address. A list address represents an address in a list. A list describes an arbitrary sequence of prior cache miss addresses. The prefetch engine prefetches data according to the list, if there is a match between the current cache miss address and the list address.
Impurities in a non-axisymmetric plasma. Transport and effect on bootstrap current
Mollén, A.; Landreman, M.; Smith, H. M.; ...
2015-11-20
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21 (2014) 042503], which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high-collisionality limit. We observe and explain a 1/nu scaling of the inter-species radial transport coefficient at low collisionality, arising from the field term in the inter-species collision operator, which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in the plasma effective charge Z_eff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
The prospects for magnetohydrodynamic stability in advanced tokamak regimes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manickam, J.; Chance, M.S.; Jardin, S.C.
1994-05-01
Stability analysis of advanced regime tokamaks is presented. Here advanced regimes are defined to include configurations where the ratio of the bootstrap current, I_BS, to the total plasma current, I_p
Data imputation analysis for Cosmic Rays time series
NASA Astrophysics Data System (ADS)
Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.
2017-05-01
The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data are lost through mechanical and human failure, technical problems, and the differing periods of operation of GCR stations. The aim of this study was to perform multiple-dataset imputation in order to depict the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates; the CLMX station was then used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: AMELIA II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm based on Multivariate Imputation by Chained Equations; and MTSDI, an Expectation-Maximization-based method for imputing missing values in multivariate normal time series. The synthetic time series were evaluated against the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate a loss of quality in the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a practical limit of about 60% missing data for the imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
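For readers who want to experiment with this kind of gap filling, the following Python sketch mimics the setup on synthetic data using scikit-learn's IterativeImputer (a chained-equations imputer in the spirit of MICE). The CLMX/ROME relationship, the 30% gap fraction, and the linear proxy model are illustrative assumptions, not the AMELIA/MICE/MTSDI runs used in the study.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Two correlated monthly series standing in for CLMX and ROME (synthetic).
rng = np.random.default_rng(1)
clmx = rng.normal(4000, 200, size=540)            # 45 years of monthly means
rome = 0.9 * clmx + rng.normal(0, 50, size=540)   # proxy relationship

# Knock out 30% of ROME at random to mimic one missing-data scenario.
mask = rng.random(540) < 0.30
rome_obs = rome.copy()
rome_obs[mask] = np.nan

# Chained-equations imputation, using CLMX as the predictor series.
X = np.column_stack([clmx, rome_obs])
imputed = IterativeImputer(max_iter=20, random_state=0).fit_transform(X)
rome_filled = imputed[:, 1]

# Simple skill check against the withheld truth, as in the paper's RMSE use.
rmse = np.sqrt(np.mean((rome_filled[mask] - rome[mask]) ** 2))
print(f"RMSE on imputed entries: {rmse:.1f}")
```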
NASA Astrophysics Data System (ADS)
Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; Cfetr Physics Team
2018-04-01
Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyro-kinetic simulation; it has been demonstrated to accurately model low-Ip discharges from the EAST tokamak. The time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from the equilibrium reconstruction with Motional Stark Effect constraints. The predicted evolution of βN, li, and βP also agrees well with the experiments. For the ITER-like cases, the electron and ion temperature profiles predicted using TGLF_Sat0 agree closely with the experimentally measured profiles, and are demonstrably better than those from other proposed transport models. For the high bootstrap current case, the electron and ion temperature profiles are better predicted by the VX model. It is found that the SAT0 model works well at high IP (>0.76 MA), while the VX model covers a wider range of plasma current (IP > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.
DAMBE7: New and Improved Tools for Data Analysis in Molecular Biology and Evolution.
Xia, Xuhua
2018-06-01
DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux, and Macintosh computers. New functions include imputing missing distances and phylogeny simultaneously (paving the way to build large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrected multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for the correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca (last accessed April 17, 2018).
Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling
2014-06-03
Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in an RCT-based CEA is the bootstrap. The objective of the present study was to further extend the bootstrap method of RCT-based CEA for the incorporation of external evidence. We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and the outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to parametric Bayesian methods. The bootstrap method of RCT-based CEA can thus be extended to incorporate external evidence while preserving its appealing features, such as requiring no parametric modeling of cost and effectiveness outcomes.
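A minimal Python sketch of the core idea, using the Bayesian-bootstrap (Dirichlet-weights) formulation with a sampling-importance-resampling step to fold in external evidence. The trial data, the prior on the effect difference, and the willingness-to-pay threshold are all hypothetical, and the authors' exact resampling modifications may differ.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Hypothetical per-patient (effect, cost) data for treatment and control arms.
n = 150
eff_t, cost_t = rng.normal(0.70, 0.20, n), rng.normal(5000, 1500, n)
eff_c, cost_c = rng.normal(0.62, 0.20, n), rng.normal(4200, 1400, n)

# Bayesian bootstrap: one Dirichlet(1,...,1) weight vector per arm per draw,
# applied to effects and costs jointly to preserve their correlation.
B = 5000
w_t = rng.dirichlet(np.ones(n), size=B)
w_c = rng.dirichlet(np.ones(n), size=B)
d_eff = w_t @ eff_t - w_c @ eff_c
d_cost = w_t @ cost_t - w_c @ cost_c

# External evidence on the effect difference (hypothetical meta-analytic
# prior): reweight the replicates by its density and resample (SIR step).
prior = norm(loc=0.05, scale=0.03)
w = prior.pdf(d_eff)
keep = rng.choice(B, size=B, replace=True, p=w / w.sum())

# Incremental net benefit at a willingness-to-pay of 20,000 per unit effect.
inb = 20_000 * d_eff[keep] - d_cost[keep]
print("95% credible interval for INB:", np.percentile(inb, [2.5, 97.5]))
```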
Causality constraints in conformal field theory
Hartman, Thomas; Jain, Sachin; Kundu, Sandipan
2016-05-17
Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low-energy Lagrangian. In d-dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly (and analytically) from the conformal bootstrap. The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well-known sign constraint on the (∂φ)^4 coupling in effective field theory from a dual CFT. We also find constraints on theories with higher-spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.
Conformal Bootstrap in Mellin Space
NASA Astrophysics Data System (ADS)
Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda
2017-02-01
We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.
McClenaghan, Joseph; Garofalo, Andrea M.; Meneghini, Orso; ...
2017-08-03
In this study, transport modeling of a proposed ITER steady-state scenario based on DIII-D high poloidal-beta (β_p) discharges finds that ITB formation can occur with either sufficient rotation or a negative-central-shear q-profile. The high β_p scenario is characterized by a large bootstrap current fraction (80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier, which is associated with excellent normalized confinement. Modeling predictions of the electron transport in the high β_p scenario improve as q_95 approaches levels similar to typical existing models of ITER steady state, and the ion transport is turbulence dominated. Typical temperature and density profiles from the non-inductive high β_p scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving a Q = 5 steady-state fusion gain in ITER with 'day one' heating and current drive capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. The high bootstrap fraction, high β_p scenario is found to be near an ITB formation threshold, and either strong negative central magnetic shear or rotation is found to provide the turbulence suppression required to achieve Q = 5.
Transport Barriers in Bootstrap Driven Tokamaks
NASA Astrophysics Data System (ADS)
Staebler, Gary
2017-10-01
Maximizing the bootstrap current in a tokamak, so that it drives a high fraction of the total current, reduces the external power required to drive current by other means. Improved energy confinement, relative to empirical scaling laws, enables a reactor to take fuller advantage of the bootstrap-driven tokamak. Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis and quasilinear predictive modeling demonstrate that the observed transport barrier is due to the suppression of turbulence, primarily by the large Shafranov shift. ExB velocity shear does not play a significant role in the transport barrier because of the high safety factor. It will be shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear, or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov-shift-driven barrier formation. The ion energy transport is reduced to neoclassical levels, and electron energy and particle transport are reduced, but still turbulent, within the barrier. Deeper into the plasma, very large levels of electron transport are observed. The observed electron temperature profile is shown to be close to the threshold for the electron temperature gradient (ETG) mode. A large ETG-driven energy transport is qualitatively consistent with recent multi-scale gyrokinetic simulations showing that reducing the ion-scale turbulence can lead to a large increase in the electron-scale transport. A new saturation model for the quasilinear TGLF transport code, fitted to these multi-scale gyrokinetic simulations, can match the data if the impact of zonal flow mixing on the ETG modes is reduced at high safety factor. This work was supported by the U.S. Department of Energy under DE-FG02-95ER54309 and DE-FC02-04ER54698.
Limitations of bootstrap current models
Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...
2014-03-27
We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans and core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling, an approximation inherent to both analytic models, is quantified. Moreover, the implications of the corrections from kinetic NEO simulations for MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.
Joint DIII-D/EAST Experiments Toward Steady State AT Demonstration
NASA Astrophysics Data System (ADS)
Garofalo, A. M.; Meneghini, O.; Staebler, G. M.; van Zeeland, M. A.; Gong, X.; Ding, S.; Qian, J.; Ren, Q.; Xu, G.; Grierson, B. A.; Solomon, W. M.; Holcomb, C. T.
2015-11-01
Joint DIII-D/EAST experiments on fully noninductive operation at high poloidal beta have demonstrated several attractive features of this regime for a steady-state fusion reactor. A very large bootstrap fraction (>80%) is desirable because it reduces the demands on external noninductive current drive. A high bootstrap fraction with an H-mode edge results in a broad current profile and internal transport barriers (ITBs) at large minor radius, leading to high normalized energy confinement and high MHD stability limits. The ITB radius expands with higher normalized beta, further improving both stability and confinement. The electron density ITB and large Shafranov shift lead to low AE activity in the plasma core and low anomalous fast-ion losses. Both the ITB and the current profile show remarkable robustness against perturbations, without external control. Supported by US DOE under DE-FC02-04ER54698, DE-AC02-09CH11466 & DE-AC52-07NA27344 & by NMCFSP under contracts 2015GB102000 and 2015GB110001.
Transport barriers in bootstrap-driven tokamaks
NASA Astrophysics Data System (ADS)
Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.
2018-05-01
Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis and quasilinear predictive modeling demonstrate that the observed transport barrier is caused by the suppression of turbulence, primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear, or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with the magnetic shear, and by the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic-particle-driven instabilities could be playing a role in the thermal energy transport in this region.
Liu, Benmei; Yu, Mandi; Graubard, Barry I; Troiano, Richard P; Schenker, Nathaniel
2016-01-01
The Physical Activity Monitor (PAM) component was introduced into the 2003-2004 National Health and Nutrition Examination Survey (NHANES) to collect objective information on physical activity including both movement intensity counts and ambulatory steps. Due to an error in the accelerometer device initialization process, the steps data were missing for all participants in several primary sampling units (PSUs), typically a single county or group of contiguous counties, who had intensity count data from their accelerometers. To avoid potential bias and loss in efficiency in estimation and inference involving the steps data, we considered methods to accurately impute the missing values for steps collected in the 2003-2004 NHANES. The objective was to come up with an efficient imputation method which minimized model-based assumptions. We adopted a multiple imputation approach based on Additive Regression, Bootstrapping and Predictive mean matching (ARBP) methods. This method fits alternative conditional expectation (ace) models, which use an automated procedure to estimate optimal transformations for both the predictor and response variables. This paper describes the approaches used in this imputation and evaluates the methods by comparing the distributions of the original and the imputed data. A simulation study using the observed data is also conducted as part of the model diagnostics. Finally some real data analyses are performed to compare the before and after imputation results. PMID:27488606
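The predictive-mean-matching ingredient of ARBP can be illustrated in a few lines of Python. This sketch uses a straight-line predictor and synthetic counts/steps data, not the flexible 'ace' transformations or the actual NHANES files; the function name and all parameter values are illustrative.

```python
import numpy as np

def pmm_impute(x, y, k=5, seed=0):
    # Predictive mean matching: each missing y is filled with the observed y
    # of a donor whose predicted mean is among the k closest to the target's.
    rng = np.random.default_rng(seed)
    obs_idx = np.flatnonzero(~np.isnan(y))
    mis_idx = np.flatnonzero(np.isnan(y))
    beta = np.polyfit(x[obs_idx], y[obs_idx], deg=1)   # linear stand-in model
    yhat = np.polyval(beta, x)
    y_imp = y.copy()
    for i in mis_idx:
        nearest = obs_idx[np.argsort(np.abs(yhat[obs_idx] - yhat[i]))[:k]]
        y_imp[i] = y[rng.choice(nearest)]
    return y_imp

# Hypothetical accelerometer data: intensity counts predict daily steps.
rng = np.random.default_rng(3)
counts = rng.gamma(5.0, 60.0, size=400)
steps = 25.0 * counts + rng.normal(0, 500, size=400)
steps[rng.random(400) < 0.20] = np.nan                 # simulate missing PSUs
filled = pmm_impute(counts, steps)
print("remaining missing values:", np.isnan(filled).sum())
```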
Jiang, Wenyu; Simon, Richard
2007-12-20
This paper first provides a critical review of some existing methods for estimating the prediction error in classifying microarray data, where the number of genes greatly exceeds the number of specimens. Special attention is given to the bootstrap-related methods. When the sample size n is small, we find that all the reviewed methods suffer from either substantial bias or variability. We introduce a repeated leave-one-out bootstrap (RLOOB) method that predicts for each specimen in the sample using bootstrap learning sets of size ln. We then propose an adjusted bootstrap (ABS) method that fits a learning curve to the RLOOB estimates calculated with different bootstrap learning set sizes. The ABS method is robust across the situations we investigate and provides a slightly conservative estimate for the prediction error. Even with small samples, it does not suffer from the large upward bias of the leave-one-out bootstrap and the 0.632+ bootstrap, and it does not suffer from the large variability of leave-one-out cross-validation in microarray applications.
Theodoratou, Evropi; Farrington, Susan M; Tenesa, Albert; McNeill, Geraldine; Cetnarskyj, Roseanne; Korakakis, Emmanouil; Din, Farhat V N; Porteous, Mary E; Dunlop, Malcolm G; Campbell, Harry
2014-01-01
Colorectal cancer (CRC) accounts for 9.7% of all cancer cases and for 8% of all cancer-related deaths. Established risk factors include personal or family history of CRC as well as lifestyle and dietary factors. We investigated the relationship between CRC and demographic, lifestyle, food and nutrient risk factors through a case-control study that included 2062 patients and 2776 controls from Scotland. Forward and backward stepwise regression was applied and the stability of the models was assessed in 1000 bootstrap samples. The variables that were automatically selected to be included by the forward or backward stepwise regression and whose selection was verified by bootstrap sampling in the current study were family history, dietary energy, 'high-energy snack foods', eggs, juice, sugar-sweetened beverages and white fish (associated with an increased CRC risk) and NSAIDs, coffee and magnesium (associated with a decreased CRC risk). Application of forward and backward stepwise regression in this CRC study identified some already established as well as some novel potential risk factors. Bootstrap findings suggest that examination of the stability of regression models by bootstrap sampling is useful in the interpretation of study findings. 'High-energy snack foods' and high-energy drinks (including sugar-sweetened beverages and fruit juices) as risk factors for CRC have not been reported previously and merit further investigation as such snacks and beverages are important contributors in European and North American diets.
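The bootstrap-stability check described above translates naturally into code. The sketch below counts how often each candidate predictor survives forward selection across bootstrap resamples; it uses scikit-learn's SequentialFeatureSelector on synthetic data, as a stand-in for the stepwise-by-p-value procedure applied to the Scottish case-control data, and the resample count and feature sizes are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in: 10 candidate risk factors, 3 genuinely informative.
X, y = make_classification(n_samples=500, n_features=10, n_informative=3,
                           n_redundant=0, random_state=0)

rng = np.random.default_rng(0)
B = 100                                    # the paper used 1000 resamples
counts = np.zeros(X.shape[1])
for _ in range(B):
    idx = rng.integers(0, len(y), size=len(y))       # bootstrap sample
    sfs = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                    n_features_to_select=3,
                                    direction="forward")
    sfs.fit(X[idx], y[idx])
    counts += sfs.get_support()            # True for selected variables

# Variables selected in most resamples are the stable ones.
print("inclusion frequencies:", counts / B)
```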
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
2003-06-01
…however, and the current “second generation” of microkernel implementations has resulted in significantly better performance. Of note is the L4 microkernel.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
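The key algebraic trick, that every bootstrap sample stays in the n-dimensional span of the data, can be sketched in a few lines of NumPy; the dimensions, resample count, and synthetic data below are illustrative, not the EEG or MRI analyses.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n, B = 100_000, 50, 100                    # p >> n, as in imaging data
Y = rng.normal(size=(p, n))
Y -= Y.mean(axis=1, keepdims=True)            # center across subjects

# One thin SVD of the full data; all bootstrap samples live in span(U).
U, D, Vt = np.linalg.svd(Y, full_matrices=False)   # U: p x n, Vt: n x n
DVt = D[:, None] * Vt                              # n x n coordinate matrix

evals = np.empty((B, 3))
for b in range(B):
    cols = rng.integers(0, n, n)              # resample subjects
    A = DVt[:, cols]
    A -= A.mean(axis=1, keepdims=True)        # re-center the resample
    s = np.linalg.svd(A, compute_uv=False)    # n x n SVD: cheap
    evals[b] = (s[:3] ** 2) / (n - 1)         # top-3 bootstrap eigenvalues

# The p-dimensional bootstrap components, if needed, are U times the left
# singular vectors of A; the p x n factor U is computed once, never resampled.
print(np.percentile(evals, [2.5, 97.5], axis=0))
```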
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts, instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation on real and simulated data sets, bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was considerably faster for small to moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to the increased time spent generating weight matrices via multinomial sampling. PMID:26125965
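A minimal NumPy rendering of the multinomial-weights formulation for Pearson's r; the sample sizes are illustrative, and the weighted-moment algebra follows directly from expressing each bootstrap replication as a reweighting of the observed data.

```python
import numpy as np

def vectorized_boot_corr(x, y, B=10_000, seed=0):
    # All bootstrap replications of Pearson's r via one multinomial weight
    # matrix and a few matrix products (no Python-level resampling loop).
    rng = np.random.default_rng(seed)
    n = len(x)
    W = rng.multinomial(n, np.ones(n) / n, size=B)   # B x n bootstrap counts
    mx, my = W @ x / n, W @ y / n                    # weighted first moments
    mxy = W @ (x * y) / n                            # weighted cross moment
    mxx, myy = W @ (x * x) / n, W @ (y * y) / n
    cov = mxy - mx * my
    return cov / np.sqrt((mxx - mx**2) * (myy - my**2))

rng = np.random.default_rng(5)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(size=200)
reps = vectorized_boot_corr(x, y)
print(np.percentile(reps, [2.5, 97.5]))   # percentile CI for r
```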
Bootstrapping the O(N) archipelago
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kos, Filip; Poland, David; Simmons-Duffin, David
2015-11-17
We study 3d CFTs with an O(N) global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest-dimension O(N) vector φ_i and the lowest-dimension O(N) singlet s, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions (Δ_φ, Δ_s) to lie inside small islands. We also make rigorous determinations of current two-point functions in the O(2) and O(3) models, with applications to transport in condensed matter systems.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
Tests of Independence for Ordinal Data Using Bootstrap.
ERIC Educational Resources Information Center
Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai
1998-01-01
Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)
A Simulation Study of Missing Data with Multiple Missing X's
ERIC Educational Resources Information Center
Rubright, Jonathan D.; Nandakumar, Ratna; Glutting, Joseph J.
2014-01-01
When exploring missing data techniques in a realistic scenario, the current literature is limited: most studies only consider consequences with data missing on a single variable. This simulation study compares the relative bias of two commonly used missing data techniques when data are missing on more than one variable. Factors varied include type…
ERIC Educational Resources Information Center
Fan, Xitao
This paper empirically and systematically assessed the performance of bootstrap resampling procedure as it was applied to a regression model. Parameter estimates from Monte Carlo experiments (repeated sampling from population) and bootstrap experiments (repeated resampling from one original bootstrap sample) were generated and compared. Sample…
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
Because result replicability is essential to science yet difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
Salvatore, Stefania; Bramness, Jørgen G; Røislien, Jo
2016-07-12
Wastewater-based epidemiology (WBE) is a novel approach in drug use epidemiology which aims to monitor the extent of use of various drugs in a community. In this study, we investigate functional principal component analysis (FPCA) as a tool for analysing WBE data, and compare it to traditional principal component analysis (PCA) and to wavelet principal component analysis (WPCA), which is more flexible temporally. We analysed temporal wastewater data from 42 European cities, collected daily over one week in March 2013. The main temporal features of ecstasy (MDMA) were extracted using FPCA with both Fourier and B-spline basis functions and three different smoothing parameters, along with PCA and WPCA with different mother wavelets and shrinkage rules. The stability of FPCA was explored through bootstrapping and analysis of sensitivity to missing data. The first three principal components (PCs), functional principal components (FPCs) and wavelet principal components (WPCs) explained 87.5-99.6% of the temporal variation between cities, depending on the choice of basis and smoothing. The extracted temporal features from PCA, FPCA and WPCA were consistent. FPCA using a Fourier basis and common-optimal smoothing was the most stable and least sensitive to missing data. FPCA is a flexible and analytically tractable method for analysing temporal changes in wastewater data, and is robust to missing data. WPCA did not reveal any rapid temporal changes in the data not captured by FPCA. Overall the results suggest FPCA with Fourier basis functions and a common-optimal smoothing parameter as the most accurate approach when analysing WBE data.
Murphy, J; Gray, A; Cooper, C; Cooper, D; Ramsay, C; Carr, A
2016-12-01
A trial-based comparison of the use of resources, costs and quality of life outcomes of arthroscopic and open surgical management for rotator cuff tears in the United Kingdom NHS was performed using data from the United Kingdom Rotator Cuff Study (UKUFF) randomised controlled trial. Using data from 273 patients, healthcare-related use of resources, costs and quality-adjusted life years (QALYs) were estimated at 12 months and 24 months after surgery on an intention-to-treat basis with adjustment for covariates. Uncertainty about the incremental cost-effectiveness ratio for arthroscopic versus open management at 24 months of follow-up was incorporated using bootstrapping. Multiple imputation methods were used to deal with missing data. There were no significant differences between the arthroscopic and open groups in terms of total mean use and cost of resources or QALYs at any time post-operatively. Open management dominated arthroscopic management in 59.8% of bootstrapped cost and effect differences. The probability that arthroscopic management was cost-effective compared with open management at a willingness-to-pay threshold of £20 000 per QALY gained was 20.9%. There was no significant overall difference in the use or cost of resources or quality of life between arthroscopic and open management in the trial. There was uncertainty about which strategy was most cost-effective. Cite this article: Bone Joint J 2016;98-B:1648-55.
ERIC Educational Resources Information Center
Nevitt, Jonathan; Hancock, Gregory R.
2001-01-01
Evaluated the bootstrap method under varying conditions of nonnormality, sample size, model specification, and number of bootstrap samples drawn from the resampling space. Results for the bootstrap suggest the resampling-based method may be conservative in its control over model rejections, thus having an impact on the statistical power associated…
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
2017-02-01
scale blade servers (Dell PowerEdge) [20]. It must be recognized however, that the findings are distributed over this collection of architectures not...current operating system designs run into millions of lines of code. Moreover, they compound the opportunity for compromise by granting device drivers...properties (e.g. IP & MAC address) so as to invalidate an adversary’s surveillance data. The current running and bootstrapping instances of the micro
Quasi-Axially Symmetric Stellarators with 3 Field Periods
NASA Astrophysics Data System (ADS)
Garabedian, Paul; Ku, Long-Poe
1998-11-01
Compact hybrid configurations with 2 field periods have been studied recently as candidates for a proof of principle experiment at PPPL, cf. A. Reiman et al., Physics design of a high beta quasi-axially symmetric stellarator, J. Plas. Fus. Res. SERIES 1, 429(1998). This enterprise has led us to the discovery of a family of quasi-axially symmetric stellarators with 3 field periods that seem to have significant advantages, although their aspect ratios are a little larger. They have reversed shear and perform better in a local analysis of ballooning modes. Nonlinear equilibrium and stability calculations predict that the average beta limit may be as high as 6% if the bootstrap current turns out to be as big as that expected in comparable tokamaks. The concept relies on a combination of helical fields and bootstrap current to achieve adequate rotational transform at low aspect ratio. A detailed manuscript describing some of this work will be published soon, cf. P.R. Garabedian, Quasi-axially symmetric stellarators, Proc. Natl. Acad. Sci. USA 95 (1998).
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
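The saving can be demonstrated directly: for a high quantile, only resampled points that land among the top m order statistics can ever become the estimate, so it suffices to draw a binomial count and resample from the top subset. In the Python sketch below the sample size, subset size, and quantile are illustrative, and the full-sample bootstrap is included only for comparison (np.quantile interpolates, so the two intervals agree up to that convention).

```python
import numpy as np

rng = np.random.default_rng(6)
n, B, q = 50_000, 500, 0.999
x = np.sort(rng.gumbel(size=n))             # stand-in for a long model series

# Full-sample bootstrap of the q-quantile (what becomes expensive at scale).
full = [np.quantile(rng.choice(x, n), q) for _ in range(B)]

# Subset bootstrap: only points landing among the top m order statistics can
# become the q-quantile, so resample just the top of the sorted sample.
m = 1_000                                   # top 2%; must satisfy m >> (1-q)*n
top = x[-m:]
j = n - int(np.ceil(q * n)) + 1             # rank of the quantile from the top
subset = []
for _ in range(B):
    k = rng.binomial(n, m / n)              # resampled points hitting the top m
    draw = np.sort(rng.choice(top, size=k))
    subset.append(draw[-j])                 # j-th largest of the resample

print("full:  ", np.percentile(full, [2.5, 97.5]))
print("subset:", np.percentile(subset, [2.5, 97.5]))
```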
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
The Strengths and Limitations of South Africa's Search for Apartheid-Era Missing Persons.
Aronson, Jay D
2011-07-01
This article examines efforts to account for missing persons from the apartheid era in South Africa by family members, civil society organizations and the current government's Missing Persons Task Team, which emerged out of the Truth and Reconciliation Commission process. It focuses on how missing persons have been officially defined in the South African context and the extent to which the South African government is able to address the current needs and desires of relatives of the missing. I make two main arguments: that family members ought to have an active role in shaping the initiatives and institutions that seek to resolve the fate of missing people, and that the South African government ought to take a more holistic 'grave-to-grave' approach to the process of identifying, returning and reburying the remains of the missing.
NASA Astrophysics Data System (ADS)
Jimenez-Pizarro, R.; Rojas, A. M.; Pulido-Guio, A. D.
2012-12-01
The development of environmentally, socially and financially suitable greenhouse gas (GHG) mitigation portfolios requires detailed disaggregation of emissions by activity sector, preferably at the regional level. Bottom-up (BU) emission inventories are intrinsically disaggregated, but although detailed, they are frequently incomplete. Missing and erroneous activity data are rather common in emission inventories of GHG, criteria and toxic pollutants, even in developed countries. The fraction of missing and erroneous data can be rather large in developing-country inventories. In addition, the cost and time required to obtain or correct this information can be prohibitive, or can delay inventory development. This is particularly true for regional BU inventories in the developing world. Moreover, a rather common practice is to disregard missing data or to arbitrarily impute low default activity or emission values, which typically leads to significant underestimation of total emissions. Our investigation focuses on GHG emissions from fossil fuel combustion in industry in the Bogota Region, composed of Bogota and its adjacent, semi-rural area of influence, the Province of Cundinamarca. We found that the BU inventories for this sub-category substantially underestimate emissions when compared to top-down (TD) estimations based on sub-sector-specific national fuel consumption data and regional energy intensities. Although both BU inventories have a substantial number of missing and evidently erroneous entries, i.e. information on fuel consumption per combustion unit per company, the validated energy use and emission data display clear and smooth frequency distributions, which can be adequately fitted to bimodal log-normal distributions. This is not unexpected, as industrial plant sizes are typically log-normally distributed. Moreover, our statistical tests suggest that industrial sub-sectors, as classified by the International Standard Industrial Classification (ISIC), are also well represented by log-normal distributions. Using the validated data, we tested several missing data estimation procedures, including Monte Carlo sampling of the real and fitted distributions, and a per-ISIC estimation based on bootstrap-calculated mean values. These results will be presented and discussed in detail. Our results suggest that the accuracy of sub-sector BU emission inventories, particularly in developing regions, could be significantly improved if they are designed and carried out to be representative sub-samples (surveys) of the actual universe of emitters. A large fraction of the missing data could then be estimated by robust statistical procedures, provided that most of the emitters are accounted for by number and ISIC.
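As an illustration of the kind of statistical refilling described above, the Python sketch below fits a log-normal to the validated entries of a hypothetical sub-sector and compares Monte Carlo imputation from the fitted distribution with imputing a bootstrap-estimated mean; all numbers are synthetic, not the Bogota inventory data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical fuel-use survey for one ISIC sub-sector: plant sizes roughly
# log-normal; 25% of entries are missing.
true = rng.lognormal(mean=5.0, sigma=1.0, size=200)
obs = true.copy()
obs[rng.random(200) < 0.25] = np.nan
valid = obs[~np.isnan(obs)]

# Fit a log-normal to the validated entries.
shape, loc, scale = stats.lognorm.fit(valid, floc=0)

# Option A: Monte Carlo imputation from the fitted distribution.
n_miss = int(np.isnan(obs).sum())
mc_fill = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=n_miss,
                            random_state=rng)

# Option B: impute the bootstrap-estimated mean of the validated data.
boot_means = [rng.choice(valid, len(valid)).mean() for _ in range(2000)]
mean_fill = np.mean(boot_means)

print(f"true total {true.sum():.0f}, "
      f"MC-imputed total {valid.sum() + mc_fill.sum():.0f}, "
      f"mean-imputed total {valid.sum() + n_miss * mean_fill:.0f}")
```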
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping); common examples include logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap; it assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of the data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. For multivariate data, bootstrapping rows of PIT-residuals preserves correlation in the data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
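For intuition, here is a minimal univariate sketch of the PIT-residual idea for Poisson data, assuming the fitted means are already in hand. It is not the authors' implementation; the multivariate PIT-trap additionally resamples whole rows of the residual matrix to preserve correlation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy Poisson "fit": in practice mu would come from a fitted regression.
mu = np.array([1.2, 3.4, 0.7, 5.1, 2.2, 4.0])
y = rng.poisson(mu)

# Randomized PIT residuals for discrete data:
# u ~ Uniform(F(y-1), F(y)) under the fitted marginal distribution.
F_hi = stats.poisson.cdf(y, mu)
F_lo = stats.poisson.cdf(y - 1, mu)
u = rng.uniform(F_lo, F_hi)

# PIT-trap resample: bootstrap the (approximately pivotal) u's, then invert
# through the fitted marginals to obtain a bootstrap data set y_star.
u_star = rng.choice(u, size=u.size, replace=True)
y_star = stats.poisson.ppf(u_star, mu).astype(int)
print(y_star)
```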
Bootstrap Percolation on Homogeneous Trees Has 2 Phase Transitions
NASA Astrophysics Data System (ADS)
Fontes, L. R. G.; Schonmann, R. H.
2008-09-01
We study the threshold-θ bootstrap percolation model on the homogeneous tree with degree b+1, 2 ≤ θ ≤ b, and initial density p. It is known that there exists a nontrivial critical value for p, which we call p_f, such that a) for p > p_f, the final bootstrapped configuration is fully occupied for almost every initial configuration, and b) if p < p_f, then for almost every initial configuration, the final bootstrapped configuration has density of occupied vertices less than 1. In this paper, we establish the existence of a distinct critical value for p, p_c, such that 0 < p_c < p_f, with the following properties: 1) if p ≤ p_c, then for almost every initial configuration there is no infinite cluster of occupied vertices in the final bootstrapped configuration; 2) if p > p_c, then for almost every initial configuration there are infinite clusters of occupied vertices in the final bootstrapped configuration. Moreover, we show that 3) for p < p_c, the distribution of the occupied cluster size in the final bootstrapped configuration has an exponential tail; 4) at p = p_c, the expected occupied cluster size in the final bootstrapped configuration is infinite; 5) the probability of percolation of occupied vertices in the final bootstrapped configuration is continuous on [0, p_f] and analytic on (p_c, p_f), admitting an analytic continuation from the right at p_c and, only in the case θ = b, also from the left at p_f.
Heart rate reactivity and current post-traumatic stress disorder when data are missing.
Jeon-Slaughter, Haekyung; Tucker, Phebe; Pfefferbaum, Betty; North, Carol S; de Andrade, Bernardo Borba; Neas, Barbara
2011-08-01
This study demonstrates that auxiliary and exclusion-criteria variables increase the effectiveness of missing-data imputation in correcting the underestimation of physiologic reactivity in relation to post-traumatic stress disorder (PTSD) caused by deleting cases with missing physiologic data. This study used data from survivors of the 1995 Oklahoma City bombing and imputed missing heart rate data using auxiliary and exclusion-criteria variables. Logistic regression was used to examine heart rate reactivity in relation to current PTSD. Of 113 survivors who participated in the bombing study's 7-year follow-up interview, 42 (37%) had missing data on heart rate reactivity due to exclusion criteria (medical illness or use of cardiovascular or psychotropic medications) or non-participation. Logistic regression results based on heart rate data imputed using exclusion-criteria and auxiliary (the presence of any current PTSD arousal symptoms) variables showed that survivors with current bombing-related PTSD had significantly higher heart rates at baseline and recovered more slowly back to baseline heart rate during resting periods than survivors without current PTSD, whereas results based on complete cases failed to show significant associations between current PTSD and heart rate at any assessment point. The suggested methods yielded an otherwise undetectable link between physiology and current PTSD.
The Strengths and Limitations of South Africa’s Search for Apartheid-Era Missing Persons
Aronson, Jay D.
2011-01-01
This article examines efforts to account for missing persons from the apartheid era in South Africa by family members, civil society organizations and the current government’s Missing Persons Task Team, which emerged out of the Truth and Reconciliation Commission process. It focuses on how missing persons have been officially defined in the South African context and the extent to which the South African government is able to address the current needs and desires of relatives of the missing. I make two main arguments: that family members ought to have an active role in shaping the initiatives and institutions that seek to resolve the fate of missing people, and that the South African government ought to take a more holistic ‘grave-to-grave’ approach to the process of identifying, returning and reburying the remains of the missing. PMID:21984885
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance underestimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and its results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of basic working memory slave systems during childhood.
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data they require. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and uses bootstrap permutations of previous occurrences for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
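The core of the procedure, expected counts from past occurrences plus a resampling-based significance test, can be illustrated for a single grid cell. This is a deliberately simplified, hypothetical sketch (one cell, i.i.d. resampling of past days), not the authors' full space-time model.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical daily crime counts in one grid cell over the past 60 days.
past = rng.poisson(2.0, size=60)
today = 7                              # today's observed count

# Expected value comes from past occurrences, not population at risk.
expected = past.mean()

# Significance via resampling of previous occurrences:
# how often does a resampled "day" reach today's count?
n_boot = 5000
null = rng.choice(past, size=n_boot, replace=True)
p_value = (null >= today).mean()
print(f"expected {expected:.1f}, observed {today}, p = {p_value:.4f}")
```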
Augmenting Literacy: The Role of Expertise in Digital Writing
ERIC Educational Resources Information Center
Van Ittersum, Derek
2011-01-01
This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…
ERIC Educational Resources Information Center
Stapleton, Laura M.
2008-01-01
This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…
ERIC Educational Resources Information Center
Harrison, David
1979-01-01
The issue of observability and the relative roles of the senses and reason in understanding the world is reviewed. Eastern "mystical" philosophy serves as a focus in which interpretations of quantum mechanics, as well as the current bootstrap-quark controversy are seen in some slightly different contexts. (Author/GA)
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
Bootstrapping is distribution-independent, and so provides an indirect way to estimate the sample size for a clinical trial from a relatively small sample. In this paper, bootstrap sample size estimation for comparing two parallel-design arms on continuous data is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Sample size calculation by mathematical formulas (under the normal distribution assumption) for the identical data is also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. We then compared the powers determined using the two methods on data that violate the normal distribution assumption. To accommodate the features of the data, the nonparametric Wilcoxon test was applied to compare the two groups during the bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by the bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable to apply the bootstrap method for sample size calculation from the beginning, and to employ the same statistical method in each bootstrap sample as will be used in the subsequent statistical analysis, provided historical data are available that are representative of the population to which the proposed trial is intended to extrapolate.
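A minimal sketch of this kind of bootstrap power estimation follows, assuming a hypothetical pilot data set: resample both arms at a candidate size, apply the planned analysis (here the Wilcoxon rank-sum test, via scipy's mannwhitneyu), and count rejections. The pilot distributions and candidate sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical pilot data for two arms (skewed, non-normal).
pilot_a = rng.lognormal(mean=0.0, sigma=1.0, size=40)
pilot_b = rng.lognormal(mean=0.5, sigma=1.0, size=40)

def bootstrap_power(a, b, n_per_arm, n_boot=1000, alpha=0.05):
    """Estimate power at a candidate sample size by resampling the pilot
    arms and applying the planned analysis (Wilcoxon rank-sum test)."""
    hits = 0
    for _ in range(n_boot):
        a_star = rng.choice(a, size=n_per_arm, replace=True)
        b_star = rng.choice(b, size=n_per_arm, replace=True)
        if stats.mannwhitneyu(a_star, b_star).pvalue < alpha:
            hits += 1
    return hits / n_boot

# Walk candidate sample sizes upward until the estimated power is adequate.
for n in (30, 50, 70, 90):
    print(f"n = {n} per arm: power = {bootstrap_power(pilot_a, pilot_b, n):.2f}")
```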
Application of the Bootstrap Methods in Factor Analysis.
ERIC Educational Resources Information Center
Ichikawa, Masanori; Konishi, Sadanori
1995-01-01
A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal-theory maximum likelihood factor analysis when the distributional assumption was satisfied or violated. Problems arising with the use of bootstrap methods are highlighted. (SLD)
Rakel, David; Mundt, Marlon; Ewers, Tola; Fortney, Luke; Zgierska, Aleksandra; Gassman, Michele; Barrett, Bruce
2013-08-01
Acute respiratory infection (ARI) is among the most common, debilitating and expensive human illnesses. The purpose of this study was to assess ARI-related costs and determine whether mindfulness meditation or exercise can add value. One hundred and fifty-four adults ≥50 years from Madison, WI, for the 2009-10 cold/flu season were randomized to (i) wait-list control, (ii) meditation or (iii) moderate-intensity exercise. ARI-related costs were assessed through self-reported medication use, number of missed work days and medical visits. Costs per subject were based on the cost of generic medications, missed work days ($126.20) and clinic visits ($78.70). Monte Carlo bootstrap methods evaluated the reduced costs of ARI episodes. The total cost per subject for the control group was $214 (95% CI: $105-$358), exercise $136 (95% CI: $64-$232) and meditation $65 (95% CI: $34-$104). The majority of cost savings came through a reduction in missed days of work. Exercise had the highest medication costs at $16.60 compared with $5.90 for meditation (P = 0.004) and $7.20 for control (P = 0.046). Combining these cost benefits with the improved outcomes in incidence, duration and severity seen in the Meditation or Exercise for Preventing Acute Respiratory Infection study, meditation and exercise add value for ARI. Compared with control, meditation had the greatest cost benefit. This savings is offset by the cost of the intervention ($450/subject), which would negate the short-term but perhaps not the long-term savings. Meditation and exercise add value to ARI-associated health-related costs with improved outcomes. Further research is needed to confirm results and inform policies on adding value to medical spending.
The Impact of Respondent Burden on Current Drinker Rates.
Callinan, Sarah
2017-09-19
Increasing response burden in alcohol surveys, combined with filter questions that allow abstainers to skip the consumption items, results in systematically missing data on alcohol consumption. The aim of the current study is to assess the impact of respondent burden on current drinker rates in a large-scale Australian survey. A total of 23,855 Australian adults completed the National Drug Strategy Household Survey in 2013 and answered increasingly complex questions on alcohol consumption. Although 80% of respondents stated that they had consumed alcohol in the past 12 months, the current drinker rate appears to be 78% excluding, or 74% including, missing data when taken from the quantity-frequency measure. When respondents are then asked to give more detailed responses in a graduated-frequency measure, current drinker rates appear to be 75% or 73%, excluding or including missing data. The rate of abstention in alcohol survey research is artificially inflated when more complex survey methods are used, and excluding missing data only partially corrects for this. Given that more sensitive analyses are usually performed on more detailed survey questions, rates of abstention and consumption should be adjusted to account for systematically missing data.
External heating and current drive source requirements towards steady-state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, F. M.; Kessel, C. E.; Bonoli, P. T.; Batchelor, D. B.; Harvey, R. W.; Snyder, P. B.
2014-07-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with internal transport barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of heating and current drive (H/CD) sources that sustain reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that a combination of electron cyclotron (EC) and lower hybrid (LH) waves is a promising route towards steady state operation in ITER. The LH forms and sustains expanded barriers, and the EC deposition at mid-radius freezes the bootstrap current profile, stabilizing the barrier and leading to confinement levels 50% higher than typical H-mode energy confinement times. Using LH spectra centred on a parallel refractive index of 1.75-1.85, the performance of these plasma scenarios is close to the ITER target of 9 MA non-inductive current, global confinement gain H98 = 1.6 and fusion gain Q = 5.
Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E
2016-06-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.
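The flavor of the approach can be conveyed with a deliberately simplified sketch: a weighted Bayesian bootstrap in which Dirichlet weights scaled by case weights stand in for the full two-stage finite-population urn scheme. Everything here (data, weights, the population mean as the target) is hypothetical and much simpler than the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical survey sample: outcome values and final case weights.
y = rng.normal(50, 10, size=200)
w = rng.uniform(1, 8, size=200)        # unequal selection probabilities

def weighted_bayesian_bootstrap(y, w, n_draws=1000):
    """Posterior draws of the population mean under a simplified weighted
    Bayesian bootstrap: Dirichlet(1,...,1) weights rescaled by case weights."""
    means = np.empty(n_draws)
    for d in range(n_draws):
        g = rng.dirichlet(np.ones(y.size)) * w
        means[d] = np.sum(g * y) / g.sum()
    return means

draws = weighted_bayesian_bootstrap(y, w)
print(f"mean = {draws.mean():.2f}, 95% interval "
      f"[{np.percentile(draws, 2.5):.2f}, {np.percentile(draws, 97.5):.2f}]")
```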
Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.
Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng
2015-01-01
Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials.
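For reference, the procedure under scrutiny, a percentile bootstrap confidence interval for the indirect effect a*b, looks roughly like the sketch below. The simulated data set and OLS estimation details are invented for the example, and the paper's point is precisely that such intervals can mislead at n = 20-80.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated mediation data X -> M -> Y (true indirect effect = 0.3 * 0.4).
n = 60
x = rng.normal(size=n)
m = 0.3 * x + rng.normal(size=n)
y = 0.4 * m + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b from two OLS fits: a from M ~ X, b from Y ~ M + X."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones(x.size), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

n_boot = 5000
boot = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, n, n)            # resample cases with replacement
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])  # percentile CI for a*b
print(f"a*b = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```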
Testing of NASA LaRC Materials under MISSE 6 and MISSE 7 Missions
NASA Technical Reports Server (NTRS)
Prasad, Narasimha S.
2009-01-01
The objective of the Materials International Space Station Experiment (MISSE) is to study the performance of novel materials when subjected to the synergistic effects of the harsh space environment for several months. MISSE missions provide an opportunity for developing space-qualifiable materials. Two lasers and a few optical components from NASA Langley Research Center (LaRC) were included in the MISSE 6 mission for long-term exposure. MISSE 6 items were characterized and packed inside a ruggedized Passive Experiment Container (PEC) that resembles a suitcase. The PEC was tested for survivability under launch conditions. MISSE 6 was transported to the International Space Station (ISS) via STS-123 on March 11, 2008. The astronauts successfully attached the PEC to external handrails of the ISS and opened the PEC for long-term exposure to the space environment. The current plan is to bring the MISSE 6 PEC back to Earth via the STS-128 mission scheduled for launch in August 2009. Currently, preparations for the MISSE 7 mission are progressing. Laser and lidar components assembled on a flight-worthy platform are included from NASA LaRC. MISSE 7 is scheduled to be launched on the STS-129 mission. This paper briefly reviews recent efforts on the MISSE 6 and MISSE 7 missions at NASA Langley Research Center (LaRC).
Bootstrap confidence levels for phylogenetic trees.
Efron, B; Halloran, E; Holmes, S
1996-07-09
Evolutionary trees are often estimated from DNA or RNA sequence data. How much confidence should we have in the estimated trees? In 1985, Felsenstein [Felsenstein, J. (1985) Evolution 39, 783-791] suggested the use of the bootstrap to answer this question. Felsenstein's method, which in concept is a straightforward application of the bootstrap, is widely used, but has been criticized as biased in the genetics literature. This paper concerns the use of the bootstrap in the tree problem. We show that Felsenstein's method is not biased, but that it can be corrected to better agree with standard ideas of confidence levels and hypothesis testing. These corrections can be made by using the more elaborate bootstrap method presented here, at the expense of considerably more computation.
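Felsenstein's bootstrap resamples alignment columns with replacement, re-estimates the tree from each pseudo-alignment, and reports how often a clade of interest recurs. The toy sketch below substitutes a simple UPGMA clustering for full tree estimation; the four-taxon alignment and clade check are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(3)

# Toy alignment: rows = taxa (A, B, C, D), columns = sites, values = bases 0-3.
aln = rng.integers(0, 4, size=(4, 200))
aln[1] = aln[0]                      # make taxa A and B near-identical,
aln[1, :10] = (aln[1, :10] + 1) % 4  # differing at only 10 sites

def ab_clade(columns):
    """Cluster taxa from the resampled sites (UPGMA on Hamming distances)
    and report whether the split is {A,B} vs {C,D}."""
    labels = fcluster(linkage(pdist(columns, metric="hamming"),
                              method="average"), t=2, criterion="maxclust")
    return (labels[0] == labels[1] and labels[2] == labels[3]
            and labels[0] != labels[2])

n_boot = 200
hits = sum(ab_clade(aln[:, rng.integers(0, aln.shape[1], aln.shape[1])])
           for _ in range(n_boot))   # resample sites, re-estimate, count clade
print(f"bootstrap support for clade (A,B): {hits / n_boot:.2f}")
```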
DOE Office of Scientific and Technical Information (OSTI.GOV)
Batchelor, D.B.; Carreras, B.A.; Hirshman, S.P.
Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are three- and four-field-period, low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low aspect ratio, and low bootstrap current.
Working Memory Deficits and Social Problems in Children with ADHD
ERIC Educational Resources Information Center
Kofler, Michael J.; Rapport, Mark D.; Bolden, Jennifer; Sarver, Dustin E.; Raiker, Joseph S.; Alderson, R. Matt
2011-01-01
Social problems are a prevalent feature of ADHD and reflect a major source of functional impairment for these children. The current study examined the impact of working memory deficits on parent- and teacher-reported social problems in a sample of children with ADHD and typically developing boys (N = 39). Bootstrapped, bias-corrected mediation…
ERIC Educational Resources Information Center
Pejovic, Jovana; Molnar, Monika
2017-01-01
Recently it has been proposed that sensitivity to nonarbitrary relationships between speech sounds and objects potentially bootstraps lexical acquisition. However, it is currently unclear whether preverbal infants (e.g., before 6 months of age) with different linguistic profiles are sensitive to such nonarbitrary relationships. Here, the authors…
An algebraic approach to the analytic bootstrap
Alday, Luis F.; Zhiboedov, Alexander
2017-04-27
We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. Here, we analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.
Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime
Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; ...
2016-06-20
Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of β_p and β_N despite strong ITBs. Good confinement has been achieved with reduced toroidal rotation. These high-β_p plasmas challenge the energy transport understanding, especially in the electron energy channel. A new turbulent transport model, TGLF-SAT1, has been developed which improves the transport prediction. Experiments extending results to long pulse on EAST, based on the physics basis developed at DIII-D, have been conducted. Finally, more investigations will be carried out on EAST as additional auxiliary power comes online in the near term.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructs the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
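A minimal sketch of the idea, testing an exponential fit with expected counts taken under an MLE refitted to one parametric bootstrap draw, follows. The data, bin choice, and degrees of freedom are illustrative assumptions rather than the authors' exact recipe.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y = rng.exponential(scale=2.0, size=100)      # data to test against Exp(scale)

edges = np.quantile(y, np.linspace(0, 1, 6))  # 5 bins from data quantiles
edges[0], edges[-1] = 0.0, np.inf
obs, _ = np.histogram(y, edges)

# MLE of the original data, then a parametric bootstrap draw and its MLE.
scale_mle = y.mean()
y_boot = rng.exponential(scale=scale_mle, size=y.size)
scale_boot = y_boot.mean()                    # bootstrap-sample MLE

# Pearson statistic with expected counts evaluated under the bootstrap MLE.
expected = y.size * np.diff(stats.expon.cdf(edges, scale=scale_boot))
chi2 = ((obs - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2, df=len(obs) - 1)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```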
Rombach, Ines; Rivero-Arias, Oliver; Gray, Alastair M; Jenkinson, Crispin; Burke, Órlaith
2016-07-01
Patient-reported outcome measures (PROMs) are designed to assess patients' perceived health states or health-related quality of life. However, PROMs are susceptible to missing data, which can affect the validity of conclusions from randomised controlled trials (RCTs). This review aims to assess current practice in the handling, analysis and reporting of missing PROMs outcome data in RCTs compared to contemporary methodology and guidance. This structured review of the literature includes RCTs with a minimum of 50 participants per arm. Studies using the EQ-5D-3L, EORTC QLQ-C30, SF-12 and SF-36 were included if published in 2013; those using the less commonly implemented HUI, OHS, OKS and PDQ were included if published between 2009 and 2013. The review included 237 records (4-76 per relevant PROM). Complete-case analysis and single imputation were commonly used, in 33% and 15% of publications, respectively. Multiple imputation was reported for 9% of the PROMs reviewed. The majority of publications (93%) failed to describe the assumed missing data mechanism, while low numbers of papers reported methods to minimise missing data (23%), performed sensitivity analyses (22%) or discussed the potential influence of missing data on results (16%). Considerable discrepancy exists between approved methodology and current practice in handling, analysis and reporting of missing PROMs outcome data in RCTs. Greater awareness is needed of the potential biases introduced by inappropriate handling of missing data, as well as the importance of sensitivity analysis and clear reporting, to enable appropriate assessments of treatment effects and conclusions from RCTs.
Bootstrap Estimates of Standard Errors in Generalizability Theory
ERIC Educational Resources Information Center
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?
ERIC Educational Resources Information Center
Thompson, Bruce
Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…
Investigation of geomagnetic induced current at high latitude during the storm-time variation
NASA Astrophysics Data System (ADS)
Falayi, E. O.; Ogunmodimu, O.; Bolaji, O. S.; Ayanda, J. D.; Ojoniyi, O. S.
2017-06-01
During geomagnetic disturbances, geomagnetically induced currents (GICs) are driven by the geoelectric field in the conductive Earth. In this paper, we studied the variability of GICs, the time derivative of the geomagnetic field (dB/dt), geomagnetic indices (the symmetric disturbance field in H, SYM-H, and the AU (eastward electrojet) and AL (westward electrojet) indices), and interplanetary parameters such as solar wind speed (v) and interplanetary magnetic field (Bz) during the geomagnetic storms of 31 March 2001, 21 October 2001, 6 November 2001, 29 October 2003, 31 October 2003 and 9 November 2004, all with high solar wind speed due to coronal mass ejections. A wavelet-spectrum approach was employed to analyze the GIC time series over time scales of one to twenty-four hours. Power was concentrated between 14-24 h on 31 March 2001, 17-24 h on 21 October 2001, and 1-7 h on 6 November 2001; two peaks were observed, between 5-8 h and 21-24 h, on 29 October 2003, with concentrations at 1-3 h on 31 October 2003 and 18-22 h on 9 November 2004. The bootstrap method was used to obtain regression correlations between the time derivative of the geomagnetic field (dB/dt) and the observed values of the geomagnetically induced current on 31 March 2001, 21 October 2001, 6 November 2001, 29 October 2003, 31 October 2003 and 9 November 2004, which show distributed clusters of correlation coefficients at around r = -0.567, -0.717, -0.477, -0.419, -0.210 and r = -0.488, respectively. We observed that high-energy wavelet coefficients corresponded to strong bootstrap correlations, while low-energy wavelet coefficients gave low bootstrap correlations. Geomagnetic storms thus have a clear influence on GICs and the geomagnetic field derivative (dB/dt), which may be ascribed to coronal mass ejections with high solar wind speeds due to particle acceleration processes in the solar atmosphere.
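Bootstrapping a correlation coefficient of this kind is straightforward to sketch: resample paired (dB/dt, GIC) observations with replacement and read confidence bounds off the percentiles of the resampled coefficients. The series below are synthetic stand-ins, not storm data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical hourly series for one storm day: dB/dt (nT/min) and GIC (A).
dbdt = rng.normal(0, 30, size=24)
gic = -0.05 * dbdt + rng.normal(0, 0.8, size=24)

n_boot = 5000
r_boot = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, dbdt.size, dbdt.size)   # resample paired hours
    r_boot[i] = np.corrcoef(dbdt[idx], gic[idx])[0, 1]

r_hat = np.corrcoef(dbdt, gic)[0, 1]
lo, hi = np.percentile(r_boot, [2.5, 97.5])
print(f"r = {r_hat:.3f}, bootstrap 95% CI [{lo:.3f}, {hi:.3f}]")
```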
Transport in the plateau regime in a tokamak pedestal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seol, J.; Shaing, K. C.
In a tokamak H-mode, a strong E × B flow shear is generated during the L-H transition. Turbulence in a pedestal is suppressed significantly by this E × B flow shear. In this case, neoclassical transport may become important. The neoclassical fluxes are calculated in the plateau regime with the parallel plasma flow using their kinetic definitions. In an axisymmetric tokamak, the neoclassical particle fluxes can be decomposed into the banana-plateau flux and the Pfirsch-Schlueter flux. The banana-plateau particle flux is driven by the parallel viscous force and the Pfirsch-Schlueter flux by the poloidal variation of the friction force. The combined quantity of the radial electric field and the parallel flow is determined by the flux surface averaged parallel momentum balance equation rather than by requiring the ambipolarity of the total particle fluxes. In this process, the Pfirsch-Schlueter flux does not appear in the flux surface averaged parallel momentum equation. Only the banana-plateau flux is used to determine the parallel flow in the form of the flux surface averaged parallel viscosity. The heat flux, obtained using the solution of the parallel momentum balance equation, decreases exponentially in the presence of sonic M_p without any enhancement over that in the standard neoclassical theory. Here, M_p is a combination of the poloidal E × B flow and the parallel mass flow. The neoclassical bootstrap current in the plateau regime is presented. It indicates that the neoclassical bootstrap current also is related only to the banana-plateau fluxes. Finally, transport fluxes are calculated when M_p is large enough to make the parallel electron viscosity comparable with the parallel ion viscosity. It is found that the bootstrap current has a finite value regardless of the magnitude of M_p.
From current-driven to neoclassically driven tearing modes.
Reimerdes, H; Sauter, O; Goodman, T; Pochelon, A
2002-03-11
In the TCV tokamak, the m/n = 2/1 island is observed in low-density discharges with central electron-cyclotron current drive. The evolution of its width has two distinct growth phases, one of which can be linked to a "conventional" tearing mode driven unstable by the current profile and the other to a neoclassical tearing mode driven by a perturbation of the bootstrap current. The TCV results provide the first clear observation of such a destabilization mechanism and reconcile the theory of conventional and neoclassical tearing modes, which differ only in the dominant driving term.
Three-dimensional magnetohydrodynamic equilibrium of quiescent H-modes in tokamak systems
NASA Astrophysics Data System (ADS)
Cooper, W. A.; Graves, J. P.; Duval, B. P.; Sauter, O.; Faustin, J. M.; Kleiner, A.; Lanthaler, S.; Patten, H.; Raghunathan, M.; Tran, T.-M.; Chapman, I. T.; Ham, C. J.
2016-06-01
Three-dimensional free-boundary magnetohydrodynamic equilibria that recover saturated ideal kink/peeling structures are obtained numerically. Simulations that model the JET tokamak at fixed ⟨β⟩ = 1.7% with a large edge bootstrap current that flattens the q-profile near the plasma boundary demonstrate that a radial parallel current density ribbon with a dominant m/n = 5/1 Fourier component at I_t = 2.2 MA develops into a broadband spectrum when the toroidal current I_t is increased to 2.5 MA.
Impact of Sampling Density on the Extent of HIV Clustering
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor
2014-01-01
Identifying and monitoring HIV clusters could be useful in tracking the leading edge of HIV transmission in epidemics. Currently, greater specificity in the definition of HIV clusters is needed to reduce confusion in the interpretation of HIV clustering results. We address sampling density as one of the key aspects of HIV cluster analysis. The proportion of viral sequences in clusters was estimated at sampling densities from 1.0% to 70%. A set of 1,248 HIV-1C env gp120 V1C5 sequences from a single community in Botswana was utilized in simulation studies. Matching numbers of HIV-1C V1C5 sequences from the LANL HIV Database were used as comparators. HIV clusters were identified by phylogenetic inference under bootstrapped maximum likelihood and pairwise distance cut-offs. Sampling density below 10% was associated with stochastic HIV clustering with broad confidence intervals. HIV clustering increased linearly at sampling density >10%, and was accompanied by narrowing confidence intervals. Patterns of HIV clustering were similar at bootstrap thresholds 0.7 to 1.0, but the extent of HIV clustering decreased with higher bootstrap thresholds. The origin of sampling (local concentrated vs. scattered global) had a substantial impact on HIV clustering at sampling densities ≥10%. Pairwise distances at 10% were estimated as a threshold for cluster analysis of HIV-1 V1C5 sequences. The node bootstrap support distribution provided additional evidence for 10% sampling density as the threshold for HIV cluster analysis. The detectability of HIV clusters is substantially affected by sampling density. A minimal genotyping density of 10% and sampling density of 50-70% are suggested for HIV-1 V1C5 cluster analysis. PMID:25275430
Visceral sensitivity, anxiety, and smoking among treatment-seeking smokers.
Zvolensky, Michael J; Bakhshaie, Jafar; Norton, Peter J; Smits, Jasper A J; Buckner, Julia D; Garey, Lorra; Manning, Kara
2017-12-01
It is widely recognized that smoking is related to abdominal pain and discomfort, as well as gastrointestinal disorders. Research has shown that visceral sensitivity, experiencing anxiety around gastrointestinal sensations, is associated with poorer gastrointestinal health and related health outcomes. Visceral sensitivity also increases anxiety symptoms and mediates the relation with other risk factors, including gastrointestinal distress. No work to date, however, has evaluated visceral sensitivity in the context of smoking despite the strong association between smoking and poor physical and mental health. The current study sought to examine visceral sensitivity as a unique predictor of cigarette dependence, threat-related smoking abstinence expectancies (somatic symptoms and harmful consequences), and perceived barriers for cessation via anxiety symptoms. Eighty-four treatment-seeking adult daily smokers (M age = 45.1 years [SD = 10.4]; 71.6% male) participated in this study. There was a statistically significant indirect effect of visceral sensitivity via general anxiety symptoms on cigarette dependence (b = 0.02, SE = 0.01, bootstrapped 95% CI [0.006, 0.05]), smoking abstinence somatic expectancies (b = 0.10, SE = 0.03, bootstrapped 95% CI [0.03, 0.19]), smoking abstinence harmful experiences (b = 0.13, SE = 0.05, bootstrapped 95% CI [0.03, 0.25]), and barriers to cessation (b = 0.05, SE = 0.06, bootstrapped 95% CI [0.01, 0.13]). Overall, the present study serves as an initial investigation into the nature of the associations between visceral sensitivity, anxiety symptoms, and clinically significant smoking processes among treatment-seeking smokers. Future work is needed to explore the extent to which anxiety accounts for relations between visceral sensitivity and other smoking processes (e.g., withdrawal, cessation outcome).
Unbiased Estimates of Variance Components with Bootstrap Procedures
ERIC Educational Resources Information Center
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
Explorations in Statistics: the Bootstrap
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
Bootstrapping Confidence Intervals for Robust Measures of Association.
ERIC Educational Resources Information Center
King, Jason E.
A Monte Carlo simulation study was conducted to determine the bootstrap correction formula yielding the most accurate confidence intervals for robust measures of association. Confidence intervals were generated via the percentile, adjusted, BC, and BC(a) bootstrap procedures and applied to the Winsorized, percentage bend, and Pearson correlation…
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivastava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
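A residual-bootstrap version of such intervals can be sketched with a simple kernel smoother: refit on resampled residuals to capture model variability, add a second residual draw for observation noise, and flag points outside the percentile band. The smoother, bandwidth, and i.i.d.-residual assumption are simplifications for illustration, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(6)

x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + rng.normal(0, 0.3, size=x.size)

def ksmooth(x_train, y_train, x_eval, h=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

fit = ksmooth(x, y, x)
resid = y - fit

# Bootstrap: resample residuals, refit, and collect predictive draws.
n_boot = 500
draws = np.empty((n_boot, x.size))
for b in range(n_boot):
    y_star = fit + rng.choice(resid, size=resid.size, replace=True)
    fit_star = ksmooth(x, y_star, x)          # model variability
    draws[b] = fit_star + rng.choice(resid, size=resid.size, replace=True)

lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
flagged = (y < lo) | (y > hi)                 # points outside the interval
print(f"{flagged.sum()} of {x.size} points flagged as anomalous")
```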
High performance advanced tokamak regimes in DIII-D for next-step experiments
NASA Astrophysics Data System (ADS)
Greenfield, C. M.; Murakami, M.; Ferron, J. R.; Wade, M. R.; Luce, T. C.; Petty, C. C.; Menard, J. E.; Petrie, T. W.; Allen, S. L.; Burrell, K. H.; Casper, T. A.; DeBoo, J. C.; Doyle, E. J.; Garofalo, A. M.; Gorelov, I. A.; Groebner, R. J.; Hobirk, J.; Hyatt, A. W.; Jayakumar, R. J.; Kessel, C. E.; La Haye, R. J.; Jackson, G. L.; Lohr, J.; Makowski, M. A.; Pinsker, R. I.; Politzer, P. A.; Prater, R.; Strait, E. J.; Taylor, T. S.; West, W. P.; DIII-D Team
2004-05-01
Advanced Tokamak (AT) research in DIII-D [K. H. Burrell for the DIII-D Team, in Proceedings of the 19th Fusion Energy Conference, Lyon, France, 2002 (International Atomic Energy Agency, Vienna, 2002) published on CD-ROM] seeks to provide a scientific basis for steady-state high performance operation in future devices. These regimes require high toroidal beta to maximize fusion output and high poloidal beta to maximize the self-driven bootstrap current. Achieving these conditions requires integrated, simultaneous control of the current and pressure profiles, and active magnetohydrodynamic stability control. The building blocks for AT operation are in hand. Resistive wall mode stabilization via plasma rotation and active feedback with nonaxisymmetric coils allows routine operation above the no-wall beta limit. Neoclassical tearing modes are stabilized by active feedback control of localized electron cyclotron current drive (ECCD). Plasma shaping and profile control provide further improvements. Under these conditions, bootstrap supplies most of the current. Steady-state operation requires replacing the remaining Ohmic current, mostly located near the half radius, with noninductive external sources. In DIII-D this current is provided by ECCD, and nearly stationary AT discharges have been sustained with little remaining Ohmic current. Fast wave current drive is being developed to control the central magnetic shear. Density control, with divertor cryopumps, of AT discharges with ELMing H-mode edges facilitates high current drive efficiency at reactor-relevant collisionalities. A sophisticated plasma control system allows integrated control of these elements. Close coupling between modeling and experiment is key to understanding the separate elements, their complex nonlinear interactions, and their integration into self-consistent high performance scenarios. Progress on this development, and its implications for next-step devices, will be illustrated by results of recent experiment and simulation efforts.
Schneider, Kevin; Koblmüller, Stephan; Sefc, Kristina M
2015-11-11
The homoplasy excess test (HET) is a tree-based screen for hybrid taxa in multilocus nuclear phylogenies. Homoplasy between a hybrid taxon and the clades containing the parental taxa reduces bootstrap support in the tree. The HET is based on the expectation that excluding the hybrid taxon from the data set increases the bootstrap support for the parental clades, whereas excluding non-hybrid taxa has little effect on statistical node support. To carry out a HET, bootstrap trees are calculated with taxon-jackknife data sets, that is, excluding one taxon (species, population) at a time. An excess increase in bootstrap support for certain nodes upon exclusion of a particular taxon indicates the hybrid (the excluded taxon) and its parents (the clades with increased support). We introduce a new software program, hext, which generates the taxon-jackknife data sets, runs the bootstrap tree calculations, and identifies excess bootstrap increases as outlier values in boxplot graphs. hext is written in the R language and accepts binary data (0/1; e.g. AFLP) as well as co-dominant SNP and genotype data. We demonstrate the usefulness of hext in large SNP data sets containing putative hybrids and their parents. For instance, using published data of the genus Vitis (~6,000 SNP loci), hext output supports V. × champinii as a hybrid between V. rupestris and V. mustangensis. With simulated SNP and AFLP data sets, excess increases in bootstrap support were not always connected with the hybrid taxon (false positives), whereas the expected bootstrap signal failed to appear on several occasions (false negatives). Potential causes for both types of spurious results are discussed. With both empirical and simulated data sets, the taxon-jackknife output generated by hext provided additional signatures of hybrid taxa, including changes in tree topology across trees, consistent effects of exclusions of the hybrid and the parent taxa, and moderate (rather than excessive) increases in bootstrap support. hext significantly facilitates the taxon-jackknife approach to hybrid taxon detection, even though the simple test for excess bootstrap increase may not reliably identify hybrid taxa in all applications.
Core transport properties in JT-60U and JET identity plasmas
NASA Astrophysics Data System (ADS)
Litaudon, X.; Sakamoto, Y.; de Vries, P. C.; Salmi, A.; Tala, T.; Angioni, C.; Benkadda, S.; Beurskens, M. N. A.; Bourdelle, C.; Brix, M.; Crombé, K.; Fujita, T.; Futatani, S.; Garbet, X.; Giroud, C.; Hawkes, N. C.; Hayashi, N.; Hoang, G. T.; Hogeweij, G. M. D.; Matsunaga, G.; Nakano, T.; Oyama, N.; Parail, V.; Shinohara, K.; Suzuki, T.; Takechi, M.; Takenaga, H.; Takizuka, T.; Urano, H.; Voitsekhovitch, I.; Yoshida, M.; ITPA Transport Group; JT-60 Team; EFDA contributors, JET
2011-07-01
The paper compares the transport properties of a set of dimensionless identity experiments performed between JET and JT-60U in the advanced tokamak regime with an internal transport barrier, ITB. These International Tokamak Physics Activity (ITPA) joint experiments were carried out with the same plasma shape, toroidal magnetic field ripple and dimensionless profiles as close as possible during the ITB triggering phase in terms of safety factor, normalized Larmor radius, normalized collision frequency, thermal beta and ratio of ion to electron temperatures. Similarities in the ITB triggering mechanisms and sustainment were observed when a good match was achieved of the most relevant normalized profiles except the toroidal Mach number. Similar thermal ion transport levels in the two devices have been measured with either monotonic or non-monotonic q-profiles. In contrast, differences between JET and JT-60U were observed in the electron thermal and particle confinement in reversed magnetic shear configurations. It was found that the larger shear reversal in the very centre (inside a normalized radius of 0.2) of JT-60U plasmas allowed the sustainment of stronger electron density ITBs compared with JET. As a consequence of the peaked density profile, the core bootstrap current density is more than five times higher in JT-60U than in JET. Thanks to the bootstrap effect and the slightly broader neutral beam deposition, reversed magnetic shear configurations are self-sustained in JT-60U scenarios. Analyses of similarities and differences between the two devices address key questions on the validity of the usual assumptions made in ITER steady-state scenario modelling, e.g. whether a flat density profile can be assumed in the core within the thermal transport barrier. Such assumptions have consequences for the prediction of fusion performance, the bootstrap current and the sustainment of the scenario.
Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.
ERIC Educational Resources Information Center
Thompson, Bruce; Fan, Xitao
This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
Unconventional Expressions: Productive Syntax in the L2 Acquisition of Formulaic Language
ERIC Educational Resources Information Center
Bardovi-Harlig, Kathleen; Stringer, David
2017-01-01
This article presents a generative analysis of the acquisition of formulaic language as an alternative to current usage-based proposals. One influential view of the role of formulaic expressions in second language (L2) development is that they are a bootstrapping mechanism into the L2 grammar; an initial repertoire of constructions allows for…
Epistemic uncertainty in the location and magnitude of earthquakes in Italy from macroseismic data
Bakun, W.H.; Gomez, Capera A.; Stucchi, M.
2011-01-01
Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
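The percentile-bootstrap logic described for magnitudes can be sketched compactly; estimate_magnitude below is an assumed stand-in for any of the three intensity-based techniques, and the 16th/84th percentiles give the 68% bounds discussed above.

```python
# Sketch of percentile-bootstrap uncertainty for an intensity-based magnitude.
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_magnitude(intensities, estimate_magnitude, n_boot=1000):
    """intensities: 1-D array of intensity observations for one earthquake.
    estimate_magnitude(intensities) -> scalar magnitude (assumed stand-in)."""
    mags = []
    for _ in range(n_boot):
        resampled = rng.choice(intensities, size=len(intensities), replace=True)
        mags.append(estimate_magnitude(resampled))
    # preferred magnitude = median; the 68% bounds enclose 68% of bootstrap values
    return np.median(mags), np.percentile(mags, [16, 84])
```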
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
ERIC Educational Resources Information Center
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the Student's t confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
The Beginner's Guide to the Bootstrap Method of Resampling.
ERIC Educational Resources Information Center
Lane, Ginny G.
The bootstrap method of resampling can be useful in estimating the replicability of study results. The bootstrap procedure creates a mock population from a given sample of data from which multiple samples are then drawn. The method extends the usefulness of the jackknife procedure as it allows for computation of a given statistic across a maximal…
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
ERIC Educational Resources Information Center
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
ERIC Educational Resources Information Center
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes, such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead, conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences is generated using a stochastic streamflow model with random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
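A hedged sketch of the resampling idea: an AR(1) monthly log-flow model (an illustrative choice, not necessarily the authors' model) whose innovations are bootstrapped to build traces conditioned on the current flow.

```python
# Illustrative bootstrap position analysis: resample model innovations to
# build many conditional monthly flow traces.
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_traces(flows, n_traces=1000, horizon=12):
    x = np.log(np.asarray(flows, dtype=float))
    mu = x.mean()
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]            # AR(1) coefficient
    resid = x[1:] - (mu + phi * (x[:-1] - mu))        # model innovations
    traces = np.empty((n_traces, horizon))
    for i in range(n_traces):
        level = x[-1]                                 # condition on current flow
        for t in range(horizon):
            level = mu + phi * (level - mu) + rng.choice(resid)
            traces[i, t] = np.exp(level)
    return traces                                     # feed into storage simulation

# e.g. probability the next-3-month minimum falls below a statutory passing
# flow (historical_monthly_flows and passing_flow are hypothetical inputs):
# traces = bootstrap_traces(historical_monthly_flows)
# p = (traces[:, :3].min(axis=1) < passing_flow).mean()
```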
Bootstrap position analysis for forecasting low flow frequency
Tasker, Gary D.; Dunne, P.
1997-01-01
A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff to be used in a position analysis model for a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows, conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and accounting for parameter uncertainty is easily done. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules. © ASCE.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations in small samples. We used a pooled-resampling nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
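A minimal sketch of a pooled-resampling bootstrap test for two independent means, in the spirit of the method described; the null-centring and resampling details of the published algorithm may differ.

```python
# Pooled-resampling bootstrap t-test (illustrative version).
import numpy as np

rng = np.random.default_rng(2)

def pooled_bootstrap_ttest(x, y, n_boot=10000):
    x, y = np.asarray(x, float), np.asarray(y, float)
    def tstat(a, b):
        return (a.mean() - b.mean()) / np.sqrt(
            a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    t_obs = tstat(x, y)
    # centre each sample, then pool: enforces the null of equal means
    pooled = np.concatenate([x - x.mean(), y - y.mean()])
    count = 0
    for _ in range(n_boot):
        xb = rng.choice(pooled, size=len(x), replace=True)
        yb = rng.choice(pooled, size=len(y), replace=True)
        if abs(tstat(xb, yb)) >= abs(t_obs):
            count += 1
    return t_obs, count / n_boot        # two-sided bootstrap p-value
```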
Extensions to the visual predictive check to facilitate model performance evaluation.
Post, Teun M; Freijer, Jan I; Ploeger, Bart A; Danhof, Meindert
2008-04-01
The Visual Predictive Check (VPC) is a valuable and supportive instrument for evaluating model performance. However, in its most commonly applied form, the method largely depends on a subjective comparison of the distribution of the simulated data with the observed data, without explicitly quantifying and relating the information in both. In recent adaptations to the VPC this drawback is taken into consideration by presenting the observed and predicted data as percentiles. In addition, in some of these adaptations the uncertainty in the predictions is represented visually. However, it is not assessed whether the expected random distribution of the observations around the predicted median trend is realised in relation to the number of observations. Moreover, the influence of, and the information residing in, missing data at each time point is not taken into consideration. Therefore, in this investigation the VPC is extended with two methods to support a less subjective and thereby more adequate evaluation of model performance: (i) the Quantified Visual Predictive Check (QVPC) and (ii) the Bootstrap Visual Predictive Check (BVPC). The QVPC presents the distribution of the observations as a percentage, regardless of the density of the data, above and below the predicted median at each time point, while also visualising the percentage of unavailable data. The BVPC weighs the predicted median against the 5th, 50th and 95th percentiles resulting from a bootstrap of the observed data median at each time point, while accounting for the number and the theoretical position of unavailable data. The proposed extensions to the VPC are illustrated by a pharmacokinetic simulation example and applied to a pharmacodynamic disease progression example.
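The bootstrap component of the BVPC can be sketched as follows; the handling of the theoretical position of unavailable data is omitted, so this shows only the core median-bootstrap step.

```python
# Core BVPC step: bootstrap the observed median at each time point and
# return the 5th/50th/95th percentile bands to weigh against the
# model-predicted median.
import numpy as np

rng = np.random.default_rng(3)

def bvpc_bands(observations, n_boot=2000):
    """observations: list of 1-D arrays, one per time point (sizes may
    differ because of missing data)."""
    bands = []
    for obs in observations:
        obs = np.asarray(obs, float)
        meds = [np.median(rng.choice(obs, size=len(obs), replace=True))
                for _ in range(n_boot)]
        bands.append(np.percentile(meds, [5, 50, 95]))
    return np.array(bands)   # compare each row against the predicted median
```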
Chatterton, Mary Lou; Chambers, Suzanne; Occhipinti, Stefano; Girgis, Afaf; Dunn, Jeffrey; Carter, Rob; Shih, Sophy; Mihalopoulos, Cathrine
2016-07-01
This study compared the cost-effectiveness of a psychologist-led, individualised cognitive behavioural intervention (PI) to a nurse-led, minimal contact self-management condition for highly distressed cancer patients and carers. This was an economic evaluation conducted alongside a randomised trial of highly distressed adult cancer patients and carers calling cancer helplines. Services used by participants were measured using a resource use questionnaire, and quality-adjusted life years (QALYs) were measured using the Assessment of Quality of Life - Eight Dimension instrument collected through a computer-assisted telephone interview. The base case analysis stratified participants based on the baseline score on the Brief Symptom Inventory. Incremental cost-effectiveness ratio confidence intervals were calculated with a nonparametric bootstrap to reflect sampling uncertainty. The results were subjected to sensitivity analysis by varying unit costs for resource use and the method for handling missing data. No significant differences were found in overall total costs or QALYs between intervention groups. Bootstrapped data suggest the PI had a higher probability of lower cost and greater QALYs for both carers and patients with high distress at baseline. For patients with low levels of distress at baseline, the PI had a higher probability of greater QALYs but at additional cost. Sensitivity analysis showed the results were robust. The PI may be cost-effective compared with the nurse-led, minimal contact self-management condition for highly distressed cancer patients and carers. More intensive psychological intervention for patients with greater levels of distress appears warranted. Copyright © 2015 John Wiley & Sons, Ltd.
Essex, Holly; Parrott, Steve; Atkin, Karl; Ballard, Kathleen; Bland, Martin; Eldred, Janet; Hewitt, Catherine; Hopton, Ann; Keding, Ada; Lansdown, Harriet; Richmond, Stewart; Tilbrook, Helen; Torgerson, David; Watt, Ian; Wenham, Aniela; Woodman, Julia; MacPherson, Hugh
2017-01-01
Objectives To assess the cost-effectiveness of acupuncture and usual care, and Alexander Technique lessons and usual care, compared with usual GP care alone for chronic neck pain patients. Methods An economic evaluation was undertaken alongside the ATLAS trial, taking both NHS and wider societal viewpoints. Participants were offered up to twelve acupuncture sessions or twenty Alexander lessons (equivalent overall contact time). Costs were in pounds sterling. Effectiveness was measured using the generic EQ-5D to calculate quality adjusted life years (QALYs), as well as using a specific neck pain measure, the Northwick Park Neck Pain Questionnaire (NPQ). Results In the base case analysis, incremental QALY gains were 0.032 and 0.025 in the acupuncture and Alexander groups, respectively, in comparison to usual GP care, indicating moderate health benefits for both interventions. Incremental costs were £451 for acupuncture and £667 for Alexander, mainly driven by intervention costs. Acupuncture was likely to be cost-effective (ICER = £18,767/QALY, bootstrapped 95% CI £4,426 to £74,562) and was robust to most sensitivity analyses. Alexander lessons were not cost-effective at the lower NICE threshold of £20,000/QALY (£25,101/QALY, bootstrapped 95% CI -£150,208 to £248,697) but may be at £30,000/QALY; however, there was considerable statistical uncertainty in all tested scenarios. Conclusions In comparison with usual care, acupuncture is likely to be cost-effective for chronic neck pain, whereas, largely due to higher intervention costs, Alexander lessons are unlikely to be cost-effective. However, there were high levels of missing data and further research is needed to assess the long-term cost-effectiveness of these interventions. PMID:29211741
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
Non-inductive current drive and transport in high βN plasmas in JET
NASA Astrophysics Data System (ADS)
Voitsekhovitch, I.; Alper, B.; Brix, M.; Budny, R. V.; Buratti, P.; Challis, C. D.; Ferron, J.; Giroud, C.; Joffrin, E.; Laborde, L.; Luce, T. C.; McCune, D.; Menard, J.; Murakami, M.; Park, J. M.; JET-EFDA contributors
2009-05-01
A route to stationary MHD-stable operation at high βN has been explored at the Joint European Torus (JET) by optimizing the current ramp-up, heating start time and the waveform of neutral beam injection (NBI) power. In these scenarios the current ramp-up has been accompanied by plasma pre-heat (or the NBI has been started before the current flat-top) and NBI power up to 22 MW has been applied during the current flat-top. In the discharges considered, transient total βN ≈ 3.3 and stationary (during the high power phase) βN ≈ 3 have been achieved by applying feedback control of βN with the NBI power, in configurations with a monotonic or flat core safety factor profile and without an internal transport barrier (ITB). The transport and current drive in this scenario are analysed here using the TRANSP and ASTRA codes. The interpretative analysis performed with TRANSP shows that 50-70% of the current is driven non-inductively; half of this current is due to the bootstrap current, which has a broad profile since an ITB was deliberately avoided. The GLF23 transport model predicts the temperature profiles within a ±22% discrepancy with the measurements over the explored parameter space. Predictive simulations with this model show that the E × B rotational shear plays an important role in thermal ion transport in this scenario, producing up to a 40% increase of the ion temperature. By applying transport and current drive models validated in self-consistent simulations of given reference scenarios over a wider parameter space, the requirements for fully non-inductive stationary operation at JET are estimated. It is shown that the strong stiffness of the temperature profiles predicted by the GLF23 model restricts the bootstrap current at larger heating power. In this situation, fully non-inductive operation without an ITB can be rather demanding, relying strongly on the external non-inductive current drive sources.
NASA Astrophysics Data System (ADS)
Coupon, Jean; Leauthaud, Alexie; Kilbinger, Martin; Medezinski, Elinor
2017-07-01
SWOT (Super W Of Theta) computes two-point statistics for very large data sets, based on "divide and conquer" algorithms: mainly, but not limited to, data storage in binary trees, approximation at large scale, parallelization (Open MPI), and bootstrap and jackknife resampling methods "on the fly". It currently supports projected and 3D galaxy auto- and cross-correlations, galaxy-galaxy lensing, and weighted histograms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Brooks; A.H. Reiman; G.H. Neilson
High-beta, low-aspect-ratio (compact) stellarators are promising solutions to the problem of developing a magnetic plasma configuration for magnetic fusion power plants that can be sustained in steady state without disrupting. These concepts combine features of stellarators and advanced tokamaks and have aspect ratios similar to those of tokamaks (2-4). They are based on computed plasma configurations that are shaped in three dimensions to provide desired stability and transport properties. Experiments are planned as part of a program to develop this concept. A beta = 4% quasi-axisymmetric plasma configuration has been evaluated for the National Compact Stellarator Experiment (NCSX). It has a substantial bootstrap current and is shaped to stabilize ballooning, external kink, vertical, and neoclassical tearing modes without feedback or close-fitting conductors. Quasi-omnigeneous plasma configurations stable to ballooning modes at beta = 4% have been evaluated for the Quasi-Omnigeneous Stellarator (QOS) experiment. These equilibria have relatively low bootstrap currents and are insensitive to changes in beta. Coil configurations have been calculated that reconstruct these plasma configurations, preserving their important physics properties. Theory- and experiment-based confinement analyses are used to evaluate the technical capabilities needed to reach target plasma conditions. The physics basis for these complementary experiments is described.
They Remember the "Lost" People.
ERIC Educational Resources Information Center
Klages, Karen
Estimates of the number of children currently missing in the United States are only approximate because there is no effective central data bank to collect information on missing persons and unidentified bodies. However, the problem appears to have reached epidemic proportions. Some parents of missing persons have formed organizations in different…
Cárdenas Castro, Manuel; Arnoso Martínez, Maitane; Faúndez Abarca, Ximena
2016-04-06
This study examines the role of coping strategies related to positive reappraisal versus other cognitive strategies (deliberate rumination) as mediators between life impact and posttraumatic growth in survivors of the military dictatorship in Chile between 1973 and 1990 (tortured political prisoners and family members of political prisoners executed and missing). Survey data from 251 political violence survivors were analyzed using the SPSS PROCESS macro for bootstrapping indirect effects (Hayes, 2013). Results indicated that positive reappraisal (or reframing) coping mediated the relationship between life impact and posttraumatic growth. A serial multiple mediation model indicates that, in the process leading from life impact to growth, rumination must be followed by positive reappraisal to drive this growth. These findings suggest that positive reappraisal of the traumatic experience is essential for reports of growth. Implications of these more complex relations are discussed for both counseling interventions and further research. © The Author(s) 2016.
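A generic percentile-bootstrap of an indirect effect a*b, illustrating the kind of computation the PROCESS macro performs; the simple OLS fits and 5,000 resamples here are illustrative assumptions, not the macro's implementation.

```python
# Percentile-bootstrap CI for a simple indirect (mediation) effect a*b.
import numpy as np

rng = np.random.default_rng(4)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                    # slope of M ~ X
    design = np.column_stack([np.ones_like(x), x, m])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    b = beta[2]                                   # slope of Y ~ M, controlling X
    return a * b

def bootstrap_indirect(x, m, y, n_boot=5000):
    x, m, y = (np.asarray(v, float) for v in (x, m, y))
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)               # resample cases with replacement
        est.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(est, [2.5, 97.5])        # 95% bootstrap CI
```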
Progress and pitfalls in finding the 'missing proteins' from the human proteome map.
Segura, Victor; Garin-Muga, Alba; Guruceaga, Elizabeth; Corrales, Fernando J
2017-01-01
The Human Proteome Project was launched with two main goals: the comprehensive and systematic definition of the human proteome map and the development of ready-to-use analytical tools to measure relevant proteins in their biological context in health and disease. Despite the great progress in this endeavour, there is still a group of reluctant proteins with no, or scarce, experimental evidence supporting their existence. These are called the 'missing proteins' and represent one of the biggest challenges in completing the human proteome map. Areas covered: This review focuses on the description of the missing proteome based on the HUPO standards, the analysis of the reasons why missing proteins are difficult to detect, and the strategies currently used in the search for missing proteins. The present and future of the quest for the missing proteins are critically reviewed hereafter. Expert commentary: An overarching multidisciplinary effort is currently being made under the HUPO umbrella to allow completion of the human proteome map. It is expected that the detection of missing proteins will grow in the coming years since the methods and the best tissue/cell type samples for their search are already on the table.
NASA Astrophysics Data System (ADS)
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve performance superior to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easier to calibrate.
Confidence limit calculation for antidotal potency ratio derived from lethal dose 50
Manage, Ananda; Petrikovics, Ilona
2013-01-01
AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method, invented by Efron, can easily be adapted to construct confidence intervals in situations like this. The bootstrap method is a resampling method in which the bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require the sampling distribution of the antidotal potency ratio. This can serve as a substantial help for toxicologists, who are directed to employ the Dixon up-and-down method with a lower number of animals to determine lethal dose 50 values for characterizing the investigated toxic molecules and, eventually, the antidotal protection provided by the test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes the calculation easy with most programming software packages. PMID:25237618
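A sketch of the described calculation, taking the antidotal potency ratio as a ratio of two LD50 estimates; the ld50 estimator (e.g. an up-and-down or probit routine) is an assumed user-supplied function.

```python
# Percentile-bootstrap CI for a potency ratio (ratio of two LD50 estimates).
import numpy as np

rng = np.random.default_rng(5)

def bootstrap_ratio_ci(doses_a, deaths_a, doses_b, deaths_b,
                       ld50, n_boot=2000, alpha=0.05):
    """ld50(doses, deaths) -> scalar LD50 estimate (assumed user-supplied)."""
    doses_a, deaths_a, doses_b, deaths_b = map(
        np.asarray, (doses_a, deaths_a, doses_b, deaths_b))
    n_a, n_b = len(doses_a), len(doses_b)
    ratios = []
    for _ in range(n_boot):
        ia = rng.integers(0, n_a, n_a)        # resample animals with replacement
        ib = rng.integers(0, n_b, n_b)
        ratios.append(ld50(doses_b[ib], deaths_b[ib]) /
                      ld50(doses_a[ia], deaths_a[ia]))
    lo, hi = np.percentile(ratios, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi
```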
Topics in Statistical Calibration
2014-03-27
on a parametric bootstrap where, instead of sampling directly from the residuals, samples are drawn from a normal distribution. This procedure will... addition to centering them (Davison and Hinkley, 1997). When there are outliers in the residuals, the bootstrap distribution of x̂0 can become skewed or... based and inversion methods using the linear mixed-effects model. Then, a simple parametric bootstrap algorithm is proposed that can be used to either
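From the excerpt, the key step is a parametric bootstrap in which errors are drawn from a normal distribution rather than resampled from the raw residuals; the linear-calibration sketch below illustrates that step under stated assumptions and is not the report's exact algorithm.

```python
# Parametric bootstrap for the calibration estimate x0 = (y0 - a) / b:
# simulate new responses with normal errors at the residuals' estimated scale.
import numpy as np

rng = np.random.default_rng(6)

def parametric_bootstrap_x0(x, y, y0, n_boot=2000):
    x, y = np.asarray(x, float), np.asarray(y, float)
    b, a = np.polyfit(x, y, 1)                   # fit y = a + b*x
    resid = y - (a + b * x)
    sigma = resid.std(ddof=2)                    # two fitted parameters
    x0_hat = (y0 - a) / b                        # point estimate of x0
    x0_boot = []
    for _ in range(n_boot):
        y_star = a + b * x + rng.normal(0.0, sigma, size=len(x))
        b_s, a_s = np.polyfit(x, y_star, 1)
        x0_boot.append((y0 + rng.normal(0.0, sigma) - a_s) / b_s)
    return x0_hat, np.percentile(x0_boot, [2.5, 97.5])
```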
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km²) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gauge (24-year period). The original streamflow time series has been randomly resampled different numbers of times and with different sample sizes (N = 500; ...; 1000), then applied to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods and using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the EFR methods assessment will be propagated through water security indicators referring to water scarcity and vulnerability, seeking to provide meaningful support to end-users and water managers facing the incorporation of uncertainties in the decision-making process.
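One of the variants named above, the moving-blocks bootstrap, applied to a flow-duration statistic (Q90, the flow exceeded 90% of the time); the 30-day block length is an illustrative assumption.

```python
# Moving-blocks bootstrap of Q90 from a daily streamflow record.
import numpy as np

rng = np.random.default_rng(7)

def moving_blocks_bootstrap_q90(flows, n_boot=1000, block_len=30):
    flows = np.asarray(flows, float)
    n = len(flows)
    starts = n - block_len + 1                  # admissible block start indices
    n_blocks = int(np.ceil(n / block_len))
    q90 = []
    for _ in range(n_boot):
        idx = rng.integers(0, starts, n_blocks)
        sample = np.concatenate([flows[s:s + block_len] for s in idx])[:n]
        q90.append(np.percentile(sample, 10))   # Q90 = 10th percentile of flows
    return np.percentile(q90, [2.5, 50, 97.5])
```

Resampling contiguous blocks rather than single days preserves the short-range autocorrelation of the streamflow series, which a conventional bootstrap would destroy.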
Flynn-Evans, Erin E.; Lockley, Steven W.
2016-01-01
Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold-standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the contribution of 40 variables in their ability to predict N24HSWD among 127 blind women, classified using the urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%; the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% with some degree of light perception, would be referred for further screening for N24HSWD. Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421
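The bootstrapped forward-selection rounds reduce to the loop below; select_fn stands in for the authors' logistic regression forward-selection step and is an assumed user-supplied wrapper (e.g. around a statistics package).

```python
# Bootstrap-stabilised variable selection by inclusion frequency.
import numpy as np
from collections import Counter

rng = np.random.default_rng(8)

def bootstrap_selection(X, y, select_fn, n_boot=1000, keep_frac=0.10):
    """X: 2-D numpy array of candidate predictors; y: binary outcome.
    select_fn(X, y) -> list of selected column indices (assumed wrapper
    around a forward-selection logistic regression)."""
    counts = Counter()
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)             # bootstrap sample of cases
        counts.update(select_fn(X[idx], y[idx]))
    # keep predictors selected in at least keep_frac of bootstrap models
    return [j for j, c in counts.items() if c / n_boot >= keep_frac]
```

Running this once with keep_frac=0.01 and again on the survivors with keep_frac=0.10 mirrors the two-round procedure described in the abstract.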
Counting conformal correlators
NASA Astrophysics Data System (ADS)
Kravchuk, Petr; Simmons-Duffin, David
2018-02-01
We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.
Principled Missing Data Treatments.
Lang, Kyle M; Little, Todd D
2018-04-01
We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen
2010-08-01
An important problem in the Intensive Care is how to predict, on a given day of stay, the eventual hospital mortality for a specific patient. A recent approach to this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating its prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences, when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression on pre-specified covariates is known in the statistical literature. Using inductive methods of prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition, we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
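The .632 bootstrap itself is compact; the sketch below shows the estimator with assumed fit and err callables standing in for the inductive method and the error measure.

```python
# .632 bootstrap estimate of prediction error for an inductive method.
import numpy as np

rng = np.random.default_rng(9)

def err632(X, y, fit, err, n_boot=200):
    """fit(X, y) -> model; err(model, X, y) -> scalar error
    (assumed user-supplied, e.g. 1 - accuracy or 1 - AUC)."""
    n = len(y)
    apparent = err(fit(X, y), X, y)              # training (apparent) error
    oob_errors = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # bootstrap training sample
        oob = np.setdiff1d(np.arange(n), idx)    # cases left out of the sample
        if oob.size == 0:
            continue
        model = fit(X[idx], y[idx])
        oob_errors.append(err(model, X[oob], y[oob]))
    # weight the optimistic apparent error against the pessimistic OOB error
    return 0.368 * apparent + 0.632 * np.mean(oob_errors)
```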
Elkomy, Mohammed H; Elmenshawe, Shahira F; Eid, Hussein M; Ali, Ahmed M A
2016-11-01
This work aimed at investigating the potential of solid lipid nanoparticles (SLN) as carriers for topical delivery of ketoprofen (KP); evaluating a novel technique incorporating an Artificial Neural Network (ANN) and clustered bootstrap for optimization of KP-loaded SLN (KP-SLN); and demonstrating a longitudinal dose response (LDR) modeling-based approach to compare the activity of topical non-steroidal anti-inflammatory drug formulations. KP-SLN was fabricated by a modified emulsion/solvent evaporation method. A Box-Behnken design was implemented to study the influence of the glycerylpalmitostearate-to-KP ratio and the Tween 80 and lecithin concentrations on particle size, entrapment efficiency, and the amount of drug permeated through rat skin in 24 hours. Following clustered bootstrap ANN optimization, the optimized KP-SLN was incorporated into an aqueous gel and evaluated for rheology, in vitro release, permeability, skin irritation and in vivo activity, using a carrageenan-induced rat paw edema model and an LDR mathematical model to analyze the time course of the anti-inflammatory effect at various application durations. A lipid-to-drug ratio of 7.85 [bootstrap 95% CI: 7.63-8.51], Tween 80 of 1.27% [bootstrap 95% CI: 0.601-2.40%], and lecithin of 0.263% [bootstrap 95% CI: 0.263-0.328%] were predicted to produce optimal characteristics. Compared with profenid® gel, the optimized KP-SLN gel exhibited slower release, faster permeability, better texture properties, greater efficacy, and similar potency. SLNs are safe and effective permeation enhancers. ANN coupled with clustered bootstrap is a useful method for finding optimal solutions and estimating the uncertainty associated with them. LDR models allow mechanistic understanding of comparative in vivo performances of different topical formulations, and help design efficient dermatological bioequivalence assessment methods.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-01-01
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant constrained representative of the bootstrapping solutions related to CoAP-EAP. As we show, our solution provides significant improvements, mainly due to an important reduction of the message length. PMID:26978362
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsically low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method of estimating the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and thus increases the measurement uncertainty. Moreover, the goodness of fit of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship between the cone of uncertainty and common diffusion parameters, such as mean diffusivity and fractional anisotropy, is not linear, the cone of uncertainty has a sensitivity of its own. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults, whereas no significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analyses.
Using Cluster Bootstrapping to Analyze Nested Data With a Few Clusters.
Huang, Francis L
2018-04-01
Cluster randomized trials involving participants nested within intact treatment and control groups are commonly performed in various educational, psychological, and biomedical studies. However, recruiting and retaining intact groups presents various practical, financial, and logistical challenges to evaluators, and often cluster randomized trials are performed with a low number of clusters (~20 groups). Although multilevel models are often used to analyze nested data, researchers may be concerned about potentially biased results due to having only a few groups under study. Cluster bootstrapping has been suggested as an alternative procedure for analyzing clustered data, though it has seen very little use in educational and psychological studies. Using a Monte Carlo simulation that varied the number of clusters, average cluster size, and intraclass correlations, we compared standard errors computed using cluster bootstrapping with those derived using ordinary least squares regression and multilevel models. Results indicate that cluster bootstrapping, though more computationally demanding, can be used as an alternative procedure for the analysis of clustered data when treatment effects at the group level are of primary interest. Supplementary material showing how to perform cluster bootstrapped regressions using R is also provided.
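A sketch of cluster bootstrapped standard errors for an OLS regression; the article's supplementary material does this in R, so this Python version is only illustrative.

```python
# Cluster bootstrap: resample whole clusters with replacement, refit OLS,
# and take the spread of the coefficient estimates as standard errors.
import numpy as np

rng = np.random.default_rng(10)

def cluster_bootstrap_se(X, y, cluster_ids, n_boot=2000):
    X = np.column_stack([np.ones(len(y)), np.asarray(X, float)])  # add intercept
    y = np.asarray(y, float)
    cluster_ids = np.asarray(cluster_ids)
    ids = np.unique(cluster_ids)
    betas = []
    for _ in range(n_boot):
        chosen = rng.choice(ids, size=len(ids), replace=True)
        rows = np.concatenate([np.where(cluster_ids == c)[0] for c in chosen])
        b, *_ = np.linalg.lstsq(X[rows], y[rows], rcond=None)
        betas.append(b)
    return np.std(betas, axis=0, ddof=1)    # bootstrap SE per coefficient
```

Resampling intact clusters, rather than individual observations, preserves the within-cluster dependence that makes naive OLS standard errors too small.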
Prospects for steady-state scenarios on JET
NASA Astrophysics Data System (ADS)
Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; de Vries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; de la Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the
2007-09-01
In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (nl ~ 4 × 10¹⁹ m⁻³), with ITER-relevant safety factor (q95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, a new pellet injector for edge localized mode control, together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already being addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (nl > 5 × 10¹⁹ m⁻³), with high βN (βN > 3.0) and a bootstrap current fraction of 60-70% at high toroidal field (~3.5 T).
Bootstrapping Least Squares Estimates in Biochemical Reaction Networks
Linder, Daniel F.
2015-01-01
The paper proposes new computational methods for computing confidence bounds for the least squares estimates (LSEs) of rate constants in mass-action biochemical reaction network and stochastic epidemic models. Such LSEs are obtained by fitting the set of deterministic ordinary differential equations (ODEs), corresponding to the large-volume limit of a reaction network, to the network's partially observed trajectory, treated as a continuous-time, pure jump Markov process. In the large-volume limit the LSEs are asymptotically Gaussian, but their limiting covariance structure is complicated, since it is described by a set of nonlinear ODEs which are often ill-conditioned and numerically unstable. The current paper considers two bootstrap Monte-Carlo procedures, based on the diffusion and linear noise approximations for pure jump processes, which allow one to avoid solving the limiting covariance ODEs. The results are illustrated with both in-silico and real data examples from the LINE 1 gene retrotranscription model and compared with those obtained using other methods. PMID:25898769
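A heavily simplified sketch of the idea: fit a rate constant by least squares to the ODE limit, then bootstrap by simulating noisy trajectories around the fitted solution, with noise scaled like a linear-noise approximation. The single decay reaction X -> 0 and the volume scaling are illustrative assumptions, not the paper's models.

```python
# Parametric bootstrap around an ODE least-squares fit (toy decay reaction).
import numpy as np

rng = np.random.default_rng(11)

def fit_k(t, x, x0):
    # least squares on the log scale: log x = log x0 - k * t
    return -np.polyfit(t, np.log(x / x0), 1)[0]

def bootstrap_k(t, x, x0, volume=100.0, n_boot=1000):
    t, x = np.asarray(t, float), np.asarray(x, float)
    k_hat = fit_k(t, x, x0)
    traj = x0 * np.exp(-k_hat * t)              # fitted ODE solution
    noise_sd = np.sqrt(traj / volume)           # LNA-like scaling with volume
    ks = []
    for _ in range(n_boot):
        x_star = np.clip(traj + rng.normal(0.0, noise_sd), 1e-9, None)
        ks.append(fit_k(t, x_star, x0))
    return k_hat, np.percentile(ks, [2.5, 97.5])
```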
Percolation in education and application in the 21st century
NASA Astrophysics Data System (ADS)
Adler, Joan; Elfenbaum, Shaked; Sharir, Liran
2017-03-01
Percolation, "so simple you could teach it to your wife" (Chuck Newman, last century) is an ideal system to introduce young students to phase transitions. Two recent projects in the Computational Physics group at the Technion make this easy. One is a set of analog models to be mounted on our walls and enable visitors to switch between samples to see which mixtures of glass and metal objects have a percolating current. The second is a website enabling the creation of stereo samples of two and three dimensional clusters (suited for viewing with Oculus rift) on desktops, tablets and smartphones. Although there have been many physical applications for regular percolation in the past, for Bootstrap Percolation, where only sites with sufficient occupied neighbours remain active, there have not been a surfeit of condensed matter applications. We have found that the creation of diamond membranes for quantum computers can be modeled with a bootstrap process of graphitization in diamond, enabling prediction of optimal processing procedures.
NASA Astrophysics Data System (ADS)
Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.
2010-02-01
Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.
NASA Astrophysics Data System (ADS)
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In the case of Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses different research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services, considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware, and representing important prerequisites for the establishment of autonomic Cloud services.
Phylogenetic relationships among arecoid palms (Arecaceae: Arecoideae)
Baker, William J.; Norup, Maria V.; Clarkson, James J.; Couvreur, Thomas L. P.; Dowe, John L.; Lewis, Carl E.; Pintaud, Jean-Christophe; Savolainen, Vincent; Wilmot, Tomas; Chase, Mark W.
2011-01-01
Background and Aims The Arecoideae is the largest and most diverse of the five subfamilies of palms (Arecaceae/Palmae), containing >50 % of the species in the family. Despite its importance, phylogenetic relationships among Arecoideae are poorly understood. Here the most densely sampled phylogenetic analysis of Arecoideae available to date is presented. The results are used to test the current classification of the subfamily and to identify priority areas for future research. Methods DNA sequence data for the low-copy nuclear genes PRK and RPB2 were collected from 190 palm species, covering 103 (96 %) genera of Arecoideae. The data were analysed using the parsimony ratchet, maximum likelihood, and both likelihood and parsimony bootstrapping. Key Results and Conclusions Despite the recovery of paralogues and pseudogenes in a small number of taxa, PRK and RPB2 were both highly informative, producing well-resolved phylogenetic trees with many nodes well supported by bootstrap analyses. Simultaneous analyses of the combined data sets provided additional resolution and support. Two areas of incongruence between PRK and RPB2 were strongly supported by the bootstrap relating to the placement of tribes Chamaedoreeae, Iriarteeae and Reinhardtieae; the causes of this incongruence remain uncertain. The current classification within Arecoideae was strongly supported by the present data. Of the 14 tribes and 14 sub-tribes in the classification, only five sub-tribes from tribe Areceae (Basseliniinae, Linospadicinae, Oncospermatinae, Rhopalostylidinae and Verschaffeltiinae) failed to receive support. Three major higher level clades were strongly supported: (1) the RRC clade (Roystoneeae, Reinhardtieae and Cocoseae), (2) the POS clade (Podococceae, Oranieae and Sclerospermeae) and (3) the core arecoid clade (Areceae, Euterpeae, Geonomateae, Leopoldinieae, Manicarieae and Pelagodoxeae). However, new data sources are required to elucidate ambiguities that remain in phylogenetic relationships among and within the major groups of Arecoideae, as well as within the Areceae, the largest tribe in the palm family. PMID:21325340
Comparison of parametric and bootstrap method in bioequivalence test.
Ahn, Byung-Jin; Yim, Dong-Seok
2009-10-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
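The resampling scheme described reduces, in the simplest paired formulation, to the sketch below; a full BE analysis would use the trial's mixed model for a crossover design rather than plain log differences.

```python
# Nonparametric bootstrap 90% CI for the geometric mean AUC ratio,
# simplified to paired subject-level log differences.
import numpy as np

rng = np.random.default_rng(13)

def bootstrap_be_ci(auc_test, auc_ref, n_boot=1000):
    d = np.log(np.asarray(auc_test, float)) - np.log(np.asarray(auc_ref, float))
    n = len(d)
    ratios = [np.exp(d[rng.integers(0, n, n)].mean()) for _ in range(n_boot)]
    lo, hi = np.percentile(ratios, [5, 95])       # nonparametric 90% CI
    return lo, hi, (0.80 <= lo and hi <= 1.25)    # 80-125% BE decision rule
```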
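To make the resampling scheme above concrete, here is a minimal sketch in Python (not the authors' SAS code): per-subject differences of log(AUC) between test and reference formulations are resampled with replacement, and a percentile 90% CI is compared with the normal-theory CI and the 80-125% acceptance rule. The data and sample size are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-subject differences log(AUC_test) - log(AUC_ref); n = 24
d = rng.normal(loc=0.05, scale=0.18, size=24)

# Parametric (normal-theory) 90% CI for the formulation effect
mean = d.mean()
se = d.std(ddof=1) / np.sqrt(len(d))
t = stats.t.ppf(0.95, df=len(d) - 1)
ci_param = np.array([mean - t * se, mean + t * se])

# Nonparametric bootstrap: resample subjects with replacement
B = 1000
boot = np.array([rng.choice(d, size=len(d), replace=True).mean()
                 for _ in range(B)])
ci_boot = np.percentile(boot, [5, 95])  # percentile 90% CI

# Back-transform to the ratio scale for the 80-125% acceptance rule
print("parametric:", np.exp(ci_param), "bootstrap:", np.exp(ci_boot))
```

A BCa interval, as used in the study, additionally adjusts the percentile cut-points for bias and skewness; recent SciPy versions offer one implementation via scipy.stats.bootstrap with method='BCa'.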
Prenatal Drug Exposure and Adolescent Cortisol Reactivity: Association with Behavioral Concerns.
Buckingham-Howes, Stacy; Mazza, Dayna; Wang, Yan; Granger, Douglas A; Black, Maureen M
2016-09-01
We examined stress reactivity in a sample of adolescents with prenatal drug exposure (PDE), focusing on the consequences of PDE for stress-related adrenocortical reactivity, behavioral problems, and drug experimentation during adolescence. Participants (76 PDE, 61 non-drug exposed [NE]; 99% African-American; 50% male; mean age = 14.17 yr, SD = 1.17) provided a urine sample, completed a drug use questionnaire, and provided saliva samples (later assayed for cortisol) before and after a mild laboratory stress task. Caregivers completed the Behavior Assessment System for Children, Second Edition (BASC II) and reported their relationship to the adolescent. The NE group was more likely to exhibit task-related cortisol reactivity than the PDE group. Overall behavior problems and drug experimentation were comparable across groups, with no differences between PDE and NE groups. In unadjusted mediation analyses, cortisol reactivity mediated the association between PDE and BASC II aggression scores (95% bootstrap confidence interval [CI], 0.04-4.28), externalizing problems scores (95% bootstrap CI, 0.03-4.50), and drug experimentation (95% bootstrap CI, 0.001-0.54). The associations remained with the inclusion of gender as a covariate but not when age was included. Findings support and expand current research on cortisol reactivity and PDE by demonstrating that cortisol reactivity attenuates the association between PDE and behavioral problems (aggression) and drug experimentation. If replicated, these findings suggest that PDE may have long-lasting effects on stress-sensitive physiological mechanisms associated with behavioral problems (aggression) and drug experimentation in adolescence.
Lucidi, Valerio; Hendlisz, Alain; Van Laethem, Jean-Luc; Donckier, Vincent
2016-04-21
In the oncosurgical approach to colorectal liver metastases, surgery is still considered the only potentially curative option, while chemotherapy alone is regarded as strictly palliative. However, missing metastases, defined as metastases that disappear after chemotherapy, represent a unique model for evaluating the curative potential of chemotherapy and for challenging current therapeutic algorithms. We reviewed recent series on missing colorectal liver metastases to evaluate the incidence of this phenomenon, its predictive factors, and rates of cure, defined by complete pathologic response in resected missing metastases and by sustained clinical response when they were left unresected. With the progressive improvement in the efficacy of chemotherapeutic regimens, the incidence of missing liver metastases has risen steadily in recent years. The main predictive factors are small tumor size, low marker level, duration of chemotherapy, and use of intra-arterial chemotherapy. Initial series showed low rates of complete pathologic response in resected missing metastases and high recurrence rates when they were left unresected. However, recent reports describe complete pathologic responses and sustained clinical responses reaching 50%, suggesting that chemotherapy could be curative in some cases. Accordingly, in cases of missing colorectal liver metastases, the classical recommendation to resect initial tumor sites may have become partially obsolete. Furthermore, the curative effect of chemotherapy in selected cases could lead to a change of paradigm in patients with unresectable liver-only metastases: using intensive first-line chemotherapy to intentionally induce missing metastases, followed by adjuvant surgery on remnant chemoresistant tumors and close surveillance of initial sites left unresected.
de Vrieze, Nynke Hesselina Neeltje; van Rooijen, Martijn; Speksnijder, Arjen Gerard Cornelis Lambertus; de Vries, Henry John C
2013-08-01
Urethral lymphogranuloma venereum (LGV) is not screened for routinely. We found that among 341 men who have sex with men with anorectal LGV, 7 (2.1%) had concurrent urethral LGV. Among 59 partners, 4 (6.8%) had urethral LGV infections. Urethral LGV is common, probably key in transmission, and missed by current routine LGV screening algorithms.
Bootstrap investigation of the stability of a Cox regression model.
Altman, D G; Andersen, P K
1989-07-01
We describe a bootstrap investigation of the stability of a Cox proportional hazards regression model resulting from the analysis of a clinical trial of azathioprine versus placebo in patients with primary biliary cirrhosis. We have considered stability to refer both to the choice of variables included in the model and, more importantly, to the predictive ability of the model. In stepwise Cox regression analyses of 100 bootstrap samples using 17 candidate variables, the most frequently selected variables were those selected in the original analysis, and no other important variable was identified. Thus there was no reason to doubt the model obtained in the original analysis. For each patient in the trial, bootstrap confidence intervals were constructed for the estimated probability of surviving two years. It is shown graphically that these intervals are markedly wider than those obtained from the original model.
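A compact sketch of this stability check in Python, assuming the lifelines package is available. Variable selection is simplified here to "p < 0.05 in a full Cox fit" rather than the stepwise procedure of the original study, and column names are illustrative.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumes lifelines is installed

def bootstrap_inclusion(df, duration_col, event_col, B=100, alpha=0.05, seed=1):
    """Fraction of bootstrap samples in which each covariate is 'selected'.
    Selection is simplified to p < alpha in a full Cox fit, rather than the
    stepwise procedure used in the original analysis."""
    rng = np.random.default_rng(seed)
    covs = [c for c in df.columns if c not in (duration_col, event_col)]
    counts = pd.Series(0.0, index=covs)
    for _ in range(B):
        idx = rng.integers(0, len(df), size=len(df))   # resample patients
        sample = df.iloc[idx].reset_index(drop=True)
        cph = CoxPHFitter(penalizer=0.01).fit(sample, duration_col, event_col)
        selected = cph.summary["p"] < alpha
        counts[selected[selected].index] += 1
    return counts / B  # inclusion frequency per covariate
```

Covariates with inclusion frequencies near 1 across resamples correspond to the "most frequently selected variables" the abstract describes; bootstrap CIs for survival probabilities follow the same resampling loop with a different summary statistic.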
NASA Astrophysics Data System (ADS)
Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.
2018-04-01
This article addresses the problem of constructing an informative set of measured features of an observed and/or controlled object, using the authors' algorithms based on bootstrap and counter-bootstrap technologies to process measurements of different object states with training samples of various sizes. The paper considers the aggregation of feature-informativeness indicators by linear, majority, logical and “greedy” methods, applied both individually and in combination. The results of a computational experiment are discussed, and it is concluded that the proposed methods increase the efficiency of classifying the states of the object from measurement results.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
Forecasting the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The algorithm Boot.EXPOS, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters and exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap, carrying over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in the software.
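The flavor of such a partnership can be sketched as follows, assuming statsmodels is available: residuals from a Holt-Winters fit are resampled onto the point forecasts to form rough prediction intervals. This is a simplified stand-in for the published Boot.EXPOS algorithm, which resamples and refits entire series.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing  # assumed available

def boot_forecast_intervals(y, season=12, h=12, B=500, seed=0):
    """Residual-bootstrap forecast intervals around an exponential smoothing
    fit -- a simplified sketch in the spirit of Boot.EXPOS, not the
    published algorithm."""
    rng = np.random.default_rng(seed)
    fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                               seasonal_periods=season).fit()
    resid = np.asarray(y) - np.asarray(fit.fittedvalues)
    point = np.asarray(fit.forecast(h))
    # Add iid-resampled residuals to the point forecasts, B times
    paths = np.array([point + rng.choice(resid, size=h, replace=True)
                      for _ in range(B)])
    lo, hi = np.percentile(paths, [2.5, 97.5], axis=0)
    return point, lo, hi
```

For a double seasonal pattern, the ExponentialSmoothing fit would be replaced by a double seasonal Holt-Winters fit while the resampling step stays the same.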
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different 'x' and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
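The resampling logic can be sketched as follows; the tolerance and the statistics tracked are illustrative choices, not the study's exact criteria.

```python
import numpy as np

def stable_set_size(values, set_sizes, B=1000, tol=0.5, seed=0):
    """For each candidate set size x, bootstrap-resample x values and check
    how close the average resampled 5th/95th percentiles and SD come to the
    full-cohort ('ground truth') statistics. 'tol' is in the measurement
    units (e.g. dB) and is an illustrative threshold."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    truth = np.array([np.percentile(values, 5), np.percentile(values, 95),
                      values.std(ddof=1)])
    for x in set_sizes:
        stats = []
        for _ in range(B):
            s = rng.choice(values, size=x, replace=True)
            stats.append([np.percentile(s, 5), np.percentile(s, 95),
                          s.std(ddof=1)])
        err = np.abs(np.mean(stats, axis=0) - truth)
        if np.all(err < tol):      # smallest x reproducing the ground truth
            return x, err
    return None, None
```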
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and they require specialised statistical methods to properly account for their particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originated from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
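A deliberately simplified sketch of the core idea, imputing inside each resample, written in Python rather than the authors' R routines. A uniform draw below the detection limit stands in for the paper's model-based, log-ratio imputation.

```python
import numpy as np

def boot_mean_with_nondetects(x, dl, B=2000, seed=0):
    """Percentile bootstrap of the mean for left-censored data: nondetects
    (np.nan entries in x) are re-imputed inside every resample by a uniform
    draw on (0, dl), so imputation uncertainty propagates into the CI."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    means = np.empty(B)
    for b in range(B):
        s = rng.choice(x, size=len(x), replace=True)   # resample, keeping nondetects
        nd = np.isnan(s)
        s[nd] = rng.uniform(0.0, dl, size=nd.sum())    # imputation step per resample
        means[b] = s.mean()
    return means.mean(), np.percentile(means, [2.5, 97.5])

# Illustrative usage: three nondetects below a detection limit of 0.5
est, ci = boot_mean_with_nondetects(
    [0.8, np.nan, 1.2, 0.6, np.nan, 2.1, np.nan, 1.0], dl=0.5)
```

In the paper's setting the uniform draw would be replaced by a stochastic draw from a model fitted on the log-ratio scale, preserving the compositional covariance structure.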
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement model, and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of the 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable, because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they do not share the same critical range for item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons, as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
Part Marking and Identification Materials on MISSE
NASA Technical Reports Server (NTRS)
Finckenor, Miria M.; Roxby, Donald L.
2008-01-01
Many different spacecraft materials were flown as part of the Materials on International Space Station Experiment (MISSE), including several materials used in part marking and identification. The experiment contained Data Matrix symbols applied using laser bonding, vacuum arc vapor deposition, gas assisted laser etch, chemical etch, mechanical dot peening, laser shot peening, and laser induced surface improvement. The effects of ultraviolet radiation on nickel acetate seal versus hot water seal on sulfuric acid anodized aluminum are discussed. These samples were exposed on the International Space Station to the low Earth orbital environment of atomic oxygen, ultraviolet radiation, thermal cycling, and hard vacuum, though atomic oxygen exposure was very limited for some samples. Results from the one-year exposure on MISSE-3 and MISSE-4 are compared to those from MISSE-1 and MISSE-2, which were exposed for four years. Part marking and identification materials on the current MISSE-6 experiment are also discussed.
NASA Astrophysics Data System (ADS)
Poli, Francesca M.; Kessel, Charles E.
2013-05-01
Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved confinement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current profiles and negative magnetic shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. The time-dependent transport simulations indicate that, with a trade-off of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating configuration.
1993-09-10
Baek, J., H. L. Gray, W. A. Woodward and M. D. Fisk (1993). A bootstrap generalized likelihood ratio test in discriminant analysis, Proc. 15th Annual Seismic Research Symposium, in press. [Fragmentary abstract] Large values of the likelihood ratio indicate that the event does not belong to the first class; the bootstrap technique is used here as well to set the critical value of the test. Southern Methodist University.
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on mathematical statistics using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U, k=2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
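A sketch of the combination in Python rather than the authors' MATLAB, assuming relative (percent) uncertainties and the standard Nordtest formula u_c = sqrt(u(Rw)^2 + u(bias)^2) with expanded uncertainty U = 2·u_c; the bootstrap loop resamples both the IQC results and the EQA deviations.

```python
import numpy as np

def nordtest_mu_bootstrap(iqc, eqa_bias_pct, u_cref_pct, B=5000, seed=0):
    """Expanded uncertainty U (k=2, in %) by the Nordtest combination, with
    bootstrap resampling of the IQC results and of the EQA percentage
    deviations. u_cref_pct is the stated uncertainty of the EQA assigned
    values; all inputs here are illustrative."""
    rng = np.random.default_rng(seed)
    iqc = np.asarray(iqc, float)
    eqa = np.asarray(eqa_bias_pct, float)
    U = np.empty(B)
    for b in range(B):
        q = rng.choice(iqc, size=len(iqc), replace=True)
        d = rng.choice(eqa, size=len(eqa), replace=True)
        u_rw = 100 * q.std(ddof=1) / q.mean()            # within-lab reproducibility, %
        u_bias = np.sqrt(np.mean(d**2) + u_cref_pct**2)  # RMS bias + reference uncertainty
        U[b] = 2 * np.sqrt(u_rw**2 + u_bias**2)          # expanded uncertainty, k = 2
    return U.mean(), np.percentile(U, [2.5, 97.5])
```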
NASA Astrophysics Data System (ADS)
Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin
2018-04-01
Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
1995-01-01
Use of the bootstrap method in a canonical correlation analysis to evaluate the replicability of a study's results is illustrated. More confidence may be vested in research results that replicate. (SLD)
The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements
NASA Technical Reports Server (NTRS)
Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry;
2013-01-01
The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides, and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.
The economics of bootstrapping space industries - Development of an analytic computer model
NASA Technical Reports Server (NTRS)
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of off-Earth transport and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Towards a bootstrap approach to higher orders of epsilon expansion
NASA Astrophysics Data System (ADS)
Dey, Parijat; Kaviraj, Apratim
2018-02-01
We employ a hybrid approach in determining the anomalous dimension and OPE coefficient of higher spin operators in the Wilson-Fisher theory. First we do a large spin analysis for CFT data, where we use results obtained from the usual and the Mellin bootstrap and also from the Feynman diagram literature. This gives new predictions at O(ε^4) and O(ε^5) for anomalous dimensions and OPE coefficients, and also provides a cross-check for the results from the Mellin bootstrap. These higher orders get contributions from all higher spin operators in the crossed channel. We also use the bootstrap in Mellin space method for the φ^3 theory in d = 6 − ε CFT, where we calculate general higher spin OPE data. We demonstrate a higher loop order calculation in this approach by summing over contributions from higher spin operators of the crossed channel, in the same spirit as before.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Rivers, Caitlin M; Majumder, Maimuna S; Lofgren, Eric T
2016-09-15
Middle East respiratory syndrome coronavirus (MERS-CoV) is an emerging pathogen, first recognized in 2012, with a high case fatality risk, no vaccine, and no treatment beyond supportive care. We estimated the relative risks of death and severe disease among MERS-CoV patients in the Middle East between 2012 and 2015 for several risk factors, using Poisson regression with robust variance and a bootstrap-based expectation maximization algorithm to handle extensive missing data. Increased age and underlying comorbidity were risk factors for both death and severe disease, while cases arising in Saudi Arabia were more likely to be severe. Cases occurring later in the emergence of MERS-CoV and among health-care workers were less serious. This study represents an attempt to estimate risk factors for an emerging infectious disease using open data and to address some of the uncertainty surrounding MERS-CoV epidemiology. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Character evolution and missing (morphological) data across the core asterids (Gentianidae)
USDA-ARS?s Scientific Manuscript database
Character evolution and missing (morphological) data across Asteridae. Premise of the study: Our current understanding of flowering plant phylogeny provides an excellent framework for exploring various aspects of character evolution through comparative analyses. However, attempts to synthesize this ...
Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.
Falk, Carl F; Biesanz, Jeremy C
2011-11-30
Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1,680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
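For the observed-variable analogue of the model studied here, the percentile bootstrap of the indirect effect reduces to a few lines; this sketch omits the latent-variable measurement model entirely and is not the study's OpenMx implementation.

```python
import numpy as np

def indirect_effect_ci(x, m, y, B=2000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in the simple
    mediation model x -> m -> y, with observed (not latent) variables."""
    rng = np.random.default_rng(seed)
    x, m, y = map(np.asarray, (x, m, y))
    n = len(x)
    ab = np.empty(B)
    for i in range(B):
        idx = rng.integers(0, n, size=n)          # resample cases
        xs, ms, ys = x[idx], m[idx], y[idx]
        a = np.polyfit(xs, ms, 1)[0]              # slope of m on x
        # slope of y on m, controlling for x (multiple regression)
        X = np.column_stack([np.ones(n), xs, ms])
        b = np.linalg.lstsq(X, ys, rcond=None)[0][2]
        ab[i] = a * b
    return np.percentile(ab, [2.5, 97.5])         # percentile (PC) interval
```

The BC and BCa variants examined in the study adjust these percentile cut-points for bias and skewness of the bootstrap distribution, which is precisely the mechanism the simulations found to inflate Type I error.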
Combining test statistics and models in bootstrapped model rejection: it is a balancing act
2014-01-01
Background Model rejections lie at the heart of systems biology, since they provide conclusive statements: that the corresponding mechanistic assumptions do not serve as valid explanations for the experimental data. Rejections are usually done using e.g. the chi-square test (χ²) or the Durbin-Watson test (DW). Analytical formulas for the corresponding distributions rely on assumptions that typically are not fulfilled. This problem is partly alleviated by the usage of bootstrapping, a computationally heavy approach to calculate an empirical distribution. Bootstrapping also allows for a natural extension to estimation of joint distributions, but this feature has so far been little exploited. Results We herein show that simplistic combinations of bootstrapped tests, like the max or min of the individual p-values, give inconsistent, i.e. overly conservative or liberal, results. A new two-dimensional (2D) approach based on parametric bootstrapping, on the other hand, is found both consistent and with a higher power than the individual tests, when tested on static and dynamic examples where the truth is known. In the same examples, the most superior test is a 2D χ² vs χ², where the second χ²-value comes from an additional help model, and its ability to describe bootstraps from the tested model. This superiority is lost if the help model is too simple, or too flexible. If a useful help model is found, the most powerful approach is the bootstrapped log-likelihood ratio (LHR). We show that this is because the LHR is one-dimensional, because the second dimension comes at a cost, and because LHR has retained most of the crucial information in the 2D distribution. These approaches statistically resolve a previously published rejection example for the first time. Conclusions We have shown how to, and how not to, combine tests in a bootstrap setting, when the combination is advantageous, and when it is advantageous to include a second model. These results also provide a deeper insight into the original motivation for formulating the LHR, for the more general setting of nonlinear and non-nested models. These insights are valuable in cases when accuracy and power, rather than computational speed, are prioritized. PMID:24742065
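The bootstrapped LHR can be illustrated on a toy pair of models (constant versus straight line, Gaussian errors), where the null distribution of the ratio is built by simulating from the fitted null model. The paper's setting of nonlinear, non-nested dynamic models follows the same recipe with heavier fitting machinery.

```python
import numpy as np

def bootstrap_lhr_pvalue(x, y, B=1000, seed=0):
    """Parametric-bootstrap log-likelihood ratio test of a constant model
    (null) against a straight line. For Gaussian errors with estimated
    variance, LHR = n * log(RSS0 / RSS1)."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)

    def lhr(yy):
        rss0 = np.sum((yy - yy.mean()) ** 2)             # constant model
        coef = np.polyfit(x, yy, 1)
        rss1 = np.sum((yy - np.polyval(coef, x)) ** 2)   # line model
        return n * np.log(rss0 / rss1)

    obs = lhr(y)
    # Simulate from the fitted null model and rebuild the LHR distribution
    mu, sd = y.mean(), y.std(ddof=1)
    null = np.array([lhr(rng.normal(mu, sd, size=n)) for _ in range(B)])
    return np.mean(null >= obs)                          # bootstrap p-value
```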
Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors
NASA Astrophysics Data System (ADS)
Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.
2012-12-01
Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem due to its theoretical near-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving-block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also to tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
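A minimal sketch of pairwise moving-block bootstrap resampling for a correlation CI: blocks of consecutive (x, y) pairs are resampled together, preserving both the cross-dependence and the autocorrelation within each series. The block length is a tuning choice, and the timescale simulation step is omitted here.

```python
import numpy as np

def block_bootstrap_corr(x, y, block=10, B=2000, seed=0):
    """Pairwise moving-block bootstrap percentile CI for the Pearson
    correlation of two autocorrelated series of equal length."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    starts = np.arange(n - block + 1)          # admissible block start points
    n_blocks = int(np.ceil(n / block))
    r = np.empty(B)
    for b in range(B):
        s = rng.choice(starts, size=n_blocks, replace=True)
        idx = np.concatenate([np.arange(i, i + block) for i in s])[:n]
        r[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.percentile(r, [2.5, 97.5])
```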
Expanding Access to BRCA1/2 Genetic Counseling with Telephone Delivery: A Cluster Randomized Trial
Butler, Karin M.; Schwartz, Marc D.; Mandelblatt, Jeanne S.; Boucher, Kenneth M.; Pappas, Lisa M.; Gammon, Amanda; Kohlmann, Wendy; Edwards, Sandra L.; Stroup, Antoinette M.; Buys, Saundra S.; Flores, Kristina G.; Campo, Rebecca A.
2014-01-01
Background The growing demand for cancer genetic services underscores the need to consider approaches that enhance access and efficiency of genetic counseling. Telephone delivery of cancer genetic services may improve access to these services for individuals experiencing geographic (rural areas) and structural (travel time, transportation, childcare) barriers to access. Methods This cluster-randomized clinical trial used population-based sampling of women at risk for BRCA1/2 mutations to compare telephone and in-person counseling for: 1) equivalency of testing uptake and 2) noninferiority of changes in psychosocial measures. Women 25 to 74 years of age with personal or family histories of breast or ovarian cancer and who were able to travel to one of 14 outreach clinics were invited to participate. Randomization was by family. Assessments were conducted at baseline, one week after pretest and post-test counseling, and at six months. Of the 988 women randomly assigned, 901 completed a follow-up assessment. Cluster bootstrap methods were used to estimate the 95% confidence interval (CI) for the difference between test uptake proportions, using a 10% equivalency margin. Differences in psychosocial outcomes for determining noninferiority were estimated using linear models together with one-sided 97.5% bootstrap CIs. Results Uptake of BRCA1/2 testing was lower following telephone (21.8%) than in-person counseling (31.8%; difference = 10.2%, 95% CI = 3.9% to 16.3%; after imputation of missing data: difference = 9.2%, 95% CI = -0.1% to 24.6%). Telephone counseling fulfilled the criteria for noninferiority to in-person counseling for all measures. Conclusions BRCA1/2 telephone counseling, although leading to lower testing uptake, appears to be safe and as effective as in-person counseling with regard to minimizing adverse psychological reactions, promoting informed decision making, and delivering patient-centered communication for both rural and urban women. PMID:25376862
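The cluster bootstrap used for the equivalence comparison can be sketched as follows: families, the randomization unit, are resampled whole rather than resampling individual women. Column names are illustrative, and the sketch assumes both arms appear in every resample (true when each arm contains many families).

```python
import numpy as np
import pandas as pd

def cluster_bootstrap_diff(df, B=2000, margin=0.10, seed=0):
    """Cluster bootstrap percentile CI for the difference in testing-uptake
    proportions between arms. df needs columns 'family', 'arm' (0/1) and
    'tested' (0/1). Equivalence holds if the CI lies within +/- margin."""
    rng = np.random.default_rng(seed)
    groups = [g for _, g in df.groupby("family")]   # one DataFrame per family
    diffs = np.empty(B)
    for b in range(B):
        take = rng.integers(0, len(groups), size=len(groups))
        boot = pd.concat([groups[i] for i in take])
        p = boot.groupby("arm")["tested"].mean()
        diffs[b] = p[1] - p[0]
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return (lo, hi), bool(lo > -margin and hi < margin)
```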
Kim, Dongchul; Kang, Mingon; Biswas, Ashis; Liu, Chunyu; Gao, Jean
2016-08-10
Inferring gene regulatory networks is one of the most interesting research areas in systems biology. Many inference methods have been developed using a variety of computational models and approaches. However, there are two issues to solve. First, depending on the structural or computational model of an inference method, the results tend to be inconsistent due to the innately different advantages and limitations of the methods. Therefore the combination of dissimilar approaches is in demand as an alternative way to overcome the limitations of standalone methods through complementary integration. Second, sparse linear regression penalized by a regularization parameter (lasso) and bootstrapping-based sparse linear regression methods were suggested in state-of-the-art network inference methods, but they are not effective for small sample sizes, and a true regulator can be missed if the target gene is strongly affected by an indirect regulator with high correlation or by another true regulator. We present two novel network inference methods based on the integration of three different criteria: (i) z-score, to measure the variation of gene expression from knockout data; (ii) mutual information, for the dependency between two genes; and (iii) linear regression-based feature selection. Based on these criteria, we propose a lasso-based random feature selection algorithm (LARF) to achieve better performance, overcoming the limitations of bootstrapping mentioned above. This work makes three main contributions. First, our z-score-based method to measure gene expression variations from knockout data is more effective than similar criteria in related works. Second, we confirmed that true regulator selection can be effectively improved by LARF. Lastly, we verified that an integrative approach can clearly outperform a single method when two different methods are effectively joined. In the experiments, our methods were validated by outperforming the state-of-the-art methods on DREAM challenge data, and LARF was then applied to infer gene regulatory networks associated with psychiatric disorders.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
Migration of the ATLAS Metadata Interface (AMI) to Web 2.0 and cloud
NASA Astrophysics Data System (ADS)
Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.
2015-12-01
The ATLAS Metadata Interface (AMI), a mature application in existence for more than 10 years, is currently being adapted to recently available technologies. The web interfaces, which previously manipulated XML documents using XSL transformations, are being migrated to Asynchronous JavaScript (AJAX). Web development is considerably simplified by the introduction of a framework based on jQuery and Twitter Bootstrap. Finally, the AMI services are being migrated to an OpenStack cloud infrastructure.
Wallace, Meredith L; Anderson, Stewart J; Mazumdar, Sati
2010-12-20
Missing covariate data present a challenge to tree-structured methodology due to the fact that a single tree model, as opposed to an estimated parameter value, may be desired for use in a clinical setting. To address this problem, we suggest a multiple imputation algorithm that adds draws of stochastic error to a tree-based single imputation method presented by Conversano and Siciliano (Technical Report, University of Naples, 2003). Unlike previously proposed techniques for accommodating missing covariate data in tree-structured analyses, our methodology allows the modeling of complex and nonlinear covariate structures while still resulting in a single tree model. We perform a simulation study to evaluate our stochastic multiple imputation algorithm when covariate data are missing at random and compare it to other currently used methods. Our algorithm is advantageous for identifying the true underlying covariate structure when complex data and larger percentages of missing covariate observations are present. It is competitive with other current methods with respect to prediction accuracy. To illustrate our algorithm, we create a tree-structured survival model for predicting time to treatment response in older, depressed adults. Copyright © 2010 John Wiley & Sons, Ltd.
Finite Beta Boundary Magnetic Fields of NCSX
NASA Astrophysics Data System (ADS)
Grossman, A.; Kaiser, T.; Mioduszewski, P.
2004-11-01
The magnetic field between the plasma surface and wall of the National Compact Stellarator (NCSX), which uses quasi-symmetry to combine the best features of the tokamak and stellarator in a configuration of low aspect ratio, is mapped via field line tracing in a range of finite beta in which part of the rotational transform is generated by the bootstrap current. We adopt the methodology developed for W7-X, in which an equilibrium solution is computed by an inverse equilibrium solver based on an energy minimizing variational moments code, VMEC2000 [1], which solves directly for the shape of the flux surfaces given the external coils and their currents as well as a bootstrap current provided by a separate transport calculation. The VMEC solution and the Biot-Savart vacuum fields are coupled to the magnetic field solver for finite-beta equilibrium (MFBE2001) [2] code to determine the magnetic field on a 3D grid over a computational domain. It is found that the edge plasma is more stellarator-like, with a complex 3D structure, and less like the ordered 2D symmetric structure of a tokamak. The field lines make a transition from ergodically covering a surface to ergodically covering a volume as the distance from the last closed magnetic surface is increased. The results are compared with the PIES [3] calculations. [1] S.P. Hirshman et al., Comput. Phys. Commun. 43 (1986) 143. [2] E. Strumberger et al., Nucl. Fusion 42 (2002) 827. [3] A.H. Reiman and H.S. Greenside, Comput. Phys. Commun. 43, 157 (1986).
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
Bootstrapping Student Understanding of What Is Going on in Econometrics.
ERIC Educational Resources Information Center
Kennedy, Peter E.
2001-01-01
Explains that econometrics is an intellectual game played by rules based on the sampling distribution concept. Contains explanations for why many students are uncomfortable with econometrics. Encourages instructors to use explain-how-to-bootstrap exercises to promote student understanding. (RLH)
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
Four Bootstrap Confidence Intervals for the Binomial-Error Model.
ERIC Educational Resources Information Center
Lin, Miao-Hsiang; Hsiung, Chao A.
1992-01-01
Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained and the theoretical foundation of each method and its relevance and ranges of modeling the true score uncertainty are discussed. (SLD)
Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.
ERIC Educational Resources Information Center
Habing, Brian
2001-01-01
Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)
Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications
NASA Technical Reports Server (NTRS)
Hughes, William O.; Paez, Thomas L.
2006-01-01
This paper discusses the Bootstrap Method for specification of vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available, and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
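As an illustration of the two approaches compared here, consider estimating a P95/50 level (one bounding 95% of flight environments with 50% confidence) for a single frequency band. The tolerance factor 1.645 is a large-sample approximation, and the data are hypothetical.

```python
import numpy as np

def p95_50_levels(levels_db, B=5000, seed=0):
    """P95/50 limit estimated two ways: a normal tolerance limit and a
    distribution-free bootstrap. 'levels_db' would be measured acoustic or
    vibration levels, in dB, for one frequency band."""
    rng = np.random.default_rng(seed)
    x = np.asarray(levels_db, dtype=float)
    # Normal tolerance limit: mean + k*sd; k ~ 1.645 for P95/50 at large n
    ntl = x.mean() + 1.645 * x.std(ddof=1)
    # Bootstrap: the 95th percentile of each replicate, then the
    # 50%-confidence (median) value across replicates
    reps = np.array([np.percentile(rng.choice(x, size=len(x), replace=True), 95)
                     for _ in range(B)])
    return ntl, np.median(reps)
```

When the measured levels are skewed or multi-modal, the two estimates diverge, which is the motivation given above for the distribution-free approach.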
The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.
Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi
2017-03-01
The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity refers to the probability (Ps) of obtaining a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
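A sketch of the proposed bootstrap test: tables are redrawn from the independence model fitted to the observed margins, and the observed Pearson statistic is referred to that empirical distribution. This is useful precisely when some expected cell counts are small and the asymptotic chi-square reference is suspect.

```python
import numpy as np

def pearson_stat(t):
    """Pearson chi-square statistic, skipping empty rows/columns."""
    exp = np.outer(t.sum(1), t.sum(0)) / t.sum()
    mask = exp > 0
    return (((t - exp)[mask]) ** 2 / exp[mask]).sum()

def bootstrap_chi2_pvalue(table, B=5000, seed=0):
    """Bootstrap p-value for Pearson's test of independence: resample whole
    tables from the multinomial with cell probabilities p_i. * p_.j and
    compare their statistics with the observed one."""
    rng = np.random.default_rng(seed)
    t0 = np.asarray(table, dtype=float)
    n = int(t0.sum())
    p_indep = np.outer(t0.sum(1), t0.sum(0)) / t0.sum() ** 2  # independence model
    obs = pearson_stat(t0)
    stats = np.array([
        pearson_stat(rng.multinomial(n, p_indep.ravel())
                     .reshape(t0.shape).astype(float))
        for _ in range(B)])
    return np.mean(stats >= obs)

# Illustrative usage with small cell counts:
p = bootstrap_chi2_pvalue([[12, 3], [5, 9]])
```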
Closure of the operator product expansion in the non-unitary bootstrap
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
NASA Astrophysics Data System (ADS)
Poli, Francesca
2012-10-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities in a wide range of βN, reducing the no-wall limit. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC [1]. Fully non-inductive configurations with current in the range of 7-10 MA and various heating mixes (NB, EC, IC and LH) have been studied against variations of the pressure profile peaking and of the Greenwald fraction. It is found that stable equilibria have qmin > 2 and moderate ITBs at 2/3 of the minor radius [2]. The ExB flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H&CD sources that maintain reverse or weak magnetic shear profiles throughout the discharge and ρ(qmin) ≥ 0.5 are the focus of this work. The ITER EC upper launcher, designed for NTM control, can provide enough current drive off-axis to sustain moderate ITBs at mid-radius and maintain a non-inductive current of 8-9 MA and H98 ≥ 1.5 with the day-one heating mix. LH heating and current drive is effective in modifying the current profile off-axis, facilitating the formation of stronger ITBs in the ramp-up phase, their sustainment at larger radii, and a larger bootstrap fraction. The implications for steady state operation and fusion performance are discussed. [1] Jardin S.C. et al., J. Comput. Phys. 66 (1986) 481. [2] Poli F.M. et al., Nucl. Fusion 52 (2012) 063027.
Why are they missing? : Bioinformatics characterization of missing human proteins.
Elguoshy, Amr; Magdeldin, Sameh; Xu, Bo; Hirao, Yoshitoshi; Zhang, Ying; Kinoshita, Naohiko; Takisawa, Yusuke; Nameta, Masaaki; Yamamoto, Keiko; El-Refy, Ali; El-Fiky, Fawzy; Yamamoto, Tadashi
2016-10-21
NeXtProt is a web-based protein knowledge platform that supports research on human proteins. NeXtProt (release 2015-04-28) lists 20,060 proteins; among them, 3373 canonical proteins (16.8%) lack credible experimental evidence at the protein level (PE2:PE5) and are therefore considered "missing proteins". A comprehensive bioinformatic workflow has been proposed to analyze these "missing" proteins. The aims of the current study were to analyze the physicochemical properties and the existence and distribution of tryptic cleavage sites, and to pinpoint the signature peptides of the missing proteins. Our findings showed that 23.7% of missing proteins were hydrophobic proteins possessing transmembrane domains (TMD). Also, forty missing entries generated tryptic peptides that were either out of the mass detection range (>30 aa) or mapped to different proteins (<9 aa). Additionally, 21% of missing entries did not generate any unique tryptic peptides. An in silico endopeptidase combination strategy increased the possibility of missing protein identification. Coherently, using both a mature protein database and a signal peptidome database could be a promising option to identify some missing proteins by targeting their unique N-terminal tryptic peptide from the mature protein database and/or C-terminal tryptic peptide from the signal peptidome database. In conclusion, identification of missing proteins requires additional consideration during sample preparation, extraction, digestion and data analysis to increase their incidence of identification. Copyright © 2016. Published by Elsevier B.V.
Confidence Interval Coverage for Cohen's Effect Size Statistic
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2006-01-01
Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and bias-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…
A Bootstrap Procedure of Propensity Score Estimation
ERIC Educational Resources Information Center
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
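The abstract is truncated before the construction details, but the core idea — stabilizing propensity score estimates by aggregating over bootstrap resamples — can be sketched. The snippet below is a hypothetical reading, not necessarily Bai's published algorithm: it refits a logistic model on each resample and averages the predicted scores.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def bootstrap_propensity(X, treated, n_boot=200, seed=0):
    """Average propensity scores over models refit on bootstrap resamples.
    X: numpy array of covariates; treated: 0/1 treatment indicator."""
    rng = np.random.default_rng(seed)
    n = len(treated)
    scores = np.zeros((n_boot, n))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample with replacement
        model = LogisticRegression(max_iter=1000).fit(X[idx], treated[idx])
        scores[b] = model.predict_proba(X)[:, 1]  # score the full sample
    return scores.mean(axis=0)
```

The averaged scores could then be fed to any of the matching methods the abstract lists (nearest neighbor, caliper, etc.).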
Takada, M; Sugimoto, M; Ohno, S; Kuroi, K; Sato, N; Bando, H; Masuda, N; Iwata, H; Kondo, M; Sasano, H; Chow, L W C; Inamoto, T; Naito, Y; Tomita, M; Toi, M
2012-07-01
The nomogram, a standard technique that uses multiple characteristics to predict the efficacy of treatment and the likelihood of a specific status of an individual patient, has been used for prediction of response to neoadjuvant chemotherapy (NAC) in breast cancer patients. The aim of this study was to develop a novel computational technique to predict the pathological complete response (pCR) to NAC in primary breast cancer patients. A mathematical model using alternating decision trees, a variant of the decision tree, was developed using 28 clinicopathological variables that were retrospectively collected from patients treated with NAC (n = 150), and validated using an independent dataset from a randomized controlled trial (n = 173). The model selected 15 variables to predict the pCR, yielding area under the receiver operating characteristic curve (AUC) values of 0.766 (95% confidence interval (CI) 0.671-0.861, P < 0.0001) in cross-validation using the training dataset and 0.787 (95% CI 0.716-0.858, P < 0.0001) in the validation dataset. Among the three subtypes of breast cancer, the luminal subgroup showed the best discrimination (AUC = 0.779, 95% CI 0.641-0.917, P = 0.0059). The developed model (AUC = 0.805, 95% CI 0.716-0.894, P < 0.0001) outperformed multivariate logistic regression (AUC = 0.754, 95% CI 0.651-0.858, P = 0.00019) on the validation dataset without missing values (n = 127). Several analyses, e.g. bootstrap analysis, revealed that the developed model was insensitive to missing values and also tolerant to distribution bias among the datasets. Our model based on clinicopathological variables showed high predictive ability for pCR. This model might improve the prediction of the response to NAC in primary breast cancer patients.
Gussenhoven, A H M; van Wier, M F; Bosmans, J E; Dekkers, J C; van Mechelen, W
2013-01-01
The objective of this study was to determine whether a lifestyle intervention with individual counselling was cost-effective for reducing body weight compared with usual care from a company perspective. Overweight employees were recruited and randomly assigned to the intervention groups, either phone or Internet, or the control group. The intervention was based on a cognitive behavioural approach and addressed physical activity and diet. Self-reported body weight was collected at baseline and 12 months follow-up. Intervention costs and costs of sick leave days based on gross and net lost productivity days (GLPDs/NLPDs) obtained from the participating companies were calculated. Missing data were imputed using multiple imputation techniques. Uncertainty surrounding the differences in costs and the incremental cost-effectiveness ratios (ICER) was estimated by bootstrapping techniques, and presented on cost-effectiveness planes and cost-effectiveness acceptability curves. No statistically significant differences in total costs were found between the intervention groups and control group, though mean total costs in both intervention groups tended to be higher than those in the control group. The ICER of the Internet group compared with the control group was €59 per kilogram of weight loss based on GLPD costs. The probability of cost effectiveness of the Internet intervention was 45% at a willingness-to-pay of €0 per extra kilogram weight loss and 75% at a willingness-to-pay of €1500 per extra kilogram body weight loss. Comparable results were found for the phone intervention. The intervention was not cost effective in comparison with usual care from the company perspective. Due to the large amount of missing data, it is not possible to draw firm conclusions.
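As a rough illustration of the bootstrap machinery behind the cost-effectiveness planes and acceptability curves described above, the sketch below resamples each group, forms incremental cost and effect differences, and computes the probability of cost-effectiveness over a grid of willingness-to-pay values. The names and the simple two-group setup are illustrative assumptions; the study's actual analysis additionally involved multiply imputed data.

```python
import numpy as np

def icer_and_ceac(costs_i, eff_i, costs_c, eff_c, wtp_grid,
                  n_boot=5000, seed=0):
    """Bootstrap incremental costs/effects (intervention vs control) and the
    probability of cost-effectiveness at each willingness-to-pay value."""
    rng = np.random.default_rng(seed)
    d_cost, d_eff = np.empty(n_boot), np.empty(n_boot)
    for b in range(n_boot):
        i = rng.integers(0, len(costs_i), len(costs_i))
        c = rng.integers(0, len(costs_c), len(costs_c))
        d_cost[b] = costs_i[i].mean() - costs_c[c].mean()
        d_eff[b] = eff_i[i].mean() - eff_c[c].mean()
    icer = d_cost.mean() / d_eff.mean()            # cost per unit of effect
    # Net monetary benefit > 0 means cost-effective at that WTP.
    ceac = [(wtp * d_eff - d_cost > 0).mean() for wtp in wtp_grid]
    return icer, ceac
```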
Liu, Dajiang J; Leal, Suzanne M
2012-10-05
Next-generation sequencing has led to many complex-trait rare-variant (RV) association studies. Although single-variant association analysis can be performed, it is grossly underpowered. Therefore, researchers have developed many RV association tests that aggregate multiple variant sites across a genetic region (e.g., gene), and test for the association between the trait and the aggregated genotype. After these aggregate tests detect an association, it is only possible to estimate the average genetic effect for a group of RVs. As a result of the "winner's curse," such an estimate can be biased. Although for common variants one can obtain unbiased estimates of genetic parameters by analyzing a replication sample, for RVs it is desirable to obtain unbiased genetic estimates for the study where the association is identified. This is because there can be substantial heterogeneity of RV sites and frequencies even among closely related populations. In order to obtain an unbiased estimate for aggregated RV analysis, we developed bootstrap-sample-split algorithms to reduce the bias of the winner's curse. Unbiased estimates are of great importance for understanding the population-specific contribution of RVs to the heritability of complex traits. We also demonstrate both theoretically and via simulations that for aggregate RV analysis the genetic variance for a gene or region will always be underestimated, sometimes substantially, because of the presence of noncausal variants or because of the presence of causal variants with effects of different magnitudes or directions. Therefore, even if RVs play a major role in the complex-trait etiologies, a portion of the heritability will remain missing, and the contribution of RVs to the complex-trait etiologies will be underestimated. Copyright © 2012 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
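Liu and Leal's bootstrap-sample-split algorithms are more elaborate than can be shown here, but the underlying idea — detect on one part of each bootstrap replicate, estimate on the held-out part, and average — admits a compact sketch. `detect` and `estimate` are hypothetical user-supplied callables (e.g., a burden test and a least-squares effect estimate).

```python
import numpy as np

def bootstrap_sample_split(burden, trait, detect, estimate,
                           n_boot=1000, frac=0.5, seed=0):
    """Winner's-curse bias reduction: within each bootstrap replicate,
    test for association on one split and, if detected, record the effect
    estimated on the held-out split; return the average held-out estimate."""
    rng = np.random.default_rng(seed)
    n = len(trait)
    held_out = []
    for _ in range(n_boot):
        idx = rng.permutation(rng.integers(0, n, n))  # shuffled bootstrap sample
        disc, est = idx[:int(frac * n)], idx[int(frac * n):]
        if detect(burden[disc], trait[disc]):
            held_out.append(estimate(burden[est], trait[est]))
    return float(np.mean(held_out)) if held_out else float("nan")
```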
The Sexual Exploitation of Missing Children: A Research Review.
ERIC Educational Resources Information Center
Hotaling, Gerald T.; Finkelhor, David
This paper evaluates current knowledge about the prevalence, dynamics, and short- and long-term effects of sexual exploitation among missing children. It is based upon empirical research findings from books, papers presented at professional meetings, doctoral dissertations, works in progress, and more than 75 articles in professional journals.…
26 CFR 601.901 - Missing children shown on penalty mail.
Code of Federal Regulations, 2014 CFR
2014-04-01
... those which have been updated to reflect a missing child's current age through computer enhancement... shelf-life guideline. (d) Reports and contact official. IRS shall compile and submit to OJJDP reports on... Division, Technical Publications Branch, OP:FS:FP:P:3, Room 5613, Internal Revenue Service, 1111...
26 CFR 601.901 - Missing children shown on penalty mail.
Code of Federal Regulations, 2010 CFR
2010-04-01
... those which have been updated to reflect a missing child's current age through computer enhancement... shelf-life guideline. (d) Reports and contact official. IRS shall compile and submit to OJJDP reports on... Division, Technical Publications Branch, OP:FS:FP:P:3, Room 5613, Internal Revenue Service, 1111...
26 CFR 601.901 - Missing children shown on penalty mail.
Code of Federal Regulations, 2011 CFR
2011-04-01
... those which have been updated to reflect a missing child's current age through computer enhancement... shelf-life guideline. (d) Reports and contact official. IRS shall compile and submit to OJJDP reports on... Division, Technical Publications Branch, OP:FS:FP:P:3, Room 5613, Internal Revenue Service, 1111...
26 CFR 601.901 - Missing children shown on penalty mail.
Code of Federal Regulations, 2013 CFR
2013-04-01
... those which have been updated to reflect a missing child's current age through computer enhancement... shelf-life guideline. (d) Reports and contact official. IRS shall compile and submit to OJJDP reports on... Division, Technical Publications Branch, OP:FS:FP:P:3, Room 5613, Internal Revenue Service, 1111...
26 CFR 601.901 - Missing children shown on penalty mail.
Code of Federal Regulations, 2012 CFR
2012-04-01
... those which have been updated to reflect a missing child's current age through computer enhancement... shelf-life guideline. (d) Reports and contact official. IRS shall compile and submit to OJJDP reports on... Division, Technical Publications Branch, OP:FS:FP:P:3, Room 5613, Internal Revenue Service, 1111...
1984-09-28
...variables before simulation of the model; search for reality checks; express uncertainty as a probability density distribution. ... probability that the software contains errors. This prior is updated as test failure data are accumulated. Only a p of 1 (software known to contain...) ... discussed; both parametric and nonparametric versions are presented. It is shown by the author that the bootstrap underlies the jackknife method and
A two-question method for assessing gender categories in the social and medical sciences.
Tate, Charlotte Chuck; Ledbetter, Jay N; Youssef, Cris P
2013-01-01
Three studies (N = 990) assessed the statistical reliability of two methods of determining gender identity that can capture transgender spectrum identities (i.e., current gender identities different from birth-assigned gender categories). Study 1 evaluated a single question with four response options (female, male, transgender, other) on university students. The missing data rate was higher than the valid response rates for transgender and other options using this method. Study 2 evaluated a method of asking two separate questions (i.e., one for current identity and another for birth-assigned category), with response options specific to each. Results showed no missing data and two times the transgender spectrum response rate compared to Study 1. Study 3 showed that the two-question method also worked in community samples, producing near-zero missing data. The two-question method also identified cisgender identities (same birth-assigned and current gender identity), making it a dynamic and desirable measurement tool for the social and medical sciences.
Crisis communication: learning from the 1998 LPG near miss in Stockholm.
Castenfors, K; Svedin, L
2001-12-14
The authors examine current trends in urban risks and resilience in relation to hazardous material transports in general, and crisis communication and the Stockholm liquefied petroleum gas (LPG) near miss in 1998 in particular. The article discusses how current dynamics affecting urban areas, such as decay in terms of increased condensation and limited expansion alternatives, combined with industry site contamination and transports of hazardous materials on old, worn-out physical infrastructure, work together to produce high-risk factors and increase urban vulnerability in large parts of the world today. Crisis communication takes a particularly pronounced role in the article, as challenges in communication and confidence maintenance under conditions of information uncertainty and limited information control are explored. The LPG near-miss case illustrates a Swedish case of urban risk and its tight coupling to hazardous material transports. The case also serves as a current example of Swedish resilience and lack of preparedness in urban crises, with particular observations and lessons learned with regard to crisis communication.
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
Bootstrap Estimation and Testing for Variance Equality.
ERIC Educational Resources Information Center
Olejnik, Stephen; Algina, James
The purpose of this study was to develop a single procedure for comparing population variances that could be used across a variety of distribution forms. Bootstrap methodology was used to estimate the variability of the sample variance statistic when the population distribution was normal, platykurtic, or leptokurtic. The data for the study were generated and…
Bootstrapping the Syntactic Bootstrapper: Probabilistic Labeling of Prosodic Phrases
ERIC Educational Resources Information Center
Gutman, Ariel; Dautriche, Isabelle; Crabbé, Benoît; Christophe, Anne
2015-01-01
The "syntactic bootstrapping" hypothesis proposes that syntactic structure provides children with cues for learning the meaning of novel words. In this article, we address the question of how children might start acquiring some aspects of syntax before they possess a sizeable lexicon. The study presents two models of early syntax…
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Porter, Teresita M; Gibson, Joel F; Shokralla, Shadi; Baird, Donald J; Golding, G Brian; Hajibabaei, Mehrdad
2014-01-01
Current methods to identify unknown insect (class Insecta) cytochrome c oxidase (COI barcode) sequences often rely on distance thresholds that can be difficult to define, sequence similarity cut-offs, or monophyly. Some of the most commonly used metagenomic classification methods do not provide a measure of confidence for the taxonomic assignments they provide. The aim of this study was to use a naïve Bayesian classifier (Wang et al., Applied and Environmental Microbiology, 2007; 73: 5261) to automate taxonomic assignments for large batches of insect COI sequences such as data obtained from high-throughput environmental sequencing. This method provides rank-flexible taxonomic assignments with an associated bootstrap support value, and it is faster than the BLAST-based methods commonly used in environmental sequence surveys. We have developed and rigorously tested the performance of three different training sets using leave-one-out cross-validation, two field data sets, and targeted testing of Lepidoptera, Diptera and Mantodea sequences obtained from the Barcode of Life Data system. We found that type I error rates (incorrect taxonomic assignments with high bootstrap support) were already relatively low but could be lowered further by ensuring that all query taxa are actually present in the reference database. Choosing bootstrap support cut-offs according to query length and summarizing taxonomic assignments to more inclusive ranks can also help to reduce error while retaining the maximum number of assignments. Additionally, we highlight gaps in the taxonomic and geographic representation of insects in public sequence databases that will require further work by taxonomists to improve the quality of assignments generated using any method.
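The classifier of Wang et al. derives its bootstrap support by repeatedly classifying random subsets of the query's 8-mer "words" (one-eighth of them per trial) and counting agreement. A schematic version, with `classify` standing in for a trained naïve Bayes model (a hypothetical callable), might look like this:

```python
import random
from collections import Counter

def bootstrap_support(query_kmers, classify, n_boot=100, frac=0.125, seed=0):
    """Classify random k-mer subsets of a query sequence and report the
    winning taxon together with the fraction of trials that agree on it."""
    rng = random.Random(seed)
    k = max(1, int(frac * len(query_kmers)))
    votes = Counter(classify(rng.sample(query_kmers, k)) for _ in range(n_boot))
    taxon, count = votes.most_common(1)[0]
    return taxon, count / n_boot      # assignment and its bootstrap support
```

Support values from such trials are what the cut-off recommendations above (by query length and taxonomic rank) would be applied to.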
Identifying Heat Waves in Florida: Considerations of Missing Weather Data
Leary, Emily; Young, Linda J.; DuClos, Chris; Jordan, Melissa M.
2015-01-01
Background Using current climate models, regional-scale changes for Florida over the next 100 years are predicted to include warming over terrestrial areas and very likely increases in the number of high temperature extremes. No uniform definition of a heat wave exists. Most past research on heat waves has focused on evaluating the aftermath of known heat waves, with minimal consideration of missing exposure information. Objectives To identify and discuss methods of handling and imputing missing weather data and how those methods can affect identified periods of extreme heat in Florida. Methods In addition to ignoring missing data, temporal, spatial, and spatio-temporal models are described and utilized to impute missing historical weather data from 1973 to 2012 from 43 Florida weather monitors. Calculated thresholds are used to define periods of extreme heat across Florida. Results Modeling of missing data and imputing missing values can affect the identified periods of extreme heat, through the missing data itself or through the computed thresholds. The differences observed are related to the amount of missingness during June, July, and August, the warmest months of the warm season (April through September). Conclusions Missing data considerations are important when defining periods of extreme heat. Spatio-temporal methods are recommended for data imputation. A heat wave definition that incorporates information from all monitors is advised. PMID:26619198
Overview of physics research on the TCV tokamak
NASA Astrophysics Data System (ADS)
Fasoli, A.; TCV Team
2009-10-01
The Tokamak à Configuration Variable (TCV) tokamak is equipped with high-power (4.5 MW), real-time-controllable EC systems and flexible shaping, and plays an important role in fusion research by broadening the parameter range of reactor relevant regimes, by investigating tokamak physics questions and by developing new control tools. Steady-state discharges are achieved, in which the current is entirely self-generated through the bootstrap mechanism, a fundamental ingredient for ITER steady-state operation. The discharge remains quiescent over several current redistribution times, demonstrating that a self-consistent, 'bootstrap-aligned' equilibrium state is possible. Electron internal transport barrier regimes sustained by EC current drive have also been explored. MHD activity is shown to be crucial in scenarios characterized by large and slow oscillations in plasma confinement, which in turn can be modified by small Ohmic current perturbations altering the barrier strength. In studies of the relation between anomalous transport and plasma shape, the observed dependences of the electron thermal diffusivity on triangularity (direct) and collisionality (inverse) are qualitatively reproduced by non-linear gyro-kinetic simulations and shown to be governed by TEM turbulence. Parallel SOL flows are studied for their importance for material migration. Flow profiles are measured using a reciprocating Mach probe by changing from lower to upper single-null diverted equilibria and shifting the plasmas vertically. The dominant, field-direction-dependent Pfirsch-Schlüter component is found to be in good agreement with theoretical predictions. A field-direction-independent component is identified and is consistent with flows generated by transient over-pressure due to ballooning-like interchange turbulence. Initial high-resolution infrared images confirm that ELMs have a filamentary structure, while fast, localized radiation measurements reveal that ELM activity first appears in the X-point region. Real time control techniques are currently being applied to EC multiple independent power supplies and beam launchers, e.g. to control the plasma current in fully non-inductive conditions, and the plasma elongation through current broadening by far-off-axis heating at constant shaping field.
Predicting missing links in complex networks based on common neighbors and distance
Yang, Jinxuan; Zhang, Xiao-Dong
2016-01-01
The algorithms based on the common neighbors metric to predict missing links in complex networks are very popular, but most of these algorithms do not account for missing links between nodes with no common neighbors. Reconstructing networks with these methods is not accurate enough in some cases, especially when node pairs share few common neighbors. In this paper we propose a new algorithm based on common neighbors and distance to improve the accuracy of link prediction. The proposed algorithm is remarkably effective in predicting missing links between nodes with no common neighbors and performs better than most existing currently used methods for a variety of real-world networks without increasing complexity. PMID:27905526
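The abstract does not reproduce the exact index, so the sketch below only conveys the general recipe under stated assumptions: score non-adjacent pairs by their common-neighbour count, and fall back on an inverse shortest-path distance (with an assumed cutoff `max_dist`) for pairs that share none.

```python
import itertools
import networkx as nx

def cn_distance_scores(G, max_dist=3):
    """Link-prediction scores from common neighbours, with an
    inverse-distance fallback for pairs having no common neighbour."""
    scores = {}
    for u, v in itertools.combinations(G.nodes, 2):
        if G.has_edge(u, v):
            continue
        cn = len(list(nx.common_neighbors(G, u, v)))
        if cn > 0:
            scores[(u, v)] = cn
        else:
            try:
                d = nx.shortest_path_length(G, u, v)
                scores[(u, v)] = 1.0 / d if d <= max_dist else 0.0
            except nx.NetworkXNoPath:
                scores[(u, v)] = 0.0
    return scores                     # higher score = more likely missing link
```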
Performance Testing of Lidar Components Subjected to Space Exposure in Space via MISSE 7 Mission
NASA Technical Reports Server (NTRS)
Prasad, Narasimha S.
2012-01-01
The objective of the Materials International Space Station Experiment (MISSE) is to study the performance of novel materials when subjected to the synergistic effects of the harsh space environment for several months. MISSE missions provide an opportunity for developing space-qualifiable materials. Several laser and lidar components were sent by NASA Langley Research Center (LaRC) as a part of the MISSE 7 mission. The MISSE 7 module was transported to the International Space Station (ISS) via the STS-129 mission that was launched on Nov 16, 2009. Later, the MISSE 7 module was brought back to Earth via STS-134, which landed on June 1, 2011. The MISSE 7 module, which was subjected to exposure in the space environment for more than one and a half years, included a fiber laser, solid-state laser gain materials, detectors, and a semiconductor laser diode. Performance testing of these components is now progressing. In this paper, the current progress on post-flight performance testing of a high-speed photodetector and a balanced receiver is discussed. Preliminary findings show that detector characteristics did not undergo any significant degradation.
Bootstrapping N=2 chiral correlators
NASA Astrophysics Data System (ADS)
Lemos, Madalena; Liendo, Pedro
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
Kania, Rachel; Cale, Jesse
2018-03-01
The concept of bystander intervention is gaining popularity in universities as a mechanism to prevent sexual violence. Prior research has focused on correlates of bystanders' intentions to intervene and intervention behaviors in situations where there is a risk of sexual violence. The current study builds on this literature by exploring the nature of missed opportunities, including perceived barriers to intervention. In all, 380 Australian undergraduate university students completed an online survey. Measures included a rape myth acceptance scale, bystander intentions to intervene, actual intervention behaviors, missed opportunities for intervention, and perceived barriers for missed opportunities. Promisingly, students reported high levels of intentions to intervene in situations where there was a risk of sexual violence and reported relatively few missed opportunities to do so when these situations did occur. Intervention behaviors varied by important demographic characteristics such as gender, age, attitudes toward sexual violence, and the nature of the situation. Younger female students, with lower levels of rape myth acceptance, who had previously engaged in bystander intervention behaviors were more likely to report intentions to intervene in future risky situations, and female international students reported fewer missed opportunities for intervention. The most common barrier to intervention for identified missed opportunities was a failure to recognize situations as having a potential risk for sexual violence, and students were most likely to intervene in situations when the opportunity to help a friend in distress arose. This study provides some preliminary empirical evidence about bystander intervention against sexual violence among Australian university students, and identifies unique contexts for intervention and what current barriers to intervention may be.
ERIC Educational Resources Information Center
Hochstetler, Sarah
2014-01-01
This article explores a fundamental indicator missing from current assessments of preservice teacher effectiveness, namely the importance of competencies central to professionalism and to impact on student learning. Drawing from her classroom experiences and relevant research, the author makes the case that evaluation of English teacher…
Improving the Utilisation of Management Information Systems in Secondary Schools
ERIC Educational Resources Information Center
Bosker, R. J.; Branderhorst, E. M.; Visscher, A. J.
2007-01-01
Although most secondary schools do use management information systems (MISs), these systems tend not to be used to support higher order managerial activities but are currently primarily used for clerical purposes. This situation is unsatisfactory, as fully utilised MISs could offer invaluable support to schools, which are increasingly being granted…
The effect of anisotropic heat transport on magnetic islands in 3-D configurations
NASA Astrophysics Data System (ADS)
Schlutt, M. G.; Hegna, C. C.
2012-08-01
An analytic theory of nonlinear pressure-induced magnetic island formation using a boundary layer analysis is presented. This theory extends previous work by including the effects of finite parallel heat transport and is applicable to general three dimensional magnetic configurations. In this work, particular attention is paid to the role of finite parallel heat conduction in the context of pressure-induced island physics. It is found that localized currents that require self-consistent deformation of the pressure profile, such as resistive interchange and bootstrap currents, are attenuated by finite parallel heat conduction when the magnetic islands are sufficiently small. However, these anisotropic effects do not change saturated island widths caused by Pfirsch-Schlüter current effects. Implications for finite pressure-induced island healing are discussed.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
ERIC Educational Resources Information Center
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning
ERIC Educational Resources Information Center
Luntley, Michael
2017-01-01
This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…
Higher curvature gravities, unlike GR, cannot be bootstrapped from their (usual) linearizations
NASA Astrophysics Data System (ADS)
Deser, S.
2017-12-01
We show that higher curvature order gravities, in particular the propagating quadratic curvature models, cannot be derived by self-coupling from their linear, flat space, forms, except through an unphysical version of linearization; only GR can. Separately, we comment on an early version of the self-coupling bootstrap.
The new version of EPA’s positive matrix factorization (EPA PMF) software, 5.0, includes three error estimation (EE) methods for analyzing factor analytic solutions: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP)...
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
How to Bootstrap a Human Communication System
ERIC Educational Resources Information Center
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. © The Author(s) 2015.
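Only the bias-correction step lends itself to a short sketch; generating valid bootstrap replicates of DEA scores requires a smoothed bootstrap (as in the Simar-Wilson strand of the Bootstrap-DEA literature), which is beyond a few lines. Here `theta_boot` is assumed to be an (n_boot × n_units) array of replicate scores, however obtained.

```python
import numpy as np

def bias_corrected(theta_hat, theta_boot):
    """Bootstrap bias correction of efficiency scores:
    bias ~= mean(bootstrap scores) - original; corrected = original - bias."""
    theta_hat = np.asarray(theta_hat)
    bias = np.mean(theta_boot, axis=0) - theta_hat
    return theta_hat - bias
```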
Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J
2018-05-01
Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.
NASA Astrophysics Data System (ADS)
Chatthong, B.; Onjun, T.
2016-01-01
A set of heat and particle transport equations with the inclusion of E × B flow and magnetic shear is used to understand the formation and behaviors of edge transport barriers (ETBs) and internal transport barriers (ITBs) in tokamak plasmas based on the two-field bifurcation concept. A simple model that can describe the E × B flow shear and magnetic shear effect in tokamak plasma is used for anomalous transport suppression with the effect of bootstrap current included. Consequently, conditions and formations of ETB and ITB can be visualized and studied. It can be seen that the ETB formation depends sensitively on the E × B flow shear suppression with a small dependence on the magnetic shear suppression. However, the ITB formation depends sensitively on the magnetic shear suppression with a small dependence on the E × B flow shear suppression. Once the H-mode is achieved, the s-curve bifurcation diagram is modified due to an increase of bootstrap current at the plasma edge, resulting in reductions of both L-H and H-L transition thresholds with stronger hysteresis effects. It is also found that both ITB and ETB widths appear to be governed by heat or particle sources and the location of the current peaking. In addition, at a marginal flux just below the L-H threshold, a small perturbation in terms of heat or density fluctuation can result in a transition, which can remain after the perturbation is removed due to the hysteresis effect.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2014-06-15
The diamagnetic drift effects on the low-n magnetohydrodynamic instabilities at the high-mode (H-mode) pedestal are investigated in this paper with the inclusion of bootstrap current for equilibrium and rotation effects for stability, where n is the toroidal mode number. The AEGIS (Adaptive EiGenfunction Independent Solutions) code [L. J. Zheng and M. T. Kotschenreuther, J. Comput. Phys. 211 (2006)] is extended to include the diamagnetic drift effects. This can be viewed as the lowest order approximation of the finite Larmor radius effects in consideration of the pressure gradient steepness at the pedestal. H-mode discharges at the Joint European Torus are reconstructed numerically using the VMEC code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)], with bootstrap current taken into account. Generally speaking, the diamagnetic drift effects are stabilizing. Our results show that the effectiveness of diamagnetic stabilization depends sensitively on the safety factor value (q_s) at the safety-factor reversal or plateau region. Diamagnetic stabilization is weaker when q_s is larger than an integer, and stronger when q_s is smaller than, or only slightly larger than, an integer. We also find that the diamagnetic drift effects depend sensitively on the rotation direction: diamagnetic stabilization in the co-rotation case is stronger than in the counter-rotation case with respect to the ion diamagnetic drift direction.
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also a synthetic but following a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for its construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless if the standard error is estimated from a parametric equation or from bootstrap. © 2010 International Association for Mathematical Geosciences.
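A minimal Gaussian sketch of the LU-decomposition route used to generate spatially correlated resamples: factor the modeled covariance matrix as cov = L Lᵀ, so independent standard normals z map to correlated draws L z. The paper's generalized bootstrap involves further steps (e.g., handling the non-Gaussian example) not shown here.

```python
import numpy as np

def correlated_resamples(cov, n_draws, mean=0.0, seed=0):
    """Draw n_draws spatially correlated Gaussian vectors with the given
    covariance matrix via its Cholesky (LU-type) factorization."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov)                  # lower-triangular factor
    z = rng.standard_normal((cov.shape[0], n_draws))
    return mean + L @ z                          # one column per resample
```

Each resample can then be treated as a new "sample" from which an empirical semivariogram and fitted model are computed, yielding the percentile confidence intervals discussed above.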
Arnold, Robert W; Jacob, Jack; Matrix, Zinnia
2012-01-01
Screening by neonatologists and staging by ophthalmologists is a cost-effective intervention, but inadvertent missed examinations create a high liability. Paper tracking, bedside schedule reminders, and a computer scheduling and reminder program were compared for speed of input and retrospective missed examination rate. A neonatal intensive care unit (NICU) process was then programmed for cloud-based distribution for inpatient and outpatient retinopathy of prematurity monitoring. Over 11 years, 367 premature infants in one NICU were prospectively monitored. The initial paper system missed 11% of potential examinations, the Windows server-based system missed 2%, and the current cloud-based system missed 0% of potential inpatient and outpatient examinations. Computer input of examinations took the same or less time than paper recording. A computer application with a deliberate NICU process improved the proportion of eligible neonates getting their scheduled eye examinations in a timely manner. Copyright 2012, SLACK Incorporated.
Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.
Ji, Ming; Xiong, Chengjie; Grundman, Michael
2003-10-01
In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on the Akaike Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, using our hypothesis test to analyze Mini-Mental State Exam (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our results show that, despite a large amount of missing data, accelerated decline did occur for the MMSE among AD patients. Our finding supports the clinical belief that a change point exists during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
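Because the change point is unknown under the alternative, the null distribution is simulated rather than derived. A generic sketch of that parametric bootstrap, with `fit_null`, `fit_alt`, and `simulate_null` as hypothetical callables (the fitters return a residual sum of squares and fitted parameters; the simulator draws a new response from the fitted null model):

```python
import numpy as np

def parametric_bootstrap_pvalue(t, y, fit_null, fit_alt, simulate_null,
                                n_boot=999, seed=0):
    """P-value for a change-point test via parametric bootstrap: the
    statistic is the drop in residual sum of squares from the constant-rate
    (null) model to the bilinear (alternative) model."""
    rng = np.random.default_rng(seed)
    rss0, params0 = fit_null(t, y)
    rss1, _ = fit_alt(t, y)
    stat = rss0 - rss1                           # observed statistic
    null_stats = np.empty(n_boot)
    for b in range(n_boot):
        y_sim = simulate_null(t, params0, rng)   # data from fitted null model
        null_stats[b] = fit_null(t, y_sim)[0] - fit_alt(t, y_sim)[0]
    return (1 + np.sum(null_stats >= stat)) / (1 + n_boot)
```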
Sieberg, Christine B; Manganella, Juliana; Manalo, Gem; Simons, Laura E; Hresko, M Timothy
2017-12-01
There is a need to better assess patient satisfaction and surgical outcomes. The purpose of the current study is to identify how preoperative expectations can impact postsurgical satisfaction among youth with adolescent idiopathic scoliosis undergoing spinal fusion surgery. The present study includes patients with adolescent idiopathic scoliosis undergoing spinal fusion surgery enrolled in a prospective, multicentered registry examining postsurgical outcomes. The Scoliosis Research Society Questionnaire-Version 30, which assesses pain, self-image, mental health, and satisfaction with management, along with the Spinal Appearance Questionnaire, which measures surgical expectations, was administered to 190 patients before surgery and 1 and 2 years postoperatively. Regression analyses with bootstrapping (n = 5000 bootstrap samples) were conducted with 99% bias-corrected confidence intervals to examine the extent to which preoperative expectations for spinal appearance mediated the relationship between presurgical mental health and pain and 2-year postsurgical satisfaction. Results indicate that preoperative mental health, pain, and expectations are predictive of postsurgical satisfaction. With the shifting health care system, physicians may want to consider patient mental health, pain, and expectations before surgery to optimize satisfaction and ultimately improve clinical care and patient outcomes. Level I, prognostic study.
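For readers unfamiliar with the bias-corrected bootstrap used here, the sketch below shows the BC (not BCa) percentile interval for a simple indirect effect a·b in an x → m → y mediation. It is a generic illustration under stated assumptions; the study's models involve measures and covariates not shown, and alpha = 0.01 mirrors the 99% intervals reported above.

```python
import numpy as np
from scipy.stats import norm

def bc_bootstrap_indirect(x, m, y, n_boot=5000, alpha=0.01, seed=0):
    """Bias-corrected bootstrap CI for the indirect effect a*b
    (both paths estimated by ordinary least squares)."""
    def ab(xs, ms, ys):
        a = np.polyfit(xs, ms, 1)[0]                     # path a: x -> m
        X = np.column_stack([np.ones_like(xs), ms, xs])
        b = np.linalg.lstsq(X, ys, rcond=None)[0][1]     # path b: m -> y | x
        return a * b

    rng = np.random.default_rng(seed)
    n = len(x)
    est = ab(x, m, y)
    boots = np.array([ab(x[i], m[i], y[i])
                      for i in (rng.integers(0, n, n) for _ in range(n_boot))])
    p = np.clip((boots < est).mean(), 1 / n_boot, 1 - 1 / n_boot)
    z0 = norm.ppf(p)                                     # bias-correction term
    lo, hi = norm.cdf(2 * z0 + norm.ppf([alpha / 2, 1 - alpha / 2]))
    return est, np.quantile(boots, [lo, hi])
```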
Ishwaran, Hemant; Lu, Min
2018-06-04
Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
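The subsampling idea admits a bare-bones sketch: compute VIMP on many subsamples of size m drawn without replacement and rescale their spread by m/n to approximate the full-sample variance. `vimp` is any importance function (e.g., permutation importance from a fitted random forest); the authors' estimators, including the delete-d jackknife variant, contain refinements not shown.

```python
import numpy as np

def subsample_vimp_variance(X, y, vimp, m, n_sub=100, seed=0):
    """Subsampling variance estimate for variable importance: the variance
    of size-m subsample VIMPs, rescaled by m/n."""
    rng = np.random.default_rng(seed)
    n = len(y)
    vals = np.empty((n_sub, X.shape[1]))
    for s in range(n_sub):
        idx = rng.choice(n, size=m, replace=False)   # subsample, no replacement
        vals[s] = vimp(X[idx], y[idx])
    return (m / n) * vals.var(axis=0, ddof=1)
```

A normal-approximation confidence interval for each variable's VIMP then follows directly from the estimated variance.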
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-01-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers currently report uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations, the Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476
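As context for the bootstrap column of that comparison, a sketch of a nonparametric bootstrap standard error for a logistic TSRI estimate follows. Variable names are illustrative; the key point is that individuals are resampled jointly and both stages are refit on each replicate, so the resulting standard error accounts for first-stage uncertainty.

```python
import numpy as np
import statsmodels.api as sm

def tsri_bootstrap_se(z, x, y, n_boot=1000, seed=0):
    """TSRI estimate and bootstrap SE: stage 1 regresses exposure x on
    instrument z; stage 2 is a logistic model for binary outcome y on x
    plus the stage-1 residual (the 'residual inclusion')."""
    def tsri(zz, xx, yy):
        s1 = sm.OLS(xx, sm.add_constant(zz)).fit()
        X2 = sm.add_constant(np.column_stack([xx, s1.resid]))
        return sm.Logit(yy, X2).fit(disp=0).params[1]   # coefficient on x

    rng = np.random.default_rng(seed)
    n = len(y)
    boots = [tsri(z[i], x[i], y[i])
             for i in (rng.integers(0, n, n) for _ in range(n_boot))]
    return tsri(z, x, y), float(np.std(boots, ddof=1))
```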
Peace of Mind, Academic Motivation, and Academic Achievement in Filipino High School Students.
Datu, Jesus Alfonso D
2017-04-09
Recent literature has recognized the advantageous role of low-arousal positive affect such as feelings of peacefulness and internal harmony in collectivist cultures. However, limited research has explored the benefits of low-arousal affective states in the educational setting. The current study examined the link of peace of mind (PoM) to academic motivation (i.e., amotivation, controlled motivation, and autonomous motivation) and academic achievement among 525 Filipino high school students. Findings revealed that PoM was positively associated with academic achievement (β = .16, p < .05), autonomous motivation (β = .48, p < .001), and controlled motivation (β = .25, p < .01). As expected, PoM was negatively related to amotivation (β = -.19, p < .05), and autonomous motivation was positively associated with academic achievement (β = .52, p < .01). Furthermore, the results of bias-corrected bootstrap analyses at a 95% confidence interval based on 5,000 bootstrapped resamples demonstrated that peace of mind had an indirect influence on academic achievement through the mediating effect of autonomous motivation. In terms of effect sizes, the findings showed that PoM explained about 1% to 18% of the variance in academic achievement and motivation. The theoretical and practical implications of the results are elucidated.
NASA Technical Reports Server (NTRS)
Patterson, Richard; Hammoud, Ahmad
2009-01-01
Electronic systems designed for use in deep space and planetary exploration missions are expected to encounter extreme temperatures and wide thermal swings. Silicon-based devices are limited in their wide-temperature capability and usually require extra measures, such as cooling or heating mechanisms, to provide an adequate ambient temperature for proper operation. Silicon-On-Insulator (SOI) technology, on the other hand, has lately been gaining widespread use in applications where high temperatures are encountered. Due to their inherent design, SOI-based integrated circuit chips are able to operate at temperatures higher than those of silicon devices by virtue of reduced leakage currents, the elimination of parasitic junctions, and limited internal heating. In addition, SOI devices provide faster switching, consume less power, and offer improved radiation tolerance. Very little data, however, exist on the performance of such devices and circuits under cryogenic temperatures. In this work, the performance of an SOI bootstrapped, full-bridge driver integrated circuit was evaluated under extreme temperatures and thermal cycling. The investigations were carried out to establish a baseline on the functionality and to determine the suitability of this device for use in space exploration missions under extreme temperature conditions.
Pulling Econometrics Students up by Their Bootstraps
ERIC Educational Resources Information Center
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Accuracy assessment of percent canopy cover, cover type, and size class
H. T. Schreuder; S. Bain; R. C. Czaplewski
2003-01-01
Truth for vegetation cover percent and type is obtained from very large-scale photography (VLSP), stand structure as measured by size classes, and vegetation types from a combination of VLSP and ground sampling. We recommend using the Kappa statistic with bootstrap confidence intervals for overall accuracy, and similarly bootstrap confidence intervals for percent...
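A short sketch of the recommended Kappa-with-bootstrap-CI computation, under the assumption of paired reference (e.g., VLSP) and map labels; the percentile interval here stands in for whichever bootstrap interval the authors prefer.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def kappa_bootstrap_ci(reference, mapped, n_boot=2000, alpha=0.05, seed=0):
    """Kappa statistic with a percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    reference, mapped = np.asarray(reference), np.asarray(mapped)
    n = len(reference)
    kappas = [cohen_kappa_score(reference[i], mapped[i])
              for i in (rng.integers(0, n, n) for _ in range(n_boot))]
    lo, hi = np.quantile(kappas, [alpha / 2, 1 - alpha / 2])
    return cohen_kappa_score(reference, mapped), (lo, hi)
```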
ERIC Educational Resources Information Center
Barner, David; Chow, Katherine; Yang, Shu-Ju
2009-01-01
We explored children's early interpretation of numerals and linguistic number marking, in order to test the hypothesis (e.g., Carey (2004). Bootstrapping and the origin of concepts. "Daedalus", 59-68) that children's initial distinction between "one" and other numerals (i.e., "two," "three," etc.) is bootstrapped from a prior distinction between…
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
ERIC Educational Resources Information Center
Hand, Michael L.
1990-01-01
Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
Donald B.K. English
2000-01-01
In this paper I use bootstrap procedures to develop confidence intervals for estimates of total industrial output generated per thousand tourist visits. Mean expenditures from replicated visitor expenditure data included weights to correct for response bias. Impacts were estimated with IMPLAN. Ninety percent interval endpoints were 6 to 16 percent above or below the...
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (log-Pearson Type III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the log-Boughton method, which is based on a fit of plotting positions.
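To make the comparison concrete, here is a minimal sketch of how two of the compared estimators could be bootstrapped; the annual 7-day minimum flow series, the distribution choices, and the use of the full-sample estimate as the reference value are illustrative assumptions, not Tasker's actual data or protocol.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical annual 7-day minimum flows; stands in for an observed series.
    flows = rng.lognormal(mean=3.0, sigma=0.5, size=40)

    def q7_10_lp3(sample):
        # Log-Pearson Type III estimate of the 7-day, 10-year low flow:
        # fit Pearson III to the log-flows, take the 0.1 non-exceedance quantile.
        skew, loc, scale = stats.pearson3.fit(np.log(sample))
        return float(np.exp(stats.pearson3.ppf(0.1, skew, loc=loc, scale=scale)))

    def q7_10_weibull(sample):
        # Weibull-distribution estimate of the same quantile.
        c, loc, scale = stats.weibull_min.fit(sample, floc=0)
        return float(stats.weibull_min.ppf(0.1, c, loc=loc, scale=scale))

    # Bootstrap comparison: resample the observed series, re-estimate with each
    # method, and compare mean square errors about the full-sample estimate.
    B = 500
    for name, est in (("LP3", q7_10_lp3), ("Weibull", q7_10_weibull)):
        ref = est(flows)
        boot = np.array([est(rng.choice(flows, flows.size, replace=True))
                         for _ in range(B)])
        print(name, "MSE:", np.mean((boot - ref) ** 2))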
Missing Data Treatments at the Second Level of Hierarchical Linear Models
ERIC Educational Resources Information Center
St. Clair, Suzanne W.
2011-01-01
The current study evaluated the performance of traditional versus modern missing data treatments (MDTs) in the estimation of fixed effects and variance components for data missing at the second level of a hierarchical linear model (HLM) across 24 different study conditions. Variables manipulated in the analysis included (a) the number of Level-2 variables with missing…
An Introduction to Missing Data in the Context of Differential Item Functioning
ERIC Educational Resources Information Center
Banks, Kathleen
2015-01-01
This article introduces practitioners and researchers to the topic of missing data in the context of differential item functioning (DIF), reviews the current literature on the issue, discusses implications of the review, and offers suggestions for future research. A total of nine studies were reviewed. All of these studies determined what effect…
Chapter 5 - Missing Feedback in Payments for Ecosystem Services
Trista M. Patterson; Dana L. Coelho
2008-01-01
A general systems analysis of current approaches to payments for ecosystem services reveals a weakness: a missing feedback that ought to be in place to push the system toward its goal of balancing human needs with the adaptive capacity of ecosystems. In situations of rising demand for ecosystem services amid limited means for producing them, the likelihood that...
Definitions of Success: Girls at Miss Porter's School Share Their Hopes, Dreams, and Fears
ERIC Educational Resources Information Center
Windsor, Katherine Gladstone
2010-01-01
This study explores how girls currently enrolled and recently graduated from Miss Porter's School, Farmington, Connecticut, define success and the role gender plays in their definition(s). Data were collected from semi-structured student interviews, written responses by the students to a prompt designed to elicit personal conceptions of success,…
Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.
Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt
2016-05-06
We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Miller, Sharon K.; Waters, Deborah L.
2010-01-01
An atomic oxygen fluence monitor was flown as part of the Materials International Space Station Experiment-6 (MISSE-6). The monitor was designed to measure the accumulation of atomic oxygen fluence with time as it impinged upon the ram surface of the MISSE 6B Passive Experiment Container (PEC). This was an active experiment for which data were to be stored on a battery-powered data logger for post-flight retrieval and analysis. The atomic oxygen fluence measurement was accomplished by allowing atomic oxygen to erode two opposing wedges of pyrolytic graphite that partially covered a photodiode. As the wedges of pyrolytic graphite erode, the area of the photodiode that is illuminated by the Sun increases. The short circuit current, which is proportional to the area of illumination, was to be measured and recorded as a function of time. The short circuit current from a different photodiode, which was oriented in the same direction and had an unobstructed view of the Sun, was also to be recorded as a reference current. The ratio of the two separate recorded currents should bear a linear relationship with the accumulated atomic oxygen fluence and be independent of the intensity of solar illumination. Ground hyperthermal atomic oxygen exposure facilities were used to evaluate the linearity of the ratio of short circuit current to atomic oxygen fluence. In flight, the current measurement circuitry failed to operate properly; thus, the overall atomic oxygen mission fluence could only be estimated based on the physical erosion of the pyrolytic graphite wedges. The atomic oxygen fluence was calculated based on knowledge of the space atomic oxygen erosion yield of pyrolytic graphite measured from samples on MISSE 2. The atomic oxygen fluence monitor, the expected results, and a comparison of the mission atomic oxygen fluence based on the erosion of the pyrolytic graphite with that based on Kapton H fluence witness samples are presented in this paper.
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogeneous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62%, and the mean scale efficiency is 88%, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient, than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
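The DEA bootstrap idea can be sketched in a few lines for a toy single-input, single-output technology under constant returns to scale; this naive resampling skeleton is only a stand-in for the homogeneous and two-stage double (Simar-Wilson style) bootstrap procedures the study actually applies, and all data below are fabricated.

    import numpy as np

    rng = np.random.default_rng(14)
    # Toy single-input, single-output data: under constant returns to scale,
    # DEA efficiency reduces to each unit's output/input ratio relative to the best.
    x = rng.uniform(1.0, 10.0, 40)          # e.g., staff hours (hypothetical)
    y = x * rng.uniform(0.4, 1.0, 40)       # e.g., resident-days produced

    scores = (y / x) / (y / x).max()        # conventional DEA scores

    # Naive bootstrap of the frontier: re-evaluate every unit against a
    # resampled reference set, then bias-correct the original scores.
    B = 2000
    boots = np.empty((B, x.size))
    for b in range(B):
        i = rng.integers(0, x.size, x.size)
        boots[b] = (y / x) / (y[i] / x[i]).max()
    bias = boots.mean(axis=0) - scores
    corrected = scores - bias               # bias-corrected scores are lower
    print(corrected[:5])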
Empirical single sample quantification of bias and variance in Q-ball imaging.
Hainline, Allison E; Nath, Vishwesh; Parvathaneni, Prasanna; Blaber, Justin A; Schilling, Kurt G; Anderson, Adam W; Kang, Hakmook; Landman, Bennett A
2018-02-06
The bias and variance of high angular resolution diffusion imaging (HARDI) methods have not been thoroughly explored in the literature; the simulation extrapolation (SIMEX) and bootstrap techniques offer a way to estimate the bias and variance of HARDI metrics. The SIMEX approach is well established in the statistics literature and uses simulation of increasingly noisy data to extrapolate back to a hypothetical case with no noise. The bias of calculated metrics can then be computed by subtracting the SIMEX estimate from the original pointwise measurement. The SIMEX technique has been studied in the context of diffusion imaging to accurately capture the bias in fractional anisotropy measurements in DTI. Herein, we extend the application of SIMEX and bootstrap approaches to characterize bias and variance in metrics obtained from a Q-ball imaging reconstruction of HARDI data. The results demonstrate that SIMEX and bootstrap approaches provide consistent estimates of the bias and variance of generalized fractional anisotropy, respectively. The RMSE for the generalized fractional anisotropy estimates shows a 7% decrease in white matter and an 8% decrease in gray matter when compared with the observed generalized fractional anisotropy estimates. On average, the bootstrap technique results in SD estimates that are approximately 97% of the true variation in white matter, and 86% in gray matter. Both SIMEX and bootstrap methods are flexible, estimate population characteristics based on single scans, and may be extended for bias and variance estimation on a variety of HARDI metrics. © 2018 International Society for Magnetic Resonance in Medicine.
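A minimal sketch of the SIMEX step for a single voxel follows; the stand-in metric, the assumed noise level, and the quadratic extrapolant are illustrative choices, not the authors' actual Q-ball pipeline.

    import numpy as np

    rng = np.random.default_rng(1)

    def metric(signal):
        # Stand-in for a Q-ball metric such as generalized fractional anisotropy.
        return float(np.std(signal) / np.sqrt(np.mean(signal ** 2)))

    signal = rng.normal(100.0, 5.0, size=64)  # one voxel's measurements (hypothetical)
    sigma_hat = 5.0                           # assumed (estimated) noise level

    # SIMEX: add extra noise at increasing levels lambda, average the metric over
    # many noise realizations, fit a quadratic in lambda, and extrapolate to
    # lambda = -1, the hypothetical noise-free case.
    lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    means = [np.mean([metric(signal + rng.normal(0.0, np.sqrt(lam) * sigma_hat,
                                                 signal.size))
                      for _ in range(200)])
             for lam in lambdas]
    coef = np.polyfit(lambdas, means, deg=2)
    simex_estimate = np.polyval(coef, -1.0)
    bias = metric(signal) - simex_estimate    # pointwise measurement minus SIMEX
    print("SIMEX estimate:", simex_estimate, "estimated bias:", bias)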
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more kriging predictions (9/10) compared to GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than that from GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
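A simple bootstrap of a spatial predictor along the lines of SBM might look like the sketch below; Gaussian-process regression is used here as a stand-in for kriging, and the station coordinates, isotope values, and kernel are all hypothetical.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(11)
    # Hypothetical station coordinates and isotope values (e.g., d2H).
    X = rng.uniform(0, 10, size=(40, 2))
    y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.1, 40)
    grid = rng.uniform(0, 10, size=(100, 2))   # prediction locations

    # Simple bootstrap: resample stations with replacement, refit the spatial
    # model, and predict on the grid; the spread across replicates is the
    # uncertainty map, and the mean is the bootstrapped prediction.
    B = 100
    preds = []
    for _ in range(B):
        i = rng.integers(0, X.shape[0], X.shape[0])
        gp = GaussianProcessRegressor(kernel=RBF(), alpha=0.01).fit(X[i], y[i])
        preds.append(gp.predict(grid))
    preds = np.array(preds)
    mean_map, sd_map = preds.mean(axis=0), preds.std(axis=0)
    print(mean_map[:3], sd_map[:3])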
Bootstrapping rapidity anomalous dimensions for transverse-momentum resummation
Li, Ye; Zhu, Hua Xing
2017-01-11
The soft function relevant for transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. Furthermore, an intriguing relation between anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
H. T. Schreuder; M. S. Williams
2000-01-01
In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
ERIC Educational Resources Information Center
Ural, A. Engin; Yuret, Deniz; Ketrez, F. Nihan; Kocbas, Dilara; Kuntay, Aylin C.
2009-01-01
The syntactic bootstrapping mechanism of verb learning was evaluated against child-directed speech in Turkish, a language with rich morphology, nominal ellipsis and free word order. Machine-learning algorithms were run on transcribed caregiver speech directed to two Turkish learners (one hour every two weeks between 0;9 and 1;10) of different…
ERIC Educational Resources Information Center
Seco, Guillermo Vallejo; Izquierdo, Marcelino Cuesta; Garcia, M. Paula Fernandez; Diez, F. Javier Herrero
2006-01-01
The authors compare the operating characteristics of the bootstrap-F approach, a direct extension of the work of Berkovits, Hancock, and Nevitt, with Huynh's improved general approximation (IGA) and the Brown-Forsythe (BF) multivariate approach in a mixed repeated measures design when normality and multisample sphericity assumptions do not hold.…
Sample-based estimation of tree species richness in a wet tropical forest compartment
Steen Magnussen; Raphael Pelissier
2007-01-01
Petersen's capture-recapture ratio estimator and the well-known bootstrap estimator are compared across a range of simulated low-intensity simple random sampling with fixed-area plots of 100 m² in a rich wet tropical forest compartment with 93 tree species in the Western Ghats of India. Petersen's ratio estimator was uniformly superior to the bootstrap...
Common Ground between Form and Content: The Pragmatic Solution to the Bootstrapping Problem
ERIC Educational Resources Information Center
Oller, John W.
2005-01-01
The frame of reference for this article is second or foreign language (L2 or FL) acquisition, but the pragmatic bootstrapping hypothesis applies to language processing and acquisition in any context or modality. It is relevant to teaching children to read. It shows how connections between target language surface forms and their content can be made…
2006-06-13
unweighted pair group method with arithmetic mean (UPGMA) using random tie breaking and uncorrected pairwise distances in MacVector 7.0 (Oxford Molecular). Numbers on branches...denote the UPGMA bootstrap percentage using a highly stringent number (1000) of replications (Felsenstein, 1985). All bootstrap values are 50%, as shown
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA)...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought, and 36 were included. Deletions in megabase pairs (Mb) were determined in order to calculate the size of the interstitial deletion for the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotypes and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentile method, we found wide variability in the prevalence of the different phenotypes (3–100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35–3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
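A percentile-bootstrap confidence interval for a phenotype prevalence is only a few lines of code; the 20-of-36 case split below is hypothetical, not the paper's data.

    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical: 1 = phenotype present, 0 = absent, across 36 case reports.
    cases = np.array([1] * 20 + [0] * 16)

    # Percentile bootstrap: resample cases with replacement, recompute the
    # proportion, and take the 2.5th and 97.5th percentiles of the replicates.
    B = 10_000
    props = np.array([rng.choice(cases, size=cases.size, replace=True).mean()
                      for _ in range(B)])
    lo, hi = np.percentile(props, [2.5, 97.5])
    print(f"prevalence {cases.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")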
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including the following: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status using multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
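One way to read "bootstrap imputation from model-derived probabilities" is sketched below: each replicate draws every patient's condition status from their model probability and recomputes the statistic of interest. The probabilities are fabricated, and the paper's exact estimator may differ.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1000
    p_model = rng.beta(1, 9, size=n)   # hypothetical model-based disease probabilities

    # Bootstrap-style imputation: in each replicate, draw each patient's condition
    # status from their model probability, then recompute the statistic.
    B = 500
    prevalences = []
    for _ in range(B):
        status = rng.random(n) < p_model      # Bernoulli draw per patient
        prevalences.append(status.mean())
    est = np.mean(prevalences)
    se = np.std(prevalences, ddof=1)
    print(f"prevalence estimate {est:.3f} (SE {se:.3f})")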
Reference interval computation: which method (not) to choose?
Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C
2012-07-11
When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If there are no reliable RI data available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way of calculating RIs, provided it satisfies a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
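The two recommended approaches can be sketched as follows; the reference sample, the 0.05 normality cutoff, and the replicate-averaged centiles are illustrative assumptions.

    import numpy as np
    from scipy import stats
    from scipy.special import inv_boxcox

    rng = np.random.default_rng(4)
    x = rng.lognormal(1.0, 0.4, size=120)     # hypothetical reference sample

    # Bootstrap RI: resample, take the 2.5th/97.5th centiles, average over replicates.
    B = 2000
    cents = np.array([np.percentile(rng.choice(x, x.size, replace=True), [2.5, 97.5])
                      for _ in range(B)])
    ri_boot = cents.mean(axis=0)

    # Transformed parametric RI: Box-Cox toward normality, mean +/- 1.96 SD,
    # back-transform; per the paper, use it only if a normality test is passed.
    y, lam = stats.boxcox(x)
    if stats.shapiro(y).pvalue > 0.05:
        mu, sd = np.mean(y), np.std(y, ddof=1)
        ri_param = inv_boxcox(np.array([mu - 1.96 * sd, mu + 1.96 * sd]), lam)
        print("transformed parametric RI:", ri_param)
    print("bootstrap RI:", ri_boot)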
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution
Imai, Mutsumi; Kita, Sotaro
2014-01-01
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
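The core of the tree bootstrap, resampling seeds and then recruits within each tree, fits in a short recursion; the nested-dict tree encoding and the toy recruitment data are assumptions of this sketch, not the authors' implementation.

    import random

    # A recruitment tree as nested dicts: each key is a respondent id,
    # each value is the list of subtrees for the people they recruited.
    tree = {"seed": [{"a": [{"c": []}, {"d": []}]}, {"b": []}]}

    def resample_tree(node):
        # Tree bootstrap: keep the node, resample its recruits with replacement,
        # then recurse into each (re)sampled branch.
        (rid, children), = node.items()
        new_children = [resample_tree(random.choice(children))
                        for _ in children] if children else []
        return {rid: new_children}

    def bootstrap_forest(seeds, B=1000):
        # First resample the seeds with replacement, then resample within trees.
        return [[resample_tree(random.choice(seeds)) for _ in seeds]
                for _ in range(B)]

    replicates = bootstrap_forest([tree], B=10)
    print(replicates[0])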
Analysis of International Space Station Vehicle Materials on MISSE 6
NASA Technical Reports Server (NTRS)
Finckenor, Miria; Golden, Johnny; Kravchenko, Michael; O'Rourke, Mary Jane
2010-01-01
The International Space Station Materials and Processes team has multiple material samples on MISSE 6, 7, and 8 to observe Low Earth Orbit (LEO) environmental effects on Space Station materials. Optical properties, thickness/mass loss, surface elemental analysis, and visual and microscopic analysis for surface change are some of the techniques employed in this investigation. Results for the following MISSE 6 sample materials will be presented: deionized-water-sealed anodized aluminum; Hyzod(tm) polycarbonate used to temporarily protect ISS windows; Russian quartz window material; Beta Cloth with Teflon(tm) reformulated without perfluorooctanoic acid (PFOA); and electroless nickel. A discussion of current and future MISSE materials experiments will also be presented. MISSE 7 samples are: more deionized-water-sealed anodized aluminum, including Photofoil(tm); indium tin oxide (ITO) over-coated Kapton(tm) used as thermo-optical surfaces; mechanically scribed tin-plated beryllium-copper samples for "tin pest" growth (alpha/beta transformation); and Beta Cloth backed with a black coating rather than aluminization. MISSE 8 samples are: exposed "scrim cloth" (fiberglass weave) from the ISS solar array wing material, protective fiberglass tapes and sleeve materials, and optical witness samples to monitor contamination.
NASA Technical Reports Server (NTRS)
Finckenor, Miria
2006-01-01
Marshall Space Flight Center worked with the Air Force Research Laboratory, Naval Research Laboratory, Langley Research Center, Glenn Research Center, the Jet Propulsion Laboratory, Johnson Space Center, Boeing, Lockheed Martin, TRW, the Aerospace Corporation, Triton Systems, AZ Technology, Alion (formerly IITRI), ENTECH, and L'Garde to bring together the first external materials exposure experiment on the International Space Station (ISS). MISSE re-uses hardware from the Mir Environmental Effects Payload (MEEP) flown on the Russian space station Mir. MISSE has returned a treasure trove of materials data that will be useful not only for ISS but also for programs as diverse as the new Crew Exploration Vehicle, the James Webb Telescope, the Lunar Surface Access Module, the Robotic Lunar Exploration Program, High Altitude Airships, and solar sails. MISSE-1 and -2 (Figure 1) were attached to the Quest airlock on ISS for 4 years and were retrieved during STS-114. MISSE-3 and -4 were bumped from STS-114 and are currently slated for deployment during STS-121. MISSE-5 (Figure 2) was deployed during STS-114.
Missing persons-missing data: the need to collect antemortem dental records of missing persons.
Blau, Soren; Hill, Anthony; Briggs, Christopher A; Cordner, Stephen M
2006-03-01
The subject of missing persons is of great concern to the community, with numerous associated emotional, financial, and health costs. This paper examines the forensic medical issues raised by the delayed identification of individuals classified as "missing" and highlights the importance of including dental data in the investigation of missing persons. Focusing on Australia, the current approaches employed in missing persons investigations are outlined. Of particular significance is the fact that each of the eight Australian states and territories has its own Missing Persons Unit that operates within distinct state and territory legislation. Consequently, there is a lack of uniformity within Australia about the legal and procedural framework within which investigations of missing persons are conducted, and the interaction of that framework with coronial law procedures. One of the main problems in missing persons investigations is the lack of forensic medical, particularly odontological, input. Forensic odontology has been employed in numerous cases in Australia where identity is unknown or uncertain because of remains being skeletonized, incinerated, or partly burnt. The routine employment of forensic odontologists to assist in missing person inquiries has, however, been ignored. The failure to routinely employ forensic odontology in missing persons inquiries has resulted in numerous delays in identification. Three Australian cases are presented in which the investigation of individuals whose identity was uncertain or unknown was prolonged due to the failure to utilize the appropriate (and available) dental resources. In light of the outcomes of these cases, we suggest that a national missing persons dental records database be established for future missing persons investigations. Such a database could be easily managed between a coronial system and a forensic medical institute. In Australia, a national missing persons dental records database could be incorporated into the National Coroners Information System (NCIS) managed, on behalf of Australia's Coroners, by the Victorian Institute of Forensic Medicine. The existence of the NCIS would ensure operational collaboration in the implementation of the system and cost savings to Australian policing agencies involved in missing person inquiries. The implementation of such a database would facilitate timely and efficient reconciliation of clinical and postmortem dental records and have subsequent social and financial benefits.
Identifying research priorities for effective retention strategies in clinical trials.
Kearney, Anna; Daykin, Anne; Shaw, Alison R G; Lane, Athene J; Blazeby, Jane M; Clarke, Mike; Williamson, Paula; Gamble, Carrol
2017-08-31
The failure to retain patients or collect primary-outcome data is a common challenge for trials; it reduces statistical power and potentially introduces bias into the analysis. Identifying strategies to minimise missing data was the second highest methodological research priority in a Delphi survey of the Directors of UK Clinical Trial Units (CTUs) and is important to minimise waste in research. Our aim was to assess the current retention practices within the UK and priorities for future research to evaluate the effectiveness of strategies to reduce attrition. Seventy-five chief investigators of NIHR Health Technology Assessment (HTA)-funded trials starting between 2009 and 2012 were surveyed to elicit their awareness of causes of missing data within their trials and recommended practices for improving retention. Forty-seven CTUs registered within the UKCRC network were surveyed separately to identify approaches and strategies being used to mitigate missing data across trials. Responses from the current practice surveys were used to inform a subsequent two-round Delphi survey with registered CTUs. A consensus list of retention research strategies was produced and ranked by priority. Fifty of seventy-five (67%) chief investigators and 33/47 (70%) registered CTUs completed the current practice surveys. Seventy-eight percent of trialists were aware of retention challenges and implemented strategies at trial design. Patient-initiated withdrawal was the most common cause of missing data. Registered CTUs routinely used newsletters, timelines of participant visits, and telephone reminders to mitigate missing data. Whilst 36 of the 59 strategies presented had been formally or informally evaluated, some frequently used strategies, such as site initiation training, have had no research to inform practice. Thirty-five registered CTUs (74%) participated in the Delphi survey. Research into the effectiveness of site initiation training, frequency of patient contact during a trial, the use of routinely collected data, the frequency and timing of reminders, triggered site training, and the time needed to complete questionnaires was deemed critical. Research into the effectiveness of Christmas cards for site staff was not deemed critically important. The surveys of current practices demonstrate that a variety of strategies are being used to mitigate missing data but with little evidence to support their use. Six retention strategies were deemed critically important within the Delphi survey and should be a primary focus of future retention research.
Attrition in Developmental Psychology: A Review of Modern Missing Data Reporting and Practices
ERIC Educational Resources Information Center
Nicholson, Jody S.; Deboeck, Pascal R.; Howard, Waylon
2017-01-01
Inherent in applied developmental sciences is the threat to validity and generalizability due to missing data as a result of participant drop-out. The current paper provides an overview of how attrition should be reported, which tests can examine the potential of bias due to attrition (e.g., t-tests, logistic regression, Little's MCAR test,…
ERIC Educational Resources Information Center
Petonito, Gina; Muschert, Glenn W.; Carr, Dawn C.; Kinney, Jennifer M.; Robbins, Emily J.; Brown, J. Scott
2013-01-01
As America ages, greater numbers of older adults will be living with Alzheimer's disease or a related dementia, leading to increased incidence of wandering. Currently there are several initiatives to assist older adults who go missing. We describe and critically examine three prominent and widespread programs: Safe Return, Project Lifesaver, and…
A bootstrap lunar base: Preliminary design review 2
NASA Technical Reports Server (NTRS)
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Spheres, charges, instantons, and bootstrap: A five-dimensional odyssey
NASA Astrophysics Data System (ADS)
Chang, Chi-Ming; Fluder, Martin; Lin, Ying-Hsuan; Wang, Yifan
2018-03-01
We combine supersymmetric localization and the conformal bootstrap to study five-dimensional superconformal field theories. To begin, we classify the admissible counter-terms and derive a general relation between the five-sphere partition function and the conformal and flavor central charges. Along the way, we discover a new superconformal anomaly in five dimensions. We then propose a precise triple factorization formula for the five-sphere partition function, that incorporates instantons and is consistent with flavor symmetry enhancement. We numerically evaluate the central charges for the rank-one Seiberg and Morrison-Seiberg theories, and find strong evidence for their saturation of bootstrap bounds, thereby determining the spectra of long multiplets in these theories. Lastly, our results provide new evidence for the F-theorem and possibly a C-theorem in five-dimensional superconformal theories.
Atreja, Ashish; Aggarwal, Ashish; Licata, Angelo A; Lashner, Bret A
2012-01-01
Patients with inflammatory bowel disease (IBD) are at high risk of developing osteoporosis. Our objective was to determine the usefulness of IBD guidelines in identifying patients at risk for developing osteoporosis. We utilized an institutional repository to identify patients seen in the IBD center and extracted data on demographics, disease history, and conventional and nonconventional risk factors for osteoporosis, as well as Dual Energy X-ray Absorptiometry (DXA) findings. 59% of patients (1004/1703) in our IBD cohort had at least one risk factor for osteoporosis screening. DXA was documented in 263 patients with an indication for screening (provider adherence, 26.2%), and of these, 196 patients had DXA completed ("at-risk" group). Ninety-five patients not meeting guidelines-based risk factors also had DXA completed ("not at-risk" group). 139 (70.9%) patients in the "at-risk" group had low BMD, while 51 (53.7%) of "not at-risk" patients had low BMD. The majority of patients with osteoporosis (83.3%) missed by the current guidelines had low BMI. Multivariate logistic regression analysis showed that low BMI was the strongest risk factor for osteoporosis (OR 3.07; 95% CI, 1.47-6.42; P = 0.003). Provider adherence to current guidelines is suboptimal. Low BMI can identify the majority of patients with osteoporosis who are missed by current guidelines.
Peterson, Josh F.; Eden, Svetlana K.; Moons, Karel G.; Ikizler, T. Alp; Matheny, Michael E.
2013-01-01
Summary Background and objectives Baseline creatinine (BCr) is frequently missing in AKI studies. Common surrogate estimates can misclassify AKI and adversely affect the study of related outcomes. This study examined whether multiple imputation improved accuracy of estimating missing BCr beyond current recommendations to apply assumed estimated GFR (eGFR) of 75 ml/min per 1.73 m2 (eGFR 75). Design, setting, participants, & measurements From 41,114 unique adult admissions (13,003 with and 28,111 without BCr data) at Vanderbilt University Hospital between 2006 and 2008, a propensity score model was developed to predict likelihood of missing BCr. Propensity scoring identified 6502 patients with highest likelihood of missing BCr among 13,003 patients with known BCr to simulate a “missing” data scenario while preserving actual reference BCr. Within this cohort (n=6502), the ability of various multiple-imputation approaches to estimate BCr and classify AKI were compared with that of eGFR 75. Results All multiple-imputation methods except the basic one more closely approximated actual BCr than did eGFR 75. Total AKI misclassification was lower with multiple imputation (full multiple imputation + serum creatinine) (9.0%) than with eGFR 75 (12.3%; P<0.001). Improvements in misclassification were greater in patients with impaired kidney function (full multiple imputation + serum creatinine) (15.3%) versus eGFR 75 (40.5%; P<0.001). Multiple imputation improved specificity and positive predictive value for detecting AKI at the expense of modestly decreasing sensitivity relative to eGFR 75. Conclusions Multiple imputation can improve accuracy in estimating missing BCr and reduce misclassification of AKI beyond currently proposed methods. PMID:23037980
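A generic multiple-imputation loop in this spirit might look like the following; scikit-learn's IterativeImputer with posterior sampling stands in for the paper's imputation model, and the age/creatinine data and the simple pooling rule are fabricated for illustration.

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(5)
    n = 500
    age = rng.normal(60, 15, n)
    # Hypothetical baseline creatinine related to age, with ~30% set missing.
    bcr = 0.8 + 0.005 * age + rng.normal(0, 0.15, n)
    bcr_obs = bcr.copy()
    bcr_obs[rng.random(n) < 0.3] = np.nan

    X = np.column_stack([age, bcr_obs])
    # Multiple imputation: several stochastic imputations, each analyzed
    # separately, then pooled (here we simply pool the mean across imputations).
    estimates = []
    for m in range(5):
        imp = IterativeImputer(sample_posterior=True, random_state=m)
        completed = imp.fit_transform(X)
        estimates.append(completed[:, 1].mean())
    print("pooled mean baseline creatinine:", np.mean(estimates))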
ERIC Educational Resources Information Center
Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio
2009-01-01
A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
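A minimal sketch of the pairs bootstrap for Pearson's r: resample (x, y) pairs together, recompute r, and read the hypothesis test off the percentile interval. The simulated scores below are hypothetical.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # Hypothetical paired scores: expectancies vs. intentions for 200 students.
    x = rng.normal(size=200)
    y = 0.3 * x + rng.normal(size=200)

    # Nonparametric bootstrap: resample (x, y) pairs jointly, recompute r each time.
    B = 5000
    idx = rng.integers(0, x.size, size=(B, x.size))
    rs = np.array([stats.pearsonr(x[i], y[i])[0] for i in idx])
    lo, hi = np.percentile(rs, [2.5, 97.5])
    # The null of no association is rejected at the 5% level if the CI excludes 0.
    print(f"r = {stats.pearsonr(x, y)[0]:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")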
Bootstrapping a five-loop amplitude using Steinmann relations
Caron-Huot, Simon; Dixon, Lance J.; McLeod, Andrew; ...
2016-12-05
Here, the analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons
2001-07-01
parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. ...experimental methods used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric ...triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for
ERIC Educational Resources Information Center
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
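The parametric-bootstrap idea for standard errors can be shown generically; the normal model and the 75th-percentile statistic below stand in for the fitted item response model and equating function, so this is a sketch of the principle rather than of kernel equating itself.

    import numpy as np

    rng = np.random.default_rng(13)
    scores = rng.normal(50, 10, size=300)      # hypothetical test-score sample

    # Parametric bootstrap SE: fit a parametric model to the data, simulate new
    # samples from the fitted model, re-estimate, and take the SD of the estimates.
    mu, sd = np.mean(scores), np.std(scores, ddof=1)
    def stat(s):
        return np.percentile(s, 75)            # stand-in for an equating-function value
    boot = [stat(rng.normal(mu, sd, scores.size)) for _ in range(2000)]
    print("estimate:", stat(scores), "parametric bootstrap SE:", np.std(boot, ddof=1))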
ERIC Educational Resources Information Center
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
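The graph-based reading of connection probability can be illustrated with a toy grid; here edge weights are -log(probability), so a shortest path maximizes the product of edge probabilities. The networkx grid and the randomly drawn "bootstrap-derived" edge probabilities are assumptions of this sketch, not the BootGraph implementation.

    import networkx as nx
    import numpy as np

    rng = np.random.default_rng(12)
    # Toy 2D "image": one vertex per voxel, edge weights standing in for the
    # bootstrap-derived probability that a fiber crosses between neighbors.
    G = nx.grid_2d_graph(10, 10)
    for u, v in G.edges:
        G.edges[u, v]["w"] = -np.log(rng.uniform(0.1, 1.0))  # -log(prob), so sums multiply

    # Shortest-path variant: the connection probability between two voxels is
    # the product of edge probabilities along the minimum -log-probability path.
    length = nx.shortest_path_length(G, (0, 0), (9, 9), weight="w")
    print("connection probability:", np.exp(-length))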
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought related variables. (2) The resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters, and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
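A pairs bootstrap for a copula parameter is easy to sketch if the Clayton parameter is fitted by inverting Kendall's tau (for the Clayton copula, tau = theta / (theta + 2)); the drought duration/severity data below are simulated, and the study's likelihood-based fitting and MCMC arm are not reproduced here.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Hypothetical drought (duration, severity) pairs with positive dependence.
    dur = rng.gamma(2.0, 3.0, size=100)
    sev = 0.5 * dur + rng.gamma(2.0, 1.0, size=100)

    def clayton_theta(u, v):
        # Moment-style fit: invert Kendall's tau, tau = theta / (theta + 2).
        tau, _ = stats.kendalltau(u, v)
        return 2.0 * tau / (1.0 - tau)

    # Pairs bootstrap CI for the copula parameter.
    B = 2000
    thetas = []
    for _ in range(B):
        i = rng.integers(0, dur.size, dur.size)
        thetas.append(clayton_theta(dur[i], sev[i]))
    lo, hi = np.percentile(thetas, [2.5, 97.5])
    print(f"theta = {clayton_theta(dur, sev):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")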
Missing-value estimation using linear and non-linear regression with Bayesian gene selection.
Zhou, Xiaobo; Wang, Xiaodong; Dougherty, Edward R
2003-11-22
Data from microarray experiments are usually in the form of large matrices of expression levels of genes under different experimental conditions. For various reasons, values are frequently missing. Estimating these missing values is important because they affect downstream analysis, such as clustering, classification, and network design. Several methods of missing-value estimation are in use. The problem has two parts: (1) selection of genes for estimation and (2) design of an estimation rule. We propose Bayesian variable selection to obtain genes to be used for estimation, and employ both linear and nonlinear regression for the estimation rule itself. Fast implementation issues for these methods are discussed, including the use of QR decomposition for parameter estimation. The proposed methods are tested on data sets arising from hereditary breast cancer and small round blue-cell tumors. The results compare very favorably with currently used methods based on the normalized root-mean-square error. The appendix is available from http://gspsnap.tamu.edu/gspweb/zxb/missing_zxb/ (user: gspweb; passwd: gsplab).
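The two-part structure, gene selection followed by an estimation rule, can be sketched with correlation ranking standing in for Bayesian variable selection and ordinary least squares as the linear rule; the matrix dimensions, the missing entry, and the choice of five predictor genes are all assumptions of this sketch.

    import numpy as np
    from numpy.linalg import lstsq

    rng = np.random.default_rng(8)
    G, C = 200, 20                      # genes x conditions (hypothetical matrix)
    X = rng.normal(size=(G, C))
    target, missing_col = 0, 5          # gene 0 is missing under condition 5

    # Part 1, gene selection: rank candidate genes by |correlation| with the
    # target over the observed conditions (a simple stand-in for Bayesian
    # variable selection), and keep the top five.
    obs = [c for c in range(C) if c != missing_col]
    cors = [abs(np.corrcoef(X[g, obs], X[target, obs])[0, 1]) for g in range(1, G)]
    top = 1 + np.argsort(cors)[::-1][:5]

    # Part 2, estimation rule: linear regression of the target gene on the
    # selected genes, then predict the missing entry.
    A = np.column_stack([X[g, obs] for g in top] + [np.ones(len(obs))])
    coef, *_ = lstsq(A, X[target, obs], rcond=None)
    a_new = np.append([X[g, missing_col] for g in top], 1.0)
    print("estimated missing value:", a_new @ coef)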
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the intention-to-treat (ITT) principle to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing-data mechanisms for all clinical trials because the body's reactions to drugs are complicated by complex biological networks, leading to data that are missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
ERIC Educational Resources Information Center
Root, Jenny Rose
2016-01-01
The current study evaluated the effects of modified schema-based instruction (SBI) on the algebra problem solving skills of three middle school students with autism spectrum disorder and moderate intellectual disability (ASD/ID). Participants learned to solve two types of group word problems: missing-whole and missing-part. The themes of the word…
Reliability of dose volume constraint inference from clinical data.
Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M
2017-04-21
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
Reliability of dose volume constraint inference from clinical data
NASA Astrophysics Data System (ADS)
Lutz, C. M.; Møller, D. S.; Hoffmann, L.; Knap, M. M.; Alber, M.
2017-04-01
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an ‘ideal’ cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a ‘non-ideal’ cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85 % were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
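The experiment's inference loop can be caricatured as follows: resample the cohort, fit a one-feature logistic model per candidate dose level, and count how often each level wins on likelihood. The dose grid, the simulated V_d features, and the scoring rule are assumptions of this sketch, not the authors' exact protocol.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(9)
    n = 102
    doses = np.arange(5, 45, 5)               # candidate DVHP dose levels in Gy
    # Hypothetical V_d features (volume receiving >= d Gy), correlated across d,
    # with complication risk actually driven by the level at index 3.
    base = rng.normal(50, 15, n)
    V = np.clip(base[:, None] - np.arange(doses.size) * 5
                + rng.normal(0, 3, (n, doses.size)), 0, 100)
    p = 1 / (1 + np.exp(-(V[:, 3] - 35) / 5))
    y = rng.random(n) < p

    counts = np.zeros(doses.size, dtype=int)
    for _ in range(200):
        i = rng.integers(0, n, n)
        if y[i].all() or not y[i].any():
            continue                           # skip degenerate resamples
        ll = []
        for d in range(doses.size):
            m = LogisticRegression().fit(V[i, d:d + 1], y[i])
            prob = m.predict_proba(V[i, d:d + 1])[:, 1]
            ll.append(np.sum(np.log(np.where(y[i], prob, 1 - prob))))
        counts[int(np.argmax(ll))] += 1        # dose level inferred this replicate
    print(dict(zip(doses.tolist(), counts.tolist())))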
Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi
2015-07-01
Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE) that is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.
Heptagons from the Steinmann cluster bootstrap
Dixon, Lance J.; Drummond, James; Harrington, Thomas; ...
2017-02-28
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar $\mathcal{N} = 4$ supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal $\bar{Q}$ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher-quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
Imaging with New Classic and Vision at the NPOI
NASA Astrophysics Data System (ADS)
Jorgensen, Anders
2018-04-01
The Navy Precision Optical Interferometer (NPOI) is unique among interferometric observatories for its ability to position telescopes in an equally-spaced array configuration. This configuration is optimal for interferometric imaging because it allows the use of bootstrapping to track fringes on long baselines with signal-to-noise ratio less than one. When combined with coherent integration techniques, this can produce visibilities with acceptable SNR on baselines long enough to resolve features on the surfaces of stars. The stellar surface imaging project at NPOI combines the bootstrapping array configuration, real-time fringe tracking, and baseline- and wavelength-bootstrapping with Earth rotation to provide dense coverage of the UV plane at a wide range of spatial frequencies. In this presentation, we provide an overview of the project and an update on its latest status and results.
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Sample Reuse in Statistical Remodeling.
1987-08-01
as the jackknife and bootstrap, is an expansion of the functional, T(F_n), or of its distribution function, or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report, Frangos and Schucany demonstrated the small-sample superiority of that approach over the proposals that take...higher-order terms of an Edgeworth expansion into account. In a second report, Frangos and Schucany (1987b) examined the small-sample performance of
ERIC Educational Resources Information Center
Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao
2016-01-01
We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…
Allahbadia, Gautam N
2002-09-01
The epidemic of gender selection is ravaging countries like India & China. Approximately fifty million women are "missing" in the Indian population. Generally, three principal causes are given: female infanticide, better food and health care for boys, and maternal death at childbirth. Prenatal sex determination and the abortion of female fetuses threaten to skew the sex ratio to new highs. Estimates of the number of female fetuses being destroyed every year in India vary from two million to five million. This review from India attempts to summarize all the currently available methods of sex selection and also highlights current medical practice regarding the subject in south-east Asia.
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Guerrero, Erick G; Fenwick, Karissa; Kong, Yinfei
2017-11-14
Leadership style and specific organizational climates have emerged as critical mechanisms to implement targeted practices in organizations. Drawing from relevant theories, we propose that climate for implementation of cultural competence reflects how transformational leadership may enhance the organizational implementation of culturally responsive practices in health care organizations. Using multilevel data from 427 employees embedded in 112 addiction treatment programs collected in 2013, confirmatory factor analysis showed adequate fit statistics for our measure of climate for implementation of cultural competence (Cronbach's alpha = .88) and three outcomes: knowledge (Cronbach's alpha = .88), services (Cronbach's alpha = .86), and personnel (Cronbach's alpha = .86) practices. Results from multilevel path analyses indicate a positive relationship between employee perceptions of transformational leadership and climate for implementation of cultural competence (standardized indirect effect = .057, bootstrap p < .001). We also found a positive indirect effect between transformational leadership and each of the culturally competent practices: knowledge (standardized indirect effect = .006, bootstrap p = .004), services (standardized indirect effect = .019, bootstrap p < .001), and personnel (standardized indirect effect = .014, bootstrap p = .005). Findings contribute to implementation science. They build on leadership theory and offer evidence of the mediating role of climate in the implementation of cultural competence in addiction health service organizations.
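A minimal sketch of the core computation behind such mediation results, a percentile bootstrap for the indirect effect a*b, using synthetic single-level data (the study itself fits multilevel path models; every variable name and number below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-level stand-in for a leadership -> climate -> practice chain
# (illustrative only; the study's multilevel clustering is ignored here).
n = 400
leadership = rng.normal(size=n)
climate = 0.4 * leadership + rng.normal(size=n)                    # mediator
practice = 0.3 * climate + 0.1 * leadership + rng.normal(size=n)   # outcome

def indirect_effect(x, m, y):
    """a*b estimate from two OLS fits: m ~ x, then y ~ x + m."""
    a = np.polyfit(x, m, 1)[0]                       # slope of m on x
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]      # coefficient on m
    return a * b

boot = np.empty(5000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)                      # resample cases
    boot[i] = indirect_effect(leadership[idx], climate[idx], practice[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(leadership, climate, practice):.3f}, "
      f"95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
```

The percentile interval is the simplest variant; bias-corrected versions shift the percentiles according to the fraction of bootstrap estimates falling below the point estimate.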
Nixon, Richard M; Wonderling, David; Grieve, Richard D
2010-03-01
Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
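A rough sketch of the comparison being made, on simulated skewed cost data (the distributions and the willingness-to-pay value are assumptions, not the paper's): compute the CLT standard error of the incremental net benefit alongside a non-parametric bootstrap standard error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Skewed costs (lognormal) and effects for two arms; INB = lambda*dE - dC.
wtp = 20000.0
n = 100
cost_t, cost_c = rng.lognormal(8, 1.2, n), rng.lognormal(8, 1.2, n)
eff_t, eff_c = rng.normal(0.7, 0.2, n), rng.normal(0.6, 0.2, n)
inb_t = wtp * eff_t - cost_t   # per-patient net benefit, treatment arm
inb_c = wtp * eff_c - cost_c   # control arm

# CLT standard error of the difference in mean net benefit
se_clt = np.sqrt(inb_t.var(ddof=1) / n + inb_c.var(ddof=1) / n)

# Non-parametric bootstrap SE: resample each arm independently
boot = np.empty(2000)
for i in range(boot.size):
    boot[i] = rng.choice(inb_t, n).mean() - rng.choice(inb_c, n).mean()

print(f"INB = {inb_t.mean() - inb_c.mean():.0f}, "
      f"SE(CLT) = {se_clt:.0f}, SE(bootstrap) = {boot.std(ddof=1):.0f}")
```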
Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan
2016-01-01
Background The sensory belief that ‘light/low tar’ cigarettes are smoother can also influence the belief that ‘light/low tar’ cigarettes are less harmful. However, the ‘light’ concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one’s own brand of cigarettes on perceptions of harm. Objective The current study examines whether a smoker’s sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a ‘light/low tar’ cigarette and relative perceptions of harm among smokers in China. Methods Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Results Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of ‘light/low tar’ cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. Conclusions These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker’s natural associations between smoother sensations and lowered perceptions of harm. PMID:25370698
Missing data in FFQs: making assumptions about item non-response.
Lamb, Karen E; Olstad, Dana Lee; Nguyen, Cattram; Milte, Catherine; McNaughton, Sarah A
2017-04-01
FFQs are a popular method of capturing dietary information in epidemiological studies and may be used to derive dietary exposures such as nutrient intake or overall dietary patterns and diet quality. As FFQs can involve large numbers of questions, participants may fail to respond to all questions, leaving researchers to decide how to deal with missing data when deriving intake measures. The aim of the present commentary is to discuss the current practice for dealing with item non-response in FFQs and to propose a research agenda for reporting and handling missing data in FFQs. Single imputation techniques, such as zero imputation (assuming no consumption of the item) or mean imputation, are commonly used to deal with item non-response in FFQs. However, single imputation methods make strong assumptions about the missing data mechanism and do not reflect the uncertainty created by the missing data. This can lead to incorrect inference about associations between diet and health outcomes. Although the use of multiple imputation methods in epidemiology has increased, these have seldom been used in the field of nutritional epidemiology to address missing data in FFQs. We discuss methods for dealing with item non-response in FFQs, highlighting the assumptions made under each approach. Researchers analysing FFQs should ensure that missing data are handled appropriately and clearly report how missing data were treated in analyses. Simulation studies are required to enable systematic evaluation of the utility of various methods for handling item non-response in FFQs under different assumptions about the missing data mechanism.
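A small illustrative demonstration of the commentary's central point, on synthetic intake data (the Poisson model and missingness rate are assumptions): zero imputation shifts the mean downward, while mean imputation preserves the mean but artificially shrinks the variance, understating uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)

# Weekly intake frequency for one FFQ item; ~15% of responses missing.
true = rng.poisson(3.0, 1000).astype(float)
obs = true.copy()
miss = rng.random(obs.size) < 0.15
obs[miss] = np.nan

complete = obs[~np.isnan(obs)]
zero_imp = np.where(np.isnan(obs), 0.0, obs)               # assume no consumption
mean_imp = np.where(np.isnan(obs), complete.mean(), obs)   # mean fill-in

for name, x in [("true", true), ("zero", zero_imp), ("mean", mean_imp)]:
    print(f"{name:>5}: mean={x.mean():.2f}  sd={x.std(ddof=1):.2f}")
# Zero imputation biases the mean downward; mean imputation keeps the mean
# but deflates the variance, which is exactly the uncertainty that multiple
# imputation is designed to preserve.
```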
Miss-distance indicator for tank main guns
NASA Astrophysics Data System (ADS)
Bornstein, Jonathan A.; Hillis, David B.
1996-06-01
Tank main gun systems must possess extremely high levels of accuracy to perform successfully in battle. Under some circumstances, the first round fired in an engagement may miss the intended target, and it becomes necessary to rapidly correct fire. A breadboard automatic miss-distance indicator system was previously developed to assist in this process. The system, which would be mounted on a 'wingman' tank, consists of a charge-coupled device (CCD) camera and computer-based image-processing system, coupled with a separate infrared sensor to detect muzzle flash. For the system to be successfully employed with current-generation tanks, it must be reliable, be relatively low cost, and respond rapidly while maintaining current firing rates. Recently, the original indicator system was developed further in an effort to achieve these goals. Efforts have focused primarily upon enhanced image-processing algorithms, both to improve system reliability and to reduce processing requirements. Intelligent application of newly refined trajectory models has permitted examination of reduced areas of interest and enhanced rejection of false alarms, significantly improving system performance.
Maternal depression and trait anger as risk factors for escalated physical discipline.
Shay, Nicole L; Knutson, John F
2008-02-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory-Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline.
Some Aspects of Advanced Tokamak Modeling in DIII-D
NASA Astrophysics Data System (ADS)
St John, H. E.; Petty, C. C.; Murakami, M.; Kinsey, J. E.
2000-10-01
We extend previous work (M. Murakami, et al., General Atomics Report GA-A23310 (1999)) on time-dependent DIII-D advanced tokamak simulations by introducing theoretical confinement models rather than relying on power-balance-derived transport coefficients. We explore using NBCD and off-axis ECCD, together with a self-consistently aligned bootstrap current driven by the internal transport barrier dynamics generated with the GLF23 confinement model, to shape the hollow current profile and to maintain MHD-stable conditions. Our theoretical modeling approach uses measured DIII-D initial conditions to start the simulations in a smooth, consistent manner. This mitigates the troublesome long-lived perturbations in the ohmic current profile that are normally caused by inconsistent initial data. To achieve this goal, our simulation uses a sequence of time-dependent eqdsks generated autonomously by the EFIT MHD equilibrium code in analyzing experimental data to supply the history for the simulation.
BOOTSTRAPPING THE CORONAL MAGNETIC FIELD WITH STEREO: UNIPOLAR POTENTIAL FIELD MODELING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aschwanden, Markus J.; Sandman, Anne W., E-mail: aschwanden@lmsal.co
We investigate the recently quantified misalignment of α_mis ≈ 20°-40° between the three-dimensional geometry of stereoscopically triangulated coronal loops observed with STEREO/EUVI (in four active regions (ARs)) and theoretical (potential or nonlinear force-free) magnetic field models extrapolated from photospheric magnetograms. We develop an efficient method of bootstrapping the coronal magnetic field by forward fitting a parameterized potential field model to the STEREO-observed loops. The potential field model consists of a number of unipolar magnetic charges that are parameterized by decomposing a photospheric magnetogram from the Michelson Doppler Imager. The forward-fitting method yields a best-fit magnetic field model with a reduced misalignment of α_PF ≈ 13°-20°. We also evaluate stereoscopic measurement errors and find a contribution of α_SE ≈ 7°-12°, which constrains the residual misalignment to α_NP ≈ 11°-17°, likely due to the nonpotentiality of the ARs. The residual misalignment angle α_NP of the potential field due to nonpotentiality is found to correlate with the soft X-ray flux of the AR, which implies a relationship between electric currents and plasma heating.
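A toy sketch of the fit criterion such forward fitting minimizes, the angle between the model field vector and the observed loop tangent, with randomly generated vectors standing in for real loops and field models (everything here is illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(14)

def misalignment_deg(b, t):
    """Angle (deg) between field vector b and loop tangent t at one point;
    field polarity is irrelevant, hence the absolute value."""
    c = abs(b @ t) / (np.linalg.norm(b) * np.linalg.norm(t))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

# Toy data: unit tangents along observed loops, and model field vectors
# that deviate from them by random amounts.
tangents = rng.normal(size=(500, 3))
tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
fields = tangents + 0.4 * rng.normal(size=(500, 3))   # imperfect model

angles = [misalignment_deg(b, t) for b, t in zip(fields, tangents)]
print(f"median misalignment = {np.median(angles):.1f} deg")
```

A forward fit would adjust the unipolar charge parameters so as to minimize this median angle over all loop points.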
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-11-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
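A hedged sketch of the linear TSRI estimator with pairs-bootstrap standard errors, on a simulated instrument/exposure/outcome triple (all data-generating numbers are illustrative, and this is the plain bootstrap rather than the paper's Newey or Terza corrections):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated Mendelian randomization set-up: genotype z instruments exposure x.
n = 2000
z = rng.binomial(2, 0.3, n).astype(float)    # instrument (allele count)
u = rng.normal(size=n)                       # unobserved confounder
x = 0.5 * z + u + rng.normal(size=n)         # exposure
y = 0.2 * x + u + rng.normal(size=n)         # outcome; true causal effect 0.2

def tsri(z, x, y):
    """Two-stage residual inclusion, linear case."""
    # Stage 1: regress exposure on instrument, keep residuals.
    g, c = np.polyfit(z, x, 1)
    r = x - (g * z + c)
    # Stage 2: regress outcome on exposure plus first-stage residuals.
    X = np.column_stack([np.ones_like(z), x, r])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]   # coefficient on x

boot = np.empty(1000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)                      # resample individuals
    boot[i] = tsri(z[idx], x[idx], y[idx])

print(f"TSRI estimate = {tsri(z, x, y):.3f}, "
      f"bootstrap SE = {boot.std(ddof=1):.3f}")
```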
Qian, Jinping P.; Garofalo, Andrea M.; Gong, Xianzu Z.; ...
2017-03-20
Recent EAST/DIII-D joint experiments on the high poloidal betamore » $${{\\beta}_{\\text{P}}}$$ regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H 98y2 ~ 1.6) to higher plasma current, for lower q 95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results. Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high $${{\\beta}_{\\text{P}}}$$ discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at $${{\\beta}_{\\text{N}}}$$ ~ 2.9 and q 95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Furthermore, results reported in this paper suggest that the DIII-D high $${{\\beta}_{\\text{P}}}$$ scenario could be a candidate for ITER steady state operation.« less
Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter
2011-04-13
The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
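A short illustration of the recommended analysis, assuming two synthetic groups of a bounded count outcome: the Welch U test together with its matching confidence interval for the difference between means.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Two groups of a discrete numerical outcome (e.g. number of events, 0-5).
a = rng.choice([0, 1, 2, 3, 4, 5], size=60, p=[.35, .3, .15, .1, .05, .05])
b = rng.choice([0, 1, 2, 3, 4, 5], size=60, p=[.2, .25, .2, .15, .1, .1])

# Welch U test: the T test without the equal-variance assumption.
t, p = stats.ttest_ind(a, b, equal_var=False)

# Matching 95% CI for the difference between means (Welch degrees of freedom).
va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
df = (va + vb) ** 2 / (va ** 2 / (a.size - 1) + vb ** 2 / (b.size - 1))
half = stats.t.ppf(0.975, df) * np.sqrt(va + vb)
d = a.mean() - b.mean()
print(f"t = {t:.2f}, p = {p:.3f}, diff = {d:.2f}, "
      f"95% CI = [{d - half:.2f}, {d + half:.2f}]")
```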
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Education and the Workforce.
These hearings transcripts compile testimony regarding how programs authorized by the Runaway and Homeless Youth Act and the Missing Children's Assistance Act currently operate, in preparation for upcoming reauthorization. Opening statements by U.S. Representatives Peter Hoekstra (Michigan) and Ruben Hinojosa (Texas) underscore the obligation to…
ERIC Educational Resources Information Center
Longford, Nicholas T.
This study is a critical evaluation of the roles for coding and scoring of missing responses to multiple-choice items in educational tests. The focus is on tests in which the test-takers have little or no motivation; in such tests omitting and not reaching (as classified by the currently adopted operational rules) is quite frequent. Data from the…
The effect of sheared toroidal rotation on pressure driven magnetic islands in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegna, C. C.
2016-05-15
The impact of sheared toroidal rotation on the evolution of pressure driven magnetic islands in tokamak plasmas is investigated using a resistive magnetohydrodynamics model augmented by a neoclassical Ohm's law. Particular attention is paid to the asymptotic matching data as the Mercier indices are altered in the presence of sheared flow. Analysis of the nonlinear island Grad-Shafranov equation shows that sheared flows tend to amplify the stabilizing pressure/curvature contribution to pressure driven islands in toroidal tokamaks relative to the island bootstrap current contribution. As such, sheared toroidal rotation tends to reduce saturated magnetic island widths.
Bootstrapping and Maintaining Trust in the Cloud
2016-03-16
of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web ...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data
Reduced Power Laser Designation Systems
2008-06-20
200 kΩ, R1 = R3 = 60 kΩ, and R2 = R4 = 2 kΩ yields an overall transimpedance gain of 200 k × 30 × 30 = 180 MV/A. Figure 3. Three-stage photodiode amplifier ...transistor circuit for bootstrap buffering of the input stage, comparing the noise performance of the candidate amplifier designs, selecting the two...transistor bootstrap design as the circuit of choice, and comparing the performance of this circuit against that of a basic transconductance amplifier
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow the identification of environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results showed that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
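A compact sketch of the bootstrap-based probabilistic sensitivity analysis described: resample patient-level (cost, effect) pairs and trace the probability of cost-effectiveness across willingness-to-pay thresholds, i.e. a cost-effectiveness acceptability curve. All distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Paired (cost, effect) observations per strategy, e.g. from trial data.
n = 150
cost_new = rng.gamma(4, 250, n); eff_new = rng.normal(0.72, 0.15, n)
cost_std = rng.gamma(4, 200, n); eff_std = rng.normal(0.65, 0.15, n)

wtp_grid = np.linspace(0, 50000, 11)   # willingness-to-pay thresholds
b = 2000
prob_ce = np.zeros_like(wtp_grid)
for i in range(b):
    # One bootstrap replicate: resample patients within each strategy.
    dn = rng.integers(0, n, n)
    ds = rng.integers(0, n, n)
    d_cost = cost_new[dn].mean() - cost_std[ds].mean()
    d_eff = eff_new[dn].mean() - eff_std[ds].mean()
    prob_ce += (wtp_grid * d_eff - d_cost) > 0   # INB > 0 at each threshold
prob_ce /= b

for w, pr in zip(wtp_grid, prob_ce):
    print(f"WTP {w:>7.0f}: P(cost-effective) = {pr:.2f}")
```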
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R; Fedonkin, M A
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build-up of the energy flow follows a bootstrapping process similar to that found in the development of computers, the first generation making possible the calculations necessary for constructing the second one, etc. In the biogenetic start-up of an energy flow, non-metals in the lower periods of the Periodic Table of Elements would have constituted the most primitive systems, their operation being enhanced and later supplanted by elements in the higher periods that demand more energy. This bootstrapping process would put the development of the metabolisms based on the second-period elements carbon, nitrogen and oxygen at the end of the evolutionary process rather than at, or even before, the biogenetic event.
Factors associated with study attrition among HIV-infected risky drinkers in St. Petersburg, Russia.
Kiriazova, T; Cheng, D M; Coleman, S M; Blokhina, E; Krupitsky, E; Lira, M C; Bridden, C; Raj, A; Samet, J H
2014-01-01
Participant attrition in HIV longitudinal studies may introduce bias and diminish research quality. The identification of participant characteristics that are predictive of attrition might inform retention strategies. The study aimed to identify factors associated with attrition among HIV-infected Russian risky drinkers from the secondary HIV prevention HERMITAGE trial. We examined whether current injection drug use (IDU), binge drinking, depressive symptoms, HIV status nondisclosure, stigma, and lifetime history of incarceration were predictors of study attrition. We also explored effect modification due to gender. Complete loss to follow-up (LTFU), defined as no follow-up visits after baseline, was the primary outcome, and time to first missed visit was the secondary outcome. We used multiple logistic regression models for the primary analysis, and Cox proportional hazards models for the secondary analysis. Of 660 participants, 101 (15.3%) did not return after baseline. No significant associations between independent variables and complete LTFU were observed. Current IDU and HIV status nondisclosure were significantly associated with time to first missed visit (adjusted hazard ratio [AHR], 1.39; 95% CI, 1.03-1.87; AHR, 1.38; 95% CI, 1.03-1.86, respectively). Gender stratified analyses suggested a larger impact of binge drinking among men and history of incarceration among women with time to first missed visit. Although no factors were significantly associated with complete LTFU, current IDU and HIV status nondisclosure were significantly associated with time to first missed visit in HIV-infected Russian risky drinkers. An understanding of these predictors may inform retention efforts in longitudinal studies.
Missing data in trauma registries: A systematic review.
Shivasabesan, Gowri; Mitra, Biswadev; O'Reilly, Gerard M
2018-03-30
Trauma registries play an integral role in trauma systems but their valid use hinges on data quality. The aim of this study was to determine, among contemporary publications using trauma registry data, the level of reporting of data completeness and the methods used to deal with missing data. A systematic review was conducted of all trauma registry-based manuscripts published from 01 January 2015 to current date (17 March 2017). Studies were identified by searching MEDLINE, EMBASE, and CINAHL using relevant subject headings and keywords. Included manuscripts were evaluated based on previously published recommendations regarding the reporting and discussion of missing data. Manuscripts were graded on their degree of characterization of such observations. In addition, the methods used to manage missing data were examined. There were 539 manuscripts that met inclusion criteria. Among these, 208 (38.6%) manuscripts did not mention data completeness and 88 (16.3%) mentioned missing data but did not quantify the extent. Only a handful (n = 26; 4.8%) quantified the 'missingness' of all variables. Most articles (n = 477; 88.5%) contained no details such as a comparison between patient characteristics in cohorts with and without missing data. Of the 331 articles which made at least some mention of data completeness, the method of managing missing data was unknown in 34 (10.3%). When method(s) to handle missing data were identified, 234 (78.8%) manuscripts used complete case analysis only, 18 (6.1%) used multiple imputation only and 34 (11.4%) used a combination of these. Most manuscripts using trauma registry data did not quantify the extent of missing data for any variables and contained minimal discussion regarding missingness. Out of the studies which identified a method of managing missing data, most used complete case analysis, a method that may bias results. The lack of standardization in the reporting and management of missing data questions the validity of conclusions from research based on trauma registry data. Copyright © 2018 Elsevier Ltd. All rights reserved.
Combining Fourier and lagged k-nearest neighbor imputation for biomedical time series data.
Rahman, Shah Atiqur; Huang, Yuxiao; Claassen, Jan; Heintzman, Nathaniel; Kleinberg, Samantha
2015-12-01
Most clinical and biomedical data contain missing values. A patient's record may be split across multiple institutions, devices may fail, and sensors may not be worn at all times. While these missing values are often ignored, this can lead to bias and error when the data are mined. Further, the data are not simply missing at random. Instead the measurement of a variable such as blood glucose may depend on its prior values as well as those of other variables. These dependencies exist across time as well, but current methods have yet to incorporate these temporal relationships as well as multiple types of missingness. To address this, we propose an imputation method (FLk-NN) that incorporates time-lagged correlations both within and across variables by combining two imputation methods, based on an extension to k-NN and the Fourier transform. This enables imputation of missing values even when all data at a time point are missing and when there are different types of missingness both within and across variables. In comparison to other approaches on three biological datasets (simulated and actual Type 1 diabetes datasets, and multi-modality neurological ICU monitoring) the proposed method has the highest imputation accuracy. This was true for up to half the data being missing and when consecutive missing values are a significant fraction of the overall time series length. Copyright © 2015 Elsevier Inc. All rights reserved.
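A simplified sketch of the lagged k-NN half of this idea (the Fourier component and the paper's exact distance weighting are omitted; all data and parameters are illustrative): impute a missing entry from the k time points whose lagged context is most similar.

```python
import numpy as np

rng = np.random.default_rng(6)

# Multivariate time series (time x variables) with gaps; impute a missing
# entry from the k time points whose *lagged* windows look most alike, so
# temporal structure, not just same-time values, drives the neighbours.
T, V, k, lag = 200, 3, 5, 2
x = np.cumsum(rng.normal(size=(T, V)), axis=0)   # smooth-ish signals
data = x.copy()
data[rng.random((T, V)) < 0.1] = np.nan          # ~10% missing

def impute_lagged_knn(d, t, v):
    """Fill d[t, v] from the k time points with the most similar lagged context."""
    target = d[t - lag:t, :].ravel()
    dists = []
    for s in range(lag, d.shape[0]):
        if s == t or np.isnan(d[s, v]):
            continue
        cand = d[s - lag:s, :].ravel()
        ok = ~np.isnan(target) & ~np.isnan(cand)
        if ok.sum() == 0:
            continue
        dists.append((np.sqrt(np.mean((target[ok] - cand[ok]) ** 2)), d[s, v]))
    dists.sort(key=lambda pair: pair[0])
    return np.mean([val for _, val in dists[:k]])

t, v = next((t, v) for t in range(lag, T) for v in range(V)
            if np.isnan(data[t, v]))
print(f"imputed {impute_lagged_knn(data, t, v):.2f}, true {x[t, v]:.2f}")
```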
Wang, Guoshen; Pan, Yi; Seth, Puja; Song, Ruiguang; Belcher, Lisa
2017-01-01
Missing data create challenges for determining progress made in linking HIV-positive persons to HIV medical care. Statistical methods are not used to address missing program data on linkage. In 2014, 61 health department jurisdictions were funded by Centers for Disease Control and Prevention (CDC) and submitted data on HIV testing, newly diagnosed HIV-positive persons, and linkage to HIV medical care. Missing or unusable data existed in our data set. A new approach using multiple imputation to address missing linkage data was proposed, and results were compared to the current approach that uses data with complete information. There were 12,472 newly diagnosed HIV-positive persons from CDC-funded HIV testing events in 2014. Using multiple imputation, 94.1% (95% confidence interval (CI): [93.7%, 94.6%]) of newly diagnosed persons were referred to HIV medical care, 88.6% (95% CI: [88.0%, 89.1%]) were linked to care within any time frame, and 83.6% (95% CI: [83.0%, 84.3%]) were linked to care within 90 days. Multiple imputation is recommended for addressing missing linkage data in future analyses when the missing percentage is high. The use of multiple imputation for missing values can result in a better understanding of how programs are performing on key HIV testing and HIV service delivery indicators.
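A minimal sketch of multiple imputation for a binary linkage indicator with Rubin's-rules pooling, assuming for simplicity that values are missing completely at random (real program data would impute from covariates; the numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Binary 'linked to care' indicator with ~20% of values missing.
n = 1000
linked = (rng.random(n) < 0.88).astype(float)
linked[rng.random(n) < 0.20] = np.nan

obs = linked[~np.isnan(linked)]
n_mis = int(np.isnan(linked).sum())
m = 50
est, var = np.empty(m), np.empty(m)
for i in range(m):
    # Draw a linkage rate from a Beta posterior, then impute from it.
    p_draw = rng.beta(obs.sum() + 1, (1 - obs).sum() + 1)
    filled = linked.copy()
    filled[np.isnan(filled)] = rng.random(n_mis) < p_draw
    est[i] = filled.mean()
    var[i] = est[i] * (1 - est[i]) / n      # within-imputation variance

q = est.mean()                              # pooled estimate
b = est.var(ddof=1)                         # between-imputation variance
t_var = var.mean() + (1 + 1 / m) * b        # Rubin's total variance
print(f"linked to care: {q:.3f} (SE {np.sqrt(t_var):.3f})")
```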
van Leeuwen, Pim J; Hayen, Andrew; Thompson, James E; Moses, Daniel; Shnier, Ron; Böhm, Maret; Abuodha, Magdaline; Haynes, Anne-Maree; Ting, Francis; Barentsz, Jelle; Roobol, Monique; Vass, Justin; Rasiah, Krishan; Delprado, Warick; Stricker, Phillip D
2017-12-01
To develop and externally validate a predictive model for detection of significant prostate cancer. Development of the model was based on a prospective cohort including 393 men who underwent multiparametric magnetic resonance imaging (mpMRI) before biopsy. External validity of the model was then examined retrospectively in 198 men from a separate institution who underwent mpMRI followed by biopsy for abnormal prostate-specific antigen (PSA) level or digital rectal examination (DRE). A model was developed with age, PSA level, DRE, prostate volume, previous biopsy, and Prostate Imaging Reporting and Data System (PIRADS) score as predictors of significant prostate cancer (Gleason 7 with >5% grade 4, ≥20% cores positive, or ≥7 mm of cancer in any core). Probability was studied via logistic regression. Discriminatory performance was quantified by concordance statistics and internally validated with bootstrap resampling. In all, 393 men had complete data and 149 (37.9%) had significant prostate cancer. While the base model had good accuracy in predicting significant prostate cancer, with an area under the curve (AUC) of 0.80, the advanced model (incorporating mpMRI) had a significantly higher AUC of 0.88 (P < 0.001). The model was well calibrated in internal and external validation. Decision analysis showed that use of the advanced model in practice would improve biopsy outcome predictions. Clinical application of the model would reduce the number of biopsies by 28%, whilst missing 2.6% of significant prostate cancers. Individualised risk assessment of significant prostate cancer using a predictive model that incorporates the mpMRI PIRADS score and clinical data allows a considerable reduction in unnecessary biopsies and a reduction of the risk of over-detection of insignificant prostate cancer, at the cost of a very small increase in the number of significant cancers missed. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
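A small sketch of the internal-validation step mentioned, bootstrap optimism correction of the c statistic (AUC) for a logistic model, on synthetic predictors (all names and coefficients are invented, not the study's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)

# Synthetic stand-in for clinical predictors plus a PIRADS-like score.
n = 400
X = rng.normal(size=(n, 4))
y = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 3] + 0.5 * X[:, 1])))).astype(int)

model = LogisticRegression().fit(X, y)
auc_app = roc_auc_score(y, model.predict_proba(X)[:, 1])   # apparent AUC

optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)                 # bootstrap sample
    mb = LogisticRegression().fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], mb.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, mb.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)        # how much the refit flatters itself

print(f"apparent AUC {auc_app:.3f}, "
      f"optimism-corrected AUC {auc_app - np.mean(optimism):.3f}")
```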
ERIC Educational Resources Information Center
Nastally, Becky L.; Dixon, Mark R.
2012-01-01
In the current study, 3 participants with a history of problem gambling were exposed to computerized slot machine play consisting of outcomes that depicted wins, losses, and near misses (2 out of 3 identical slot machine symbols). Participants were asked to rate each type of outcome in terms of its closeness to a win on a scale of 1 to 10 before…
Beckfield, Jason; Olafsdottir, Sigrun; Sosnaud, Benjamin
2017-01-01
This essay reviews and evaluates recent comparative social science scholarship on healthcare systems. We focus on four of the strongest themes in current research: (1) the development of typologies of healthcare systems, (2) assessment of convergence among healthcare systems, (3) problematization of the shifting boundaries of healthcare systems, and (4) the relationship between healthcare systems and social inequalities. Our discussion seeks to highlight the central debates that animate current scholarship and identify unresolved questions and new opportunities for research. We also identify five currents in contemporary sociology that have not been incorporated as deeply as they might into research on healthcare systems. These five “missed turns” include an emphasis on social relations, culture, postnational theory, institutions, and causal mechanisms. We conclude by highlighting some key challenges for comparative research on healthcare systems. PMID:28769148
Seasonal comparisons of sea ice concentration estimates derived from SSM/I, OKEAN, and RADARSAT data
Belchansky, Gennady I.; Douglas, David C.
2002-01-01
The Special Sensor Microwave Imager (SSM/I) microwave satellite radiometer and its predecessor SMMR are primary sources of information for global sea ice and climate studies. However, comparisons of SSM/I, Landsat, AVHRR, and ERS-1 synthetic aperture radar (SAR) have shown substantial seasonal and regional differences in their estimates of sea ice concentration. To evaluate these differences, we compared SSM/I estimates of sea ice coverage derived with the NASA Team and Bootstrap algorithms to estimates made using RADARSAT, and OKEAN-01 satellite sensor data. The study area included the Barents Sea, Kara Sea, Laptev Sea, and adjacent parts of the Arctic Ocean, during October 1995 through October 1999. Ice concentration estimates from spatially and temporally near-coincident imagery were calculated using independent algorithms for each sensor type. The OKEAN algorithm implemented the satellite's two-channel active (radar) and passive microwave data in a linear mixture model based on the measured values of brightness temperature and radar backscatter. The RADARSAT algorithm utilized a segmentation approach of the measured radar backscatter, and the SSM/I ice concentrations were derived at National Snow and Ice Data Center (NSIDC) using the NASA Team and Bootstrap algorithms. Seasonal and monthly differences between SSM/I, OKEAN, and RADARSAT ice concentrations were calculated and compared. Overall, total sea ice concentration estimates derived independently from near-coincident RADARSAT, OKEAN-01, and SSM/I satellite imagery demonstrated mean differences of less than 5.5% (S.D.<9.5%) during the winter period. Differences between the SSM/I NASA Team and the SSM/I Bootstrap concentrations were no more than 3.1% (S.D.<5.4%) during this period. RADARSAT and OKEAN-01 data both yielded higher total ice concentrations than the NASA Team and the Bootstrap algorithms. The Bootstrap algorithm yielded higher total ice concentrations than the NASA Team algorithm. Total ice concentrations derived from OKEAN-01 and SSM/I satellite imagery were highly correlated during winter, spring, and fall, with mean differences of less than 8.1% (S.D.<15%) for the NASA Team algorithm, and less than 2.8% (S.D.<13.8%) for the Bootstrap algorithm. Respective differences between SSM/I NASA Team and SSM/I Bootstrap total concentrations were less than 5.3% (S.D.<6.9%). Monthly mean differences between SSM/I and OKEAN differed annually by less than 6%, with smaller differences primarily in winter. The NASA Team and Bootstrap algorithms underestimated the total sea ice concentrations relative to the RADARSAT ScanSAR no more than 3.0% (S.D.<9%) and 1.2% (S.D.<7.5%) during cold months, and no more than 12% and 7% during summer, respectively. ScanSAR tended to estimate higher ice concentrations for ice concentrations greater than 50%, when compared to SSM/I during all months. ScanSAR underestimated total sea ice concentration by 2% compared to the OKEAN-01 algorithm during cold months, and gave an overestimation by 2% during spring and summer months. Total NASA Team and Bootstrap sea ice concentration estimates derived from coincident SSM/I and OKEAN-01 data demonstrated mean differences of no more than 5.3% (S.D.<7%), 3.1% (S.D.<5.5%), 2.0% (S.D.<5.5%), and 7.3% (S.D.<10%) for fall, winter, spring, and summer periods, respectively. Large disagreements were observed between the OKEAN and NASA Team results in spring and summer for estimates of the first-year (FY) and multiyear (MY) age classes. 
The OKEAN-01 algorithm and data tended to estimate, on average, lower concentrations of young or FY ice and higher concentrations of total and MY ice for all months and seasons. Our results contribute to the growing body of documentation about the levels of disparity obtained when seasonal sea ice concentrations are estimated using various types of satellite data and algorithms.
Soendergaard, Helle M; Thomsen, Per H; Pedersen, Pernille; Pedersen, Erik; Poulsen, Agnethe E; Nielsen, Jette M; Winther, Lars; Henriksen, Anne; Rungoe, Berit; Soegaard, Hans J
2016-02-01
Knowledge of factors associated with treatment dropout and missed appointments in adults with attention-deficit/hyperactivity disorder (ADHD) is very limited. On the basis of proposed hypotheses that past behavior patterns are more predictive of current behaviors of treatment dropout and missed appointments than are sociodemographic and clinical characteristics, we examined the associations of sociodemographic variables, clinical variables, risk-taking behavior, educational and occupational instability, and behaviors during mandatory schooling with the primary outcome measures of treatment dropout and missed appointments. In a naturalistic cohort study of 151 adult outpatients with ADHD initiating assessment in a Danish ADHD unit from September 1, 2010, to September 1, 2011, the Adult ADHD Self-Report Scale v1.1 symptom checklist (ASRS) and a thorough clinical interview were used to assess ADHD according to DSM-IV-TR criteria. Stepwise logistic regression analysis was used to estimate reported associations. A total of 27% of patients dropped out of treatment and a total of 42% had ≥ 3 missed appointments during treatment. Mood and anxiety disorders significantly lowered the odds of treatment dropout (odds ratio [OR] = 0.18; 95% confidence interval [CI], 0.05-0.65), whereas having started but not completed 2 or more educational programs apart from mandatory schooling significantly increased the odds of dropout (OR = 3.01; 95% CI, 1.32-6.89). Variables significantly associated with most missed appointments were low educational level (OR = 2.19; 95% CI, 1.12-4.31), 3 or more employments of less than 3 months' duration (OR = 2.86; 95% CI, 1.30-6.28), and having skipped class often/very often during mandatory schooling (OR = 2.65; 95% CI, 1.29-5.43). Additionally, the predominantly inattentive ADHD (ADHD-I) subtype lowered the odds of missed appointments (OR = 0.17; 95% CI, 0.05-0.62). Our results suggest that past behavior in terms of highest dropout rates in the educational and occupational systems and highest rates of skipping class during mandatory schooling is equally associated with current behavior of treatment dropout and missed appointments as are sociodemographic and clinical factors. ClinicalTrials.gov identifier: NCT02226445. © Copyright 2015 Physicians Postgraduate Press, Inc.
A cluster bootstrap for two-loop MHV amplitudes
Golden, John; Spradlin, Marcus
2015-02-02
We apply a bootstrap procedure to two-loop MHV amplitudes in planar $\mathcal{N} = 4$ super-Yang-Mills theory. We argue that the mathematically most complicated part (the $\Lambda^2 B_2$ coproduct component) of the n-particle amplitude is uniquely determined by a simple cluster algebra property together with a few physical constraints (dihedral symmetry, analytic structure, supersymmetry, and well-defined collinear limits). Finally, we present a concise, closed-form expression which manifests these properties for all n.
Wrappers for Performance Enhancement and Oblivious Decision Graphs
1995-09-01
always select all relevant features. We test different search engines to search the space of feature subsets and introduce compound operators to speed...distinct instances from the original dataset appearing in the test set is thus 0.632m. The ε0i accuracy estimate is derived by using bootstrap sample...i for training and the rest of the instances for testing. Given a number b, the number of bootstrap samples, let ε0i be the accuracy estimate for
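For context, the ε0/.632 estimator referred to combines resubstitution accuracy with out-of-bag bootstrap accuracy; a minimal sketch on synthetic data (the classifier choice and data are arbitrary):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(9)

X = rng.normal(size=(150, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
m = len(y)

clf = KNeighborsClassifier().fit(X, y)
acc_resub = clf.score(X, y)        # resubstitution accuracy (optimistic)

b = 100
eps0 = np.empty(b)
for i in range(b):
    idx = rng.integers(0, m, m)                 # bootstrap sample i
    oob = np.setdiff1d(np.arange(m), idx)       # ~36.8% of instances left out
    clf_i = KNeighborsClassifier().fit(X[idx], y[idx])
    eps0[i] = clf_i.score(X[oob], y[oob])       # epsilon-0_i estimate

acc_632 = 0.368 * acc_resub + 0.632 * eps0.mean()
print(f"resub {acc_resub:.3f}, eps0 {eps0.mean():.3f}, .632 {acc_632:.3f}")
```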
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
NASA Technical Reports Server (NTRS)
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-01-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify. In many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method because it requires repeating the original data analysis procedure several times using replicate datasets. This is also commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are small in the vast majority of cases and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs, they are larger than the acceleration itself.
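A rough sketch of this kind of residual-bootstrap error estimate, assuming quadratic CME kinematics and invented height-time numbers (the catalog's actual measurement procedure is manual, so this only mimics the refitting step):

```python
import numpy as np

rng = np.random.default_rng(10)

# Height-time points for one CME (solar radii vs hours), quadratic kinematics.
t = np.linspace(0, 5, 12)
h = 3 + 1.8 * t + 0.05 * t**2 + rng.normal(0, 0.15, t.size)

coef = np.polyfit(t, h, 2)              # highest degree first: [c2, c1, c0]
resid = h - np.polyval(coef, t)

v_b, a_b = np.empty(2000), np.empty(2000)
for i in range(v_b.size):
    # Resample residuals and refit, mimicking re-measured height-time plots.
    h_star = np.polyval(coef, t) + rng.choice(resid, resid.size)
    c2, c1, _ = np.polyfit(t, h_star, 2)
    v_b[i] = c1            # linear term ~ initial velocity
    a_b[i] = 2 * c2        # second derivative ~ acceleration

print(f"v = {coef[1]:.2f} ± {v_b.std(ddof=1):.2f} Rsun/hr, "
      f"a = {2 * coef[0]:.3f} ± {a_b.std(ddof=1):.3f} Rsun/hr^2")
```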
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on an experimental within design with 32 cells and 33 participants.
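SciPy's stats.bootstrap implements the bias-corrected and accelerated interval directly; a minimal sketch on a synthetic per-participant contrast (the design and numbers are illustrative, not the paper's estimator):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Per-participant condition contrast from a within design (e.g. switch cost);
# the BCa interval replaces a normal-theory confidence interval.
contrast = rng.normal(0.3, 1.0, 33)

res = stats.bootstrap(
    (contrast,),
    np.mean,
    method="BCa",            # bias-corrected and accelerated
    n_resamples=9999,
    random_state=0,
)
print(f"mean = {contrast.mean():.3f}, "
      f"95% BCa CI = [{res.confidence_interval.low:.3f}, "
      f"{res.confidence_interval.high:.3f}]")
```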
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
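A toy sketch of the proposed test using the Euclidean distance: build summary histograms from ensembles of per-object samples, then bootstrap the distance under the pooled null (all distributions below are invented, not cloud-object data):

```python
import numpy as np

rng = np.random.default_rng(12)

def summary_hist(samples, bins):
    """Average the normalized per-object histograms into a summary histogram."""
    h = np.zeros(len(bins) - 1)
    for s in samples:
        counts, _ = np.histogram(s, bins=bins)
        h += counts / counts.sum()
    return h / len(samples)

bins = np.linspace(0, 1, 11)
# Two cloud-object size categories, each an ensemble of per-object samples.
cat_a = [rng.beta(2.0, 5, rng.integers(50, 200)) for _ in range(80)]
cat_b = [rng.beta(2.3, 5, rng.integers(50, 200)) for _ in range(60)]

d_obs = np.linalg.norm(summary_hist(cat_a, bins) - summary_hist(cat_b, bins))

# Bootstrap under the null: resample objects from the pooled ensemble.
pooled = cat_a + cat_b
d_null = np.empty(2000)
for i in range(d_null.size):
    pa = [pooled[j] for j in rng.integers(0, len(pooled), len(cat_a))]
    pb = [pooled[j] for j in rng.integers(0, len(pooled), len(cat_b))]
    d_null[i] = np.linalg.norm(summary_hist(pa, bins) - summary_hist(pb, bins))

print(f"Euclidean distance = {d_obs:.4f}, p = {(d_null >= d_obs).mean():.3f}")
```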
Image analysis of representative food structures: application of the bootstrap method.
Ramírez, Cristian; Germain, Juan C; Aguilera, José M
2009-08-01
Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of the bootstrap taking sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between the CV(image) and CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of the whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of the whole image) became necessary. This suggests that a large dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
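A small sketch of the sub-area bootstrap on a synthetic binary micrograph (the circle count and image size loosely follow the simulated structures described, but the details are assumptions): CV(Bn) falls as the sampling area grows.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic "micrograph": 225 random circles (elements) on a 1200x1200 grid.
img = np.zeros((1200, 1200), bool)
yy, xx = np.mgrid[0:1200, 0:1200]
for _ in range(225):
    cy, cx = rng.integers(0, 1200, 2)
    r = rng.integers(8, 25)
    img |= (yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2

def cv_of_subareas(img, size, n=200):
    """Bootstrap: area fraction over n random size x size sub-areas."""
    vals = np.empty(n)
    for i in range(n):
        y0 = rng.integers(0, img.shape[0] - size)
        x0 = rng.integers(0, img.shape[1] - size)
        vals[i] = img[y0:y0 + size, x0:x0 + size].mean()
    return vals.std(ddof=1) / vals.mean()

for size in (100, 200, 400, 800):
    print(f"{size:>4} px sub-area: CV_Bn = {cv_of_subareas(img, size):.3f}")
```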
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven
Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
PROGRESS IN THE PEELING-BALLOONING MODEL OF ELMS: TOROIDAL ROTATION AND 3D NONLINEAR DYNAMICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
SNYDER,P.B; WILSON,H.R; XU,X.Q
2004-06-01
Understanding the physics of the H-mode pedestal and edge localized modes (ELMs) is very important to next-step fusion devices for two primary reasons: (1) the pressure at the top of the edge barrier ("pedestal height") strongly impacts global confinement and fusion performance, and (2) large ELMs lead to localized transient heat loads on material surfaces that may constrain component lifetimes. The development of the peeling-ballooning model has shed light on these issues by positing a mechanism for ELM onset and constraints on the pedestal height. The mechanism involves instability of ideal coupled "peeling-ballooning" modes driven by the sharp pressure gradient and consequent large bootstrap current in the H-mode edge. It was first investigated in the local, high-n limit [1], and later quantified for non-local, finite-n modes in general toroidal geometry [2,3]. Important aspects are that a range of wavelengths may potentially be unstable, with intermediate n's (n ≈ 3-30) generally limiting in high-performance regimes, and that stability bounds are strongly sensitive to shape [Fig. 1(a)] and to collisionality (i.e., temperature and density) [4] through the bootstrap current. The development of efficient MHD stability codes such as ELITE [3,2] and MISHKA [5] has allowed detailed quantification of peeling-ballooning stability bounds (e.g., [6]) and extensive and largely successful comparisons with observation (e.g., [2,6-9]). These previous calculations are ideal, static, and linear. Here we extend this work to incorporate the impact of sheared toroidal rotation, and the non-ideal, nonlinear dynamics which must be studied to quantify ELM size and heat deposition on material surfaces.
Akl, Elie A; Kahale, Lara A; Ebrahim, Shanil; Alonso-Coello, Pablo; Schünemann, Holger J; Guyatt, Gordon H
2016-08-01
To categorize the challenges in determining the extent of missing participant data in randomized trials and suggest potential solutions for systematic review authors. During the process of updating a series of Cochrane systematic reviews on the topic of anticoagulation in patients with cancer, we identified challenges using an iterative approach and used a consensus process to agree on the challenges identified and to suggest potential ways of dealing with them. The five systematic reviews included 58 trials and 75 meta-analyses for patient-important dichotomous outcomes with 27,037 randomized participants. We identified three categories of challenges: (1) Although systematic reviewers require information about missing data to be reported by outcome, trialists typically report the information by participant; (2) It is not always clear whether the trialists followed up participants in certain categories (e.g., noncompliers), that is, whether some categories of participants did or did not have missing data; (3) It is not always clear how the trialists dealt with missing data in their analysis (e.g., exclusion from the denominator vs. assumptions made for the numerator). We discuss potential solutions for each one of these challenges and suggest further research work. Current reporting of missing data is often not explicit and transparent, and although our potential solutions to problems of suboptimal reporting may be helpful, reliable and valid characterization of the extent and nature of missing data remains elusive. Reporting of missing data in trials needs further improvement. Copyright © 2016 Elsevier Inc. All rights reserved.
Lyratzopoulos, G; Vedsted, P; Singh, H
2015-01-01
The diagnosis of cancer is a complex, multi-step process. In this paper, we highlight factors involved in missed opportunities to diagnose cancer more promptly in symptomatic patients and discuss responsible mechanisms and potential strategies to shorten intervals from presentation to diagnosis. Missed opportunities are instances in which post-hoc judgement indicates that alternative decisions or actions could have led to more timely diagnosis. They can occur in any of the three phases of the diagnostic process (initial diagnostic assessment; diagnostic test performance and interpretation; and diagnostic follow-up and coordination) and can involve patient, doctor/care team, and health-care system factors, often in combination. In this perspective article, we consider epidemiological ‘signals' suggestive of missed opportunities and draw on evidence from retrospective case reviews of cancer patient cohorts to summarise factors that contribute to missed opportunities. Multi-disciplinary research targeting such factors is important to shorten diagnostic intervals post presentation. Insights from the fields of organisational and cognitive psychology, human factors science and informatics can be extremely valuable in this emerging research agenda. We provide a conceptual foundation for the development of future interventions to minimise the occurrence of missed opportunities in cancer diagnosis, enriching current approaches that chiefly focus on clinical decision support or on widening access to investigations. PMID:25734393
A pilot study on association between phthalate exposure and missed miscarriage.
Yi, H; Gu, H; Zhou, T; Chen, Y; Wang, G; Jin, Y; Yuan, W; Zhao, H; Zhang, L
2016-05-01
The incidence of missed miscarriage has been increasing during the past decade in China and the etiology of about half of the cases remains unclear. Exposure to phthalates has been considered as a risk factor. The aim of this paper is to assess the association between exposure to phthalates and missed miscarriage. A case-control study was performed including 150 cases of missed miscarriage and 150 matched controls with normal pregnancies. The levels of phthalate exposure were compared between the two groups by measuring 13 phthalate metabolites in urine samples. Blood samples were collected for serum hormone measurement to assess the relationship between serum hormone level and phthalate exposure. The urinary levels of metabolites of di-(2-ethylhexyl) phthalate (DEHP) and dimethyl phthalate (DMP) were significantly higher in the cases than in the controls. A strong dose-response relationship was observed between urinary metabolite levels and the odds of missed miscarriage. Monomethyl phthalate (MMP), a metabolite of DMP, and mono-2-ethylhexyl phthalate (MEHP), a metabolite of DEHP, each had significant negative correlation with maternal serum hormone levels. In the current study, exposure to DEHP and DMP was found to be associated with missed miscarriage. Interruption of hormone synthesis by DMP and DEHP metabolites represents a plausible mechanism of phthalate reproductive toxicity.
Konias, Sokratis; Chouvarda, Ioanna; Vlahavas, Ioannis; Maglaveras, Nicos
2005-09-01
Current approaches for mining association rules usually assume that the mining is performed in a static database, where the problem of missing attribute values does not practically exist. However, these assumptions are not preserved in some medical databases, like in a home care system. In this paper, a novel uncertainty rule algorithm is illustrated, namely URG-2 (Uncertainty Rule Generator), which addresses the problem of mining dynamic databases containing missing values. This algorithm requires only one pass over the initial dataset in order to generate the itemsets, while new metrics corresponding to the notions of Support and Confidence are used. URG-2 was evaluated over two medical databases, introducing randomly multiple missing values for each record's attribute (rate: 5-20% by 5% increments) in the initial dataset. Compared with the classical approach (records with missing values are ignored), the proposed algorithm was more robust in mining rules from datasets containing missing values. In all cases, the difference in preserving the initial rules ranged between 30% and 60% in favour of URG-2. Moreover, due to its incremental nature, URG-2 saved over 90% of the time required for thorough re-mining. Thus, the proposed algorithm can offer a preferable solution for mining in dynamic relational databases.
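The abstract does not spell out URG-2's adjusted metrics, but the underlying idea (rescaling support and confidence so that records missing the referenced attributes drop out of the denominator) can be sketched as follows; the adjustment shown is an assumption for illustration, not the published URG-2 definition.

```python
import pandas as pd

def support_conf(df, antecedent, consequent):
    """Support/confidence computed over records where the needed attributes
    are observed. antecedent/consequent are dicts of {column: value}; rows
    with a missing value in any referenced column are excluded from the
    denominator rather than counted as non-matches (hypothetical adjustment).
    """
    cols = list(antecedent) + list(consequent)
    observed = df.dropna(subset=cols)            # only fully observed records
    ante = (observed[list(antecedent)] == pd.Series(antecedent)).all(axis=1)
    both = ante & (observed[list(consequent)] == pd.Series(consequent)).all(axis=1)
    support = both.mean() if len(observed) else 0.0
    confidence = both.sum() / ante.sum() if ante.sum() else 0.0
    return support, confidence

df = pd.DataFrame({"fever": [1, 1, None, 0, 1], "flu": [1, None, 1, 0, 1]})
print(support_conf(df, {"fever": 1}, {"flu": 1}))
```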
49 CFR 570.58 - Electric brake system.
Code of Federal Regulations, 2011 CFR
2011-10-01
... manufacturer's maximum current rating. In progressing from zero to maximum, the ammeter indication shall show no fluctuation evidencing a short circuit or other interruption of current. (1) Inspection procedure... missing. Terminal connections shall be clean. Conductor wire gauge shall not be below the brake...
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in the food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive models (GAM) and the bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. The bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for the online fault diagnosis in abnormal fermentation processes of glutamate, and a fault was defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of a fault in the fermentation process, but also its end when the fermentation conditions were back to normal. The proposed approach used only a small sample set from normal fermentation experiments to establish the approach, and then required only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
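The fault rule described above (flag observations that fall outside a bootstrap 95% band around the fitted production curve) can be sketched with a residual bootstrap. The cubic polynomial below is a stand-in for the paper's GAM, and all data are simulated; none of the numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_smoother(t, y, deg=3):
    # Stand-in for the paper's GAM: a cubic polynomial in fermentation time.
    return np.polynomial.Polynomial.fit(t, y, deg)

# Simulated "normal" run: glutamate production versus time (logistic-like).
t = np.linspace(0, 40, 80)
y = 50 / (1 + np.exp(-(t - 20) / 4)) + rng.normal(0, 1.0, t.size)

model = fit_smoother(t, y)
resid = y - model(t)

# Residual bootstrap: resample residuals, refit, and add noise back to form
# a rough 95% prediction band around the normal-operation curve.
B = 1000
boots = np.empty((B, t.size))
for b in range(B):
    y_star = model(t) + rng.choice(resid, size=t.size, replace=True)
    boots[b] = fit_smoother(t, y_star)(t) + rng.choice(resid, size=t.size, replace=True)
lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)

# Online fault rule: flag observations that leave the band.
y_new = y.copy()
y_new[50:60] -= 8.0                     # injected fault
fault = (y_new < lo) | (y_new > hi)
print("fault flagged at t =", t[fault])
```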
Wang, Jung-Han; Abdel-Aty, Mohamed; Wang, Ling
2017-07-01
There have been plenty of studies intended to use different methods, for example, empirical Bayes before-after methods, to obtain accurate estimates of CMFs. All of them make different assumptions about the crash count that would have occurred had there been no treatment. Additionally, another major assumption is that multiple sites share the same true CMF. Under this assumption, the CMF at an individual intersection is randomly drawn from a normally distributed population of CMFs at all intersections. Since CMFs are non-zero values, the population of all CMFs might not follow a normal distribution, and even if it does, the true mean of CMFs at some intersections may differ from that at others. Therefore, a bootstrap method based on before-after empirical Bayes theory was proposed to estimate CMFs without making distributional assumptions. This bootstrap procedure has the added benefit of producing a measure of CMF stability. Furthermore, based on the bootstrapped CMF, a new CMF precision rating method was proposed to evaluate the reliability of CMFs. This study chose 29 urban four-legged intersections as treated sites, whose control was changed from stop-controlled to signal-controlled. Meanwhile, 124 urban four-legged stop-controlled intersections were selected as reference sites. At first, different safety performance functions (SPFs) were applied to five crash categories, and it was found that each crash category had a different optimal SPF form. Then, the CMFs of these five crash categories were estimated using the bootstrap empirical Bayes method. The results of the bootstrapped method showed that signalization significantly decreased Angle+Left-Turn crashes, and its CMF had the highest precision. In contrast, the CMF for Rear-End crashes was unreliable. For KABCO, KABC, and KAB crashes, the CMFs proved to be reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
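A minimal sketch of the resampling step follows, assuming hypothetical per-site observed and EB-expected crash counts; the pooled-ratio CMF estimator and the relative-width precision rating below are illustrative simplifications, not the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-site data: observed after-period crashes and the
# EB-expected crashes had no treatment occurred (from an SPF + before data).
observed_after = rng.poisson(4.0, size=29)
eb_expected = rng.gamma(shape=5.0, scale=1.2, size=29)

def cmf(obs, exp):
    return obs.sum() / exp.sum()

point = cmf(observed_after, eb_expected)

# Resample sites with replacement; no normality assumption on the CMF.
B = 5000
idx = rng.integers(0, 29, size=(B, 29))
boot = np.array([cmf(observed_after[i], eb_expected[i]) for i in idx])
lo, hi = np.percentile(boot, [2.5, 97.5])

# A crude precision indicator: narrower relative CI -> more reliable CMF.
rel_width = (hi - lo) / point
print(f"CMF = {point:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], rel. width {rel_width:.2f}")
```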
Ekedahl, Anders; Brosius, Helen; Jönsson, Julia; Karlsson, Hanna; Yngvesson, Maria
2011-11-01
To study discrepancies between (i) the prescribed current treatment stated by patients with congestive heart failure (CHF) compared with patients with other chronic diseases, (ii) the data in the medication list (ML) in the electronic medical record and (iii) the data in the prescription list (PL) on the prescriptions stored in the national prescription repository in Sweden, to determine current, noncurrent, duplicate and missing prescriptions. At one healthcare centre, a random sample of patients 18 years and older with a diagnosis of CHF, diabetes mellitus (DM) or osteoarthritis (OA) provided written informed consent to participate. Participants were interviewed by telephone on the prescribed current treatment. Of 161 invited patients (61 CHF, 50 DM and 50 OA), 66 patients were included. More than 80% of the patients had at least one discrepancy, a noncurrent, a duplicate or a missing prescription, in the ML and PL. The overall congruence for unique prescriptions on current treatment between the ML and the PL was only 55%. Patients with CHF had overall more discrepancies and patients with DM fewer discrepancies in the ML. Prescriptions for noncurrent treatment, duplicates and missing prescriptions are common in both the ML in the electronic medical record and the list on prescriptions stored in the Swedish National Prescription Repository. Patients with CHF had more discrepancies in the ML. The risk for medication errors in primary care due to incorrect information on prescribed treatment may be substantial. Copyright © 2011 John Wiley & Sons, Ltd.
NLSI Focus Group on Missing ALSEP Data Recovery: Progress and Plans
NASA Technical Reports Server (NTRS)
Lewis, L. R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Grayzeck, E. J.
2011-01-01
On the six Apollo landed missions, the Astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's scientists are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems trying to use the ALSEP data. In 2007 archivists from the NASA Goddard Space Flight Center (GSFC) National Space Science Data Center (NSSDC) estimated only about 50 percent of the processed ALSEP lunar surface data of interest to current lunar science investigators were in the NSSDC archives. The current-day lunar science investigators found most of the ALSEP data then in the NSSDC archives were extremely difficult to use. The data were in forms often not well described in the published reports, and re-recording anomalies existed in the data which could only be resolved by tape experts. To resolve this problem, the PDS Lunar Data Node was established in 2008 at NSSDC and is in the process of successfully making the existing archived ALSEP data available to current-day investigators in easily useable forms. In July of 2010 the NASA Lunar Science Institute (NLSI) at Ames Research Center established the Recovery of Missing ALSEP Data Focus Group in recognition of the importance of the current activities to find the raw and processed ALSEP data missing from the NSSDC archives.
Simons, Claire L; Rivero-Arias, Oliver; Yu, Ly-Mee; Simon, Judit
2015-04-01
Missing data are a well-known and widely documented problem in cost-effectiveness analyses alongside clinical trials using individual patient-level data. Current methodological research recommends multiple imputation (MI) to deal with missing health outcome data, but there is little guidance on whether MI for multi-attribute questionnaires, such as the EQ-5D-3L, should be carried out at the domain or at the summary score level. In this paper, we evaluated the impact of imputing individual domains versus imputing index values to deal with missing EQ-5D-3L data using a simulation study and developed recommendations for future practice. We simulated missing data in a patient-level dataset with complete EQ-5D-3L data at one point in time from a large multinational clinical trial (n = 1,814). Different proportions of missing data were generated using a missing at random (MAR) mechanism and three different scenarios were studied. The performance of each method was evaluated using the root mean squared error and mean absolute error of the actual versus predicted EQ-5D-3L indices. For large sample sizes (n > 500) and a missing data pattern that follows mainly unit non-response, imputing domains or the index produced similar results. However, domain imputation became more accurate than index imputation when the pattern of missingness followed item non-response. For smaller sample sizes (n < 100), index imputation was more accurate. When MI models were misspecified, both domain and index imputations were inaccurate for any proportion of missing data. The decision between imputing the domains or the EQ-5D-3L index scores depends on the observed missing data pattern and the sample size available for analysis. Analysts conducting this type of exercise should also evaluate the sensitivity of the analysis to the MAR assumption and whether the imputation model is correctly specified.
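The domain-versus-index question can be illustrated with a toy simulation. The sketch below uses scikit-learn's IterativeImputer and a made-up linear index formula standing in for the EQ-5D-3L tariff; only the comparison logic, not the numbers, reflects the study.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(2)

# Toy stand-in for EQ-5D-3L: five correlated ordinal domains scored 1-3 and
# a made-up linear index = 1 - 0.1 * sum(domain - 1) (not the published tariff).
n = 1000
latent = rng.normal(0.0, 0.8, size=n)
domains = np.clip(np.round(2 + latent[:, None] + rng.normal(0, 0.5, (n, 5))), 1, 3)
index = 1.0 - 0.1 * (domains - 1).sum(axis=1)

# Item non-response: one domain goes missing for ~30% of respondents.
miss = rng.random(n) < 0.3
dom_obs = domains.copy()
dom_obs[miss, 1] = np.nan
idx_obs = index.copy()
idx_obs[miss] = np.nan

# Strategy 1: impute at the domain level, then recompute the index.
dom_imp = IterativeImputer(random_state=0).fit_transform(dom_obs)
idx_from_domains = 1.0 - 0.1 * (dom_imp - 1).sum(axis=1)

# Strategy 2: impute the index directly, using observed domains as predictors.
idx_direct = IterativeImputer(random_state=0).fit_transform(
    np.column_stack([dom_obs, idx_obs]))[:, -1]

rmse = lambda est: np.sqrt(np.mean((est[miss] - index[miss]) ** 2))
print("domain-level RMSE:", rmse(idx_from_domains))
print("index-level  RMSE:", rmse(idx_direct))
```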
NASA Astrophysics Data System (ADS)
Eckhardt, Donald H.; Garrido Pestaña, José Luis
2014-06-01
The nineteenth century's quest for the missing matter (Vulcan) ended with the publication of Einstein's General Theory of Relativity. We contend that the current quest for the missing matter is parallel in its perseverance and in its ultimate futility. After setting the search for dark matter in its historic perspective, we critique extant dark matter models and offer alternative explanations -- derived from a Lorentz-invariant Lagrangian -- that will, at the very least, sow seeds of doubt about the existence of dark matter.
Cost aware cache replacement policy in shared last-level cache for hybrid memory based fog computing
NASA Astrophysics Data System (ADS)
Jia, Gangyong; Han, Guangjie; Wang, Hao; Wang, Feng
2018-04-01
Fog computing requires a large main memory capacity to decrease latency and increase the Quality of Service (QoS). However, dynamic random access memory (DRAM), the commonly used random access memory, cannot be included in a fog computing system due to its high power consumption. In recent years, non-volatile memories (NVM) such as Phase-Change Memory (PCM) and Spin-transfer torque RAM (STT-RAM), with their low power consumption, have emerged to replace DRAM. Moreover, the currently proposed hybrid main memory, consisting of both DRAM and NVM, has shown promising advantages in terms of scalability and power consumption. However, the drawbacks of NVM, such as long read/write latency, give rise to potential problems leading to asymmetric cache misses in the hybrid main memory. Current last level cache (LLC) policies are based on a unified miss cost, and result in poor LLC performance and add to the cost of using NVM. In order to minimize the cache miss cost in the hybrid main memory, we propose a cost aware cache replacement policy (CACRP) that reduces the number of cache misses from NVM and improves the cache performance for a hybrid memory system. Experimental results show that our CACRP behaves better in LLC performance, improving performance by up to 43.6% (15.5% on average) compared to LRU.
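The idea of folding asymmetric refill costs into victim selection can be sketched with a toy set-associative cache; the cost ratio and the recency-times-cost score below are assumptions for illustration, not the CACRP algorithm.

```python
from dataclasses import dataclass

@dataclass
class Line:
    tag: int
    in_nvm: bool          # backing page lives in NVM (costlier miss) or DRAM
    last_use: int = 0

class CostAwareCache:
    """A toy cost-aware LLC replacement sketch (not the paper's CACRP).

    Victim selection penalizes evicting lines whose refill would come from
    NVM by scaling recency with a hypothetical miss-cost ratio.
    """
    NVM_COST, DRAM_COST = 3.0, 1.0     # assumed relative refill latencies

    def __init__(self, ways=4):
        self.ways, self.lines, self.clock = ways, [], 0

    def access(self, tag, in_nvm):
        self.clock += 1
        for ln in self.lines:
            if ln.tag == tag:
                ln.last_use = self.clock
                return "hit"
        if len(self.lines) >= self.ways:
            # Evict the line with the lowest (recency * refill-cost) score,
            # so stale DRAM-backed lines go before equally stale NVM lines.
            cost = lambda ln: self.NVM_COST if ln.in_nvm else self.DRAM_COST
            victim = min(self.lines, key=lambda ln: ln.last_use * cost(ln))
            self.lines.remove(victim)
        self.lines.append(Line(tag, in_nvm, self.clock))
        return "miss"

cache = CostAwareCache(ways=2)
print([cache.access(t, nvm) for t, nvm in [(1, True), (2, False), (3, False), (1, True)]])
```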
SparRec: An effective matrix completion framework of missing data imputation for GWAS
NASA Astrophysics Data System (ADS)
Jiang, Bo; Ma, Shiqian; Causey, Jason; Qiao, Linbo; Hardin, Matthew Price; Bitts, Ian; Johnson, Daniel; Zhang, Shuzhong; Huang, Xiuzhen
2016-10-01
Genome-wide association studies present computational challenges for missing data imputation, while advances in genotyping technologies are generating datasets of large sample sizes with sample sets genotyped on multiple SNP chips. We present a new framework SparRec (Sparse Recovery) for imputation, with the following properties: (1) The optimization models of SparRec, based on low rank and a low number of co-clusters of matrices, are different from current statistical methods. While our low-rank matrix completion (LRMC) model is similar to Mendel-Impute, our matrix co-clustering factorization (MCCF) model is completely new. (2) SparRec, like other matrix completion methods, is flexible enough to be applied to missing data imputation for large meta-analyses with different cohorts genotyped on different sets of SNPs, even when there is no reference panel. This kind of meta-analysis is very challenging for current statistics-based methods. (3) SparRec has consistent performance and achieves high recovery accuracy even when the missing data rate is as high as 90%. Compared with Mendel-Impute, our low-rank based method achieves similar accuracy and efficiency, while the co-clustering based method has advantages in running time. The testing results show that SparRec has significant advantages and competitive performance over other state-of-the-art existing statistics methods including Beagle and fastPhase.
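SparRec's LRMC model is not specified in the abstract; the following is a generic low-rank matrix completion sketch in the SoftImpute style (iterative SVD soft-thresholding) on simulated genotype-like data, meant only to show the class of method being discussed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy genotype-like matrix with low-rank structure and 50% missing entries.
n, p, r = 200, 100, 5
truth = rng.random((n, r)) @ rng.random((r, p)) * 2
mask = rng.random((n, p)) < 0.5                  # True where observed
X = np.where(mask, truth, np.nan)

def soft_impute(X, mask, lam=5.0, iters=100):
    """Iterative SVD soft-thresholding (a SoftImpute-style LRMC sketch)."""
    Z = np.where(mask, X, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam, 0.0)             # shrink singular values
        low_rank = (U * s) @ Vt
        Z = np.where(mask, X, low_rank)          # keep observed entries fixed
    return np.where(mask, X, low_rank)

imputed = soft_impute(X, mask)
err = np.sqrt(np.mean((imputed[~mask] - truth[~mask]) ** 2))
print("RMSE on held-out entries:", err)
```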
Peel, D; Waples, R S; Macbeth, G M; Do, C; Ovenden, J R
2013-03-01
Theoretical models are often applied to population genetic data sets without fully considering the effect of missing data. Researchers can deal with missing data by removing individuals that have failed to yield genotypes and/or by removing loci that have failed to yield allelic determinations, but despite their best efforts, most data sets still contain some missing data. As a consequence, realized sample size differs among loci, and this poses a problem for unbiased methods that must explicitly account for random sampling error. One commonly used solution for the calculation of contemporary effective population size (N_e) is to calculate the effective sample size as an unweighted mean or harmonic mean across loci. This is not ideal because it fails to account for the fact that loci with different numbers of alleles have different information content. Here we consider this problem for genetic estimators of contemporary effective population size (N_e). To evaluate bias and precision of several statistical approaches for dealing with missing data, we simulated populations with known N_e and various degrees of missing data. Across all scenarios, one method of correcting for missing data (fixed-inverse variance-weighted harmonic mean) consistently performed the best for both single-sample and two-sample (temporal) methods of estimating N_e and outperformed some methods currently in widespread use. The approach adopted here may be a starting point to adjust other population genetics methods that include per-locus sample size components. © 2012 Blackwell Publishing Ltd.
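The correction at issue can be shown numerically. The per-locus sample sizes and weights below are made up; the paper's "fixed-inverse variance" weighting is specific, so the weights here are placeholders showing only the weighted harmonic mean mechanics.

```python
import numpy as np

# Per-locus realized sample sizes (individuals genotyped at each locus).
S = np.array([48.0, 50.0, 37.0, 50.0, 44.0])

unweighted = S.mean()
harmonic = len(S) / np.sum(1.0 / S)

# Weighted harmonic mean: w_j would reflect per-locus information content
# (e.g., allele number); the weights below are placeholders, not the
# paper's fixed-inverse-variance weights.
w = np.array([3.0, 5.0, 2.0, 8.0, 4.0])
weighted_harmonic = w.sum() / np.sum(w / S)

print(unweighted, harmonic, weighted_harmonic)
```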
NASA Technical Reports Server (NTRS)
Kravchenko, Michael; O'Rourke, Mary Jane; Golden, Johnny; Finckenor, Miria; Leatherwood, Michael; Alred, John
2010-01-01
The International Space Station Materials and Processes (ISS M&P) team has multiple material samples on MISSE 6, 7 and 8 to observe Low Earth Orbit (LEO) environmental effects on Space Station materials. Optical properties, thickness/mass loss, surface elemental analysis, and visual and microscopic analysis for surface change are some of the techniques employed in this investigation. The ISS M&P team has participated in previous MISSE activities in order to better characterize the LEO effects on Space Station materials. This investigation will further this effort. Results for the following MISSE 6 sample materials will be presented: a comparison of anodize and chemical conversion coatings on various aluminum alloys, electroless nickel; AZ93 white ceramic thermal control coating with and without Teflon(TM); Hyzod(TM) polycarbonate used to temporarily protect ISS windows; Russian quartz window material; reformulated Teflon(TM)-coated Beta Cloth (Teflon(TM) without perfluorooctanoic acid (PFOA)) and a Dutch version of beta cloth. Discussion of current and future MISSE materials experiments will be presented. MISSE 7 samples are: deionized water sealed anodized aluminum Photofoil(TM); indium tin oxide (ITO)-coated Kapton(TM) used as thermo-optical surfaces; mechanically scribed tin-plated beryllium-copper samples for "tin pest" growth (alpha/beta transformation); Crew Exploration Vehicle (CEV) parachute soft goods. MISSE 8 samples: exposed "scrim cloth" (fiberglass weave) from the ISS solar array wing material, Davlyn fiberglass sleeve material, Permacel and Intertape protective tapes, and ITO-coated Kapton.
Are Expectations the Missing Link between Life History Strategies and Psychopathology?
Kavanagh, Phillip S; Kahl, Bianca L
2018-01-01
Despite advances in knowledge and thinking about using life history theory to explain psychopathology, there is still a missing link. That is, we all have a life history strategy, but not all of us develop mental health problems. We propose that the missing link is expectations: a mismatch between expected environmental conditions (including social) set by variations in life history strategies and the current environmental conditions. The mismatch hypothesis has been applied at the biological level in terms of health and disease, and we believe that it can also be applied more broadly at the psychological level in terms of perceived expectations in the social environment and the resulting distress (psychopathology) that manifests when our expectations are not met.
Detecting Breech Presentation Before Labour: Lessons From a Low-Risk Maternity Clinic.
Ressl, Bill; O'Beirne, Maeve
2015-08-01
Evaluation of fetal position is an important part of prenatal care. A woman with a breech presentation may need referral for external cephalic version, for assisted breech delivery, or to schedule a Caesarean section. In many centres, a breech presentation undetected until labour will result in an emergency Caesarean section, a less desirable alternative for both the mother and the health care system. The anecdotal reports of undiagnosed breech presentations at a busy maternity clinic prompted a study to quantify the missed breech presentations and to evaluate the effectiveness of the current detection process, with the aim of allowing no more than 1% of breech presentations to remain undetected until labour. We performed a retrospective analysis of 102 breech deliveries over a 14 month period to quantify missed breech presentations, and used a prospective physician survey documenting how fetal presentation was determined at 186 prenatal visits over four months to analyze the current detection process. We found that approximately 8% of breech presentations were undetected until labour. We concluded that within the limitations of the small sample size evaluated, the current practice of using a vaginal examination to verify fetal presentation determined by abdominal palpation (Leopold's manoeuvres) may not be more accurate than abdominal palpation alone. The current detection process resulted in an unacceptably high rate of missed breech presentations. The results of this study prompted the clinic's acquisition of bedside ultrasound capability to assess fetal position.
Elastic S-matrices in (1 + 1) dimensions and Toda field theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christe, P.; Mussardo, G.
Particular deformations of 2-D conformal field theory lead to integrable massive quantum field theories. These can be characterized by the relative scattering data. This paper proposes a general scheme for classifying the elastic nondegenerate S-matrix in (1 + 1) dimensions starting from the possible bootstrap processes and the spins of the conserved currents. Their identification with the S-matrix coming from the Toda field theory is analyzed. The authors discuss both cases of Toda field theory constructed with the simply-laced Dynkin diagrams and the nonsimply-laced ones. The authors present the results of the perturbative analysis and their geometrical interpretations.
Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.
Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K
2011-01-01
We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameter uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the Unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameter uncertainty. Validation through phantom and human subject experiments shows that our method identifies the regions with higher uncertainty in body DW-MRI model parameters correctly, with a relative error of -36% in the uncertainty values.
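Setting aside the unscented-transform rescaling, the wild-bootstrap step can be sketched for a simple mono-exponential diffusion model (a simplification of the body diffusion model the paper targets); data and noise levels are simulated.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)

def signal(b, s0, adc):
    # Mono-exponential diffusion model (simpler than the paper's body model).
    return s0 * np.exp(-b * adc)

b = np.array([0.0, 50.0, 100.0, 400.0, 800.0])
y = signal(b, 1000.0, 1.1e-3) + rng.normal(0, 15, b.size)

popt, _ = curve_fit(signal, b, y, p0=[900.0, 1e-3])
resid = y - signal(b, *popt)

# Wild bootstrap: flip each residual's sign at random (Rademacher weights),
# refit, and read parameter uncertainty off the bootstrap distribution.
B = 1000
adc_boot = np.empty(B)
for i in range(B):
    y_star = signal(b, *popt) + resid * rng.choice([-1.0, 1.0], size=b.size)
    p_star, _ = curve_fit(signal, b, y_star, p0=popt)
    adc_boot[i] = p_star[1]

print("ADC:", popt[1], "95% CI:", np.percentile(adc_boot, [2.5, 97.5]))
```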
Bootstrapping the (A1, A2) Argyres-Douglas theory
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro
2018-03-01
We apply bootstrap techniques in order to constrain the CFT data of the (A1, A2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.
López, Erick B; Yamashita, Takashi
2017-02-01
This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009 to 2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the frequencies of intake of five kinds of vegetables. Acculturation was measured by the degree of English language use at home. A path model with a bootstrapping technique was employed for the mediation analysis. A significant partial mediation relationship was identified. Greater acculturation [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)] was associated with higher income and, in turn, greater vegetable consumption. At the same time, greater acculturation was associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. Findings regarding income as a mediator of the acculturation-dietary behavior relationship inform unique intervention programs and policy changes to address health disparities by race/ethnicity.
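A bias-corrected bootstrap interval for an indirect effect of this kind can be sketched as follows, on simulated data; the BC interval omits the acceleration term, and the regression coefficients are illustrative, not estimates from NHANES.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Simulated mediation structure: acculturation -> income -> vegetables.
n = 400
accult = rng.normal(size=n)
income = 0.3 * accult + rng.normal(size=n)
veg = 0.4 * income - 0.25 * accult + rng.normal(size=n)

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                      # x -> m slope
    X = np.column_stack([np.ones_like(x), x, m])    # y on [1, x, m]
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]     # m -> y slope given x
    return a * b

est = indirect(accult, income, veg)
B = 5000
boot = np.empty(B)
for i in range(B):
    idx = rng.integers(0, n, n)
    boot[i] = indirect(accult[idx], income[idx], veg[idx])

# Bias-corrected percentile interval (no acceleration term).
z0 = norm.ppf(np.mean(boot < est))
lo, hi = norm.cdf(2 * z0 + norm.ppf([0.025, 0.975]))
print("indirect effect:", est,
      "BC 95% CI:", np.percentile(boot, [100 * lo, 100 * hi]))
```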
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants (members of the Arkansas Household Research Panel, with middle SES and an average age of 55.6 yr., SD = 13.9) showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model under non-normality conditions.
How to bootstrap a human communication system.
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs. © 2013 Cognitive Science Society, Inc.
Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better serve efficiency improvement and related decision making. PMID:26396090
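The bias-correction step that distinguishes Bootstrap-DEA from the plain BCC scores can be shown in isolation. In the Simar-Wilson procedure the bootstrap replicates come from smoothed resampling of inputs/outputs and re-solving the DEA linear programs; the replicates below are simulated placeholders so the correction itself stays visible.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical naive DEA efficiency scores for 10 hospitals, plus B bootstrap
# replicates of each score (placeholders standing in for re-solved DEA runs).
theta_hat = rng.uniform(0.6, 1.0, size=10)
theta_boot = theta_hat + np.abs(rng.normal(0.04, 0.02, size=(2000, 10)))

# Bias correction: subtract the estimated upward bias of the naive score.
bias = theta_boot.mean(axis=0) - theta_hat
theta_bc = theta_hat - bias
print(np.round(np.column_stack([theta_hat, theta_bc]), 3))
```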
NASA Astrophysics Data System (ADS)
Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.
2013-12-01
Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to lacking information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers. We capture the dependence of model output on these multipliers with the expansion-based reduced model. Next, we combined the aPC with Bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty. The usually high computational costs of accurate filtering become very feasible for our suggested aPC-based calibration framework. However, the power of aPC-based Bayesian updating strongly depends on the accuracy of prior information. In the current study, the prior assumptions on the model parameters were not satisfactory and strongly underestimated the reservoir pressure. Thus, the aPC-based response surface used in Bootstrap filtering is fitted to a distant and poorly chosen region within the parameter space. Thanks to the iterative procedure suggested in [2] we overcome this drawback with small computational costs. The iteration successively improves the accuracy of the expansion around the current estimate of the posterior distribution. The final result is a calibrated model of the site that can be used for further studies, with an excellent match to the data. References [1] Oladyshkin S. and Nowak W. Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion. Reliability Engineering and System Safety, 106:179-190, 2012. [2] Oladyshkin S., Class H., Nowak W. Bayesian updating via Bootstrap filtering combined with data-driven polynomial chaos expansions: methodology and application to history matching for carbon dioxide storage in geological formations. Computational Geosciences, 17 (4), 671-687, 2013.
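Bootstrap filtering in its simplest form weights prior samples by the data likelihood and resamples. The sketch below uses a hypothetical quadratic surrogate in place of the aPC response surface and a single made-up pressure observation.

```python
import numpy as np

rng = np.random.default_rng(7)

def surrogate(k):
    # Hypothetical quadratic response surface standing in for the aPC model:
    # maps a permeability multiplier k to a predicted reservoir pressure.
    return 10.0 + 3.0 * k + 0.5 * k ** 2

observed, sigma = 18.0, 0.5        # made-up pressure measurement and noise sd

# Bootstrap filter: weight a prior ensemble by the likelihood and resample.
N = 10000
k_prior = rng.normal(1.0, 0.5, N)                    # prior ensemble
w = np.exp(-0.5 * ((observed - surrogate(k_prior)) / sigma) ** 2)
w /= w.sum()
k_post = k_prior[rng.choice(N, size=N, p=w)]         # resampling step
print("posterior mean:", k_post.mean(), "posterior sd:", k_post.std())
```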
Bootstrapping quarks and gluons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Li, Wen; Zhao, Li-Zhong; Ma, Dong-Wang; Wang, De-Zheng; Shi, Lei; Wang, Hong-Lei; Dong, Mo; Zhang, Shu-Yi; Cao, Lei; Zhang, Wei-Hua; Zhang, Xi-Peng; Zhang, Qing-Huai; Yu, Lin; Qin, Hai; Wang, Xi-Mo; Chen, Sam Li-Sheng
2018-05-01
We aimed to predict colorectal cancer (CRC) based on the demographic features and clinical correlates of personal symptoms and signs from Tianjin community-based CRC screening data. A total of 891,199 residents who were aged 60 to 74 and were screened in 2012 were enrolled. The Lasso logistic regression model was used to identify the predictors for CRC. Predictive validity was assessed by the receiver operating characteristic (ROC) curve. A bootstrapping method was also performed to validate this prediction model. CRC was best predicted by a model that included age, sex, education level, occupation, diarrhea, constipation, colon mucosa and bleeding, gallbladder disease, a stressful life event, family history of CRC, and a positive fecal immunochemical test (FIT). The area under the curve (AUC) for the questionnaire with a FIT was 84% (95% CI: 82%-86%), followed by 76% (95% CI: 74%-79%) for a FIT alone, and 73% (95% CI: 71%-76%) for the questionnaire alone. With 500 bootstrap replications, the estimated optimism (<0.005) shows good discrimination in the validation of the prediction model. A risk prediction model for CRC based on a series of symptoms and signs related to enteric diseases in combination with a FIT was developed from the first round of screening. The results of the current study are useful for increasing the awareness of high-risk subjects and for individual-risk-guided invitations or strategies to achieve mass screening for CRC.
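The model-building recipe (L1-penalized logistic regression, apparent AUC, and a bootstrap optimism estimate) can be sketched on simulated data; the Harrell-style optimism loop below is one common reading of the abstract's "500 bootstrap replications", not the authors' exact code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)

# Simulated screening data: a few risk factors plus a FIT-like marker.
n = 2000
X = rng.normal(size=(n, 6))
logit = -4.0 + X @ np.array([0.8, 0.5, 0.3, 0.0, 0.0, 1.2])
y = rng.random(n) < 1 / (1 + np.exp(-logit))

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
lasso.fit(X, y)
apparent = roc_auc_score(y, lasso.predict_proba(X)[:, 1])

# Harrell-style optimism: refit on bootstrap samples, compare the AUC on
# the bootstrap sample with the AUC of the same model on the original data.
B = 200
optimism = np.empty(B)
for i in range(B):
    idx = rng.integers(0, n, n)
    m = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    m.fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism[i] = auc_boot - auc_orig

print("apparent AUC:", apparent,
      "optimism-corrected AUC:", apparent - optimism.mean())
```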
BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation
Kiefer, Christina; Fehlmann, Tobias; Backes, Christina
2017-01-01
Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498
Detection of Missing Proteins Using the PRIDE Database as a Source of Mass Spectrometry Evidence.
Garin-Muga, Alba; Odriozola, Leticia; Martínez-Val, Ana; Del Toro, Noemí; Martínez, Rocío; Molina, Manuela; Cantero, Laura; Rivera, Rocío; Garrido, Nicolás; Dominguez, Francisco; Sanchez Del Pino, Manuel M; Vizcaíno, Juan Antonio; Corrales, Fernando J; Segura, Victor
2016-11-04
The current catalogue of the human proteome is not yet complete, as experimental proteomics evidence is still elusive for a group of proteins known as the missing proteins. The Human Proteome Project (HPP) has been successfully using technology and bioinformatic resources to improve the characterization of such challenging proteins. In this manuscript, we propose a pipeline starting with the mining of the PRIDE database to select a group of data sets potentially enriched in missing proteins that are subsequently analyzed for protein identification with a method based on the statistical analysis of proteotypic peptides. Spermatozoa and the HEK293 cell line were found to be a promising source of missing proteins and clearly merit further attention in future studies. After the analysis of the selected samples, we found 342 PSMs, suggesting the presence of 97 missing proteins in human spermatozoa or the HEK293 cell line, while only 36 missing proteins were potentially detected in the retina, frontal cortex, aorta thoracica, or placenta. The functional analysis of the missing proteins detected confirmed their tissue specificity, and the validation of a selected set of peptides using targeted proteomics (SRM/MRM assays) further supports the utility of the proposed pipeline. As illustrative examples, DNAH3 and TEPP in spermatozoa, and UNCX and ATAD3C in HEK293 cells were some of the more robust and remarkable identifications in this study. We provide evidence indicating the relevance to carefully analyze the ever-increasing MS/MS data available from PRIDE and other repositories as sources for missing proteins detection in specific biological matrices as revealed for HEK293 cells.
Tam, Yvonne; Pearson, Luwei
2017-11-07
The Missed Opportunity tool was developed as an application in the Lives Saved Tool (LiST) to allow users to quickly compare the relative impact of interventions. Global Financing Facility (GFF) investment cases have been identified as a potential application of the Missed Opportunity analyses in the Democratic Republic of the Congo (DRC), Ethiopia, Kenya, and Tanzania, using 'lives saved' as a normative factor to set priorities. The Missed Opportunity analysis draws on data and methods in LiST to project maternal, stillbirth, and child deaths averted based on changes in interventions' coverage. Coverage of each individual intervention in LiST was automated to be scaled up from current coverage to 90% in the next year, to simulate a scenario where almost every mother and child receives the proven interventions that they need. The main outcome of the Missed Opportunity analysis is deaths averted due to each intervention. When reducing unmet need for contraception is included in the analysis, it ranks as the top missed opportunity across the four countries. When it is not included in the analysis, the top interventions with the most total deaths averted are hospital-based interventions such as labor and delivery management at the CEmOC and BEmOC levels, and full treatment and supportive care for premature babies and for sepsis/pneumonia. The Missed Opportunity tool can be used to provide a quick, first look at missed opportunities in a country or geographic region, and help identify interventions for prioritization. While it is a useful tool for advocating evidence-based priority setting, decision makers need to consider other factors that influence decision making, and also discuss how to implement, deliver, and sustain programs to achieve high coverage.
Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level
Savalei, Victoria; Rhemtulla, Mijke
2017-01-01
In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371
Missed and Delayed Diagnosis of Dementia in Primary Care: Prevalence and Contributing Factors
Bradford, Andrea; Kunik, Mark E.; Schulz, Paul; Williams, Susan P.; Singh, Hardeep
2009-01-01
Dementia is a growing public health problem for which early detection may be beneficial. Currently, the diagnosis of dementia in primary care is dependent mostly on clinical suspicion based on patient symptoms or caregivers’ concerns and is prone to be missed or delayed. We conducted a systematic review of the literature to ascertain the prevalence and contributing factors for missed and delayed dementia diagnoses in primary care. Prevalence of missed and delayed diagnosis was estimated by abstracting quantitative data from studies of diagnostic sensitivity among primary care providers. Possible predictors and contributory factors were determined from the text of quantitative and qualitative studies of patient-, caregiver-, provider-, and system-related barriers. Overall estimates of diagnostic sensitivity varied among studies and appeared to be in part a function of dementia severity, degree of patient impairment, dementia subtype, and frequency of patient-provider contact. Major contributory factors included problems with attitudes and patient-provider communication, educational deficits, and system resource constraints. The true prevalence of missed and delayed diagnoses of dementia is unknown but appears to be high. Until the case for dementia screening becomes more compelling, efforts to promote timely detection should focus on removing barriers to diagnosis. PMID:19568149
Guffey, Patrick; Szolnoki, Judit; Caldwell, James; Polaner, David
2011-07-01
Current incident reporting systems encourage retrospective reporting of morbidity and mortality and have low participation rates. A near miss is an event that did not cause patient harm, but had the potential to. By tracking and analyzing near misses, systems improvements can be targeted appropriately, and future errors may be prevented. An electronic, web-based, secure, anonymous reporting system for anesthesiologists was designed and instituted at The Children's Hospital, Denver. This portal was compared to an existing hospital incident reporting system. A total of 150 incidents were reported in the first 3 months of operation, compared with four entered in the same period one year earlier. An anesthesia-specific anonymous near-miss reporting system, which eases and facilitates data entry and can prospectively identify processes and practices that place patients at risk, was implemented at a large, academic, freestanding children's hospital. This resulted in a dramatic increase in reported events and provided data to target and drive quality and process improvement. © 2011 Blackwell Publishing Ltd.
Classifier performance prediction for computer-aided diagnosis using a limited dataset.
Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir
2008-04-01
In a practical classifier design problem, the true population is generally unknown and the available sample is finite-sized. A common approach is to use a resampling technique to estimate the performance of the classifier that will be trained with the available sample. We conducted a Monte Carlo simulation study to compare the ability of the different resampling techniques in training the classifier and predicting its performance under the constraint of a finite-sized sample. The true population for the two classes was assumed to be multivariate normal distributions with known covariance matrices. Finite sets of sample vectors were drawn from the population. The true performance of the classifier is defined as the area under the receiver operating characteristic curve (AUC) when the classifier designed with the specific sample is applied to the true population. We investigated methods based on the Fukunaga-Hayes and the leave-one-out techniques, as well as three different types of bootstrap methods, namely, the ordinary, 0.632, and 0.632+ bootstrap. The Fisher's linear discriminant analysis was used as the classifier. The dimensionality of the feature space was varied from 3 to 15. The sample size n2 from the positive class was varied between 25 and 60, while the number of cases from the negative class was either equal to n2 or 3n2. Each experiment was performed with an independent dataset randomly drawn from the true population. Using a total of 1000 experiments for each simulation condition, we compared the bias, the variance, and the root-mean-squared error (RMSE) of the AUC estimated using the different resampling techniques relative to the true AUC (obtained from training on a finite dataset and testing on the population). Our results indicated that, under the study conditions, there can be a large difference in the RMSE obtained using different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Under this type of conditions, the 0.632 and 0.632+ bootstrap methods have the lowest RMSE, indicating that the difference between the estimated and the true performances obtained using the 0.632 and 0.632+ bootstrap will be statistically smaller than those obtained using the other three resampling methods. Of the three bootstrap methods, the 0.632+ bootstrap provides the lowest bias. Although this investigation is performed under some specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited dataset.
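For reference, the 0.632 bootstrap mentioned above combines the optimistic resubstitution estimate with the pessimistic out-of-bag estimate; a sketch on simulated Gaussian data follows (the 0.632+ variant, which adds a no-information-rate adjustment, is omitted here).

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)

# Small two-class Gaussian sample, echoing the simulation setting.
n2, d = 30, 5
X = np.vstack([rng.normal(0, 1, (n2, d)), rng.normal(0.8, 1, (n2, d))])
y = np.r_[np.zeros(n2), np.ones(n2)]

clf = LinearDiscriminantAnalysis().fit(X, y)
auc_resub = roc_auc_score(y, clf.decision_function(X))    # optimistic

B, oob_aucs = 200, []
for _ in range(B):
    idx = rng.integers(0, len(y), len(y))
    oob = np.setdiff1d(np.arange(len(y)), idx)
    if len(np.unique(y[idx])) < 2 or len(np.unique(y[oob])) < 2:
        continue                                          # need both classes
    m = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
    oob_aucs.append(roc_auc_score(y[oob], m.decision_function(X[oob])))

auc_oob = np.mean(oob_aucs)                               # pessimistic
auc_632 = 0.368 * auc_resub + 0.632 * auc_oob             # 0.632 bootstrap
print("resub:", auc_resub, "oob:", auc_oob, "0.632:", auc_632)
```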
Orlowski, Henning V; Klauer, Thomas; Freyberger, Harald J; Seidler, Günter H; Kuwert, Philipp
2013-01-01
Despite today's extensive research on the psychosocial consequences of World War II, the group of wives and children whose husbands or fathers went "missing in action" during the Second World War has yet to be studied systematically in Germany. The present review article shows the special role the wives, and in particular the children, of missing German soldiers played in society and discusses the impact that their loved ones being unaccounted for has had on the mental health of this group. An overview of current research on the psychosocial status of the war generation is given following a short historical introduction to the theme. Subsequently, we discuss the legal and social situation of the families of missing German soldiers during the postwar decades. Finally, two psychological concepts drawn from US research show that specific disorders, such as complicated grief or "boundary ambiguity," can occur in the relatives of missing persons and blur the line between hope and grief occurring as a result of ambiguous loss. The psychosocial impact of having a relative go missing has hardly been noticed in the German research tradition after World War II. Particularly in light of the age structure of those directly affected and the experiences of transgenerational transmission, this neglected psychosocial research subject urgently needs further scientific investigation, inasmuch as the age of the family members still allows it.
Results of Database Studies in Spine Surgery Can Be Influenced by Missing Data.
Basques, Bryce A; McLynn, Ryan P; Fice, Michael P; Samuel, Andre M; Lukasiewicz, Adam M; Bohl, Daniel D; Ahn, Junyoung; Singh, Kern; Grauer, Jonathan N
2017-12-01
National databases are increasingly being used for research in spine surgery; however, one limitation of such databases that has received sparse mention is the frequency of missing data. Studies using these databases often do not emphasize the percentage of missing data for each variable used and do not specify how patients with missing data are incorporated into analyses. This study uses the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database to examine whether different treatments of missing data can influence the results of spine studies. (1) What is the frequency of missing data fields for demographics, medical comorbidities, preoperative laboratory values, operating room times, and length of stay recorded in ACS-NSQIP? (2) Using three common approaches to handling missing data, how frequently do those approaches agree in terms of finding particular variables to be associated with adverse events? (3) Do different approaches to handling missing data influence the outcomes and effect sizes of an analysis testing for an association with these variables with occurrence of adverse events? Patients who underwent spine surgery between 2005 and 2013 were identified from the ACS-NSQIP database. A total of 88,471 patients undergoing spine surgery were identified. The most common procedures were anterior cervical discectomy and fusion, lumbar decompression, and lumbar fusion. Demographics, comorbidities, and perioperative laboratory values were tabulated for each patient, and the percent of missing data was noted for each variable. These variables were tested for an association with "any adverse event" using three separate multivariate regressions that used the most common treatments for missing data. In the first regression, patients with any missing data were excluded. In the second regression, missing data were treated as a negative or "reference" value; for continuous variables, the mean of each variable's reference range was computed and imputed. In the third regression, any variables with > 10% rate of missing data were removed from the regression; among variables with ≤ 10% missing data, individual cases with missing values were excluded. The results of these regressions were compared to determine how the different treatments of missing data could affect the results of spine studies using the ACS-NSQIP database. Of the 88,471 patients, as many as 4441 (5%) had missing elements among demographic data, 69,184 (72%) among comorbidities, 70,892 (80%) among preoperative laboratory values, and 56,551 (64%) among operating room times. Considering the three different treatments of missing data, we found different risk factors for adverse events. Of 44 risk factors found to be associated with adverse events in any analysis, only 15 (34%) of these risk factors were common among the three regressions. The second treatment of missing data (assuming "normal" value) found the most risk factors (40) to be associated with any adverse event, whereas the first treatment (deleting patients with missing data) found the fewest associations at 20. Among the risk factors associated with any adverse event, the 10 with the greatest effect size (odds ratio) by each regression were ranked. Of the 15 variables in the top 10 for any regression, six of these were common among all three lists. Differing treatments of missing data can influence the results of spine studies using the ACS-NSQIP. 
The current study highlights the importance of considering how such missing data are handled. Until there are better guidelines on the best approaches to handle missing data, investigators should report how missing data were handled to increase the quality and transparency of orthopaedic database research. Readers of large database studies should note whether handling of missing data was addressed and consider potential bias with high rates or unspecified or weak methods for handling missing data.
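For readers who want to see the three missing-data treatments concretely, the sketch below reproduces them on simulated data. It is a minimal illustration, not the study's code: the column names, the 30% missingness rate, and the reference value of 4.4 for albumin are invented for the example.

```python
# Three common treatments of missing data in registry-style analyses (sketch).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(55, 12, n),
    "albumin": np.where(rng.random(n) < 0.3, np.nan, rng.normal(4.0, 0.5, n)),
    "adverse_event": rng.integers(0, 2, n),
})

def fit(data, cols):
    X = sm.add_constant(data[cols])
    return sm.Logit(data["adverse_event"], X).fit(disp=0).params

# Treatment 1: exclude any patient with missing data (complete-case analysis).
t1 = fit(df.dropna(), ["age", "albumin"])

# Treatment 2: impute the midpoint of a reference range for missing values.
df2 = df.assign(albumin=df["albumin"].fillna(4.4))   # 4.4 is illustrative
t2 = fit(df2, ["age", "albumin"])

# Treatment 3: drop variables with >10% missing; complete-case on the rest.
keep = [c for c in ["age", "albumin"] if df[c].isna().mean() <= 0.10]
t3 = fit(df.dropna(subset=keep), keep)

print(t1, t2, t3, sep="\n")   # coefficients can differ across treatments
```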
Kinetic effects on the currents determining the stability of a magnetic island in tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poli, E., E-mail: emanuele.poli@ipp.mpg.de; Bergmann, A.; Casson, F. J.
The role of the bootstrap and polarization currents for the stability of neoclassical tearing modes is investigated employing both a drift kinetic and a gyrokinetic approach. The adiabatic response of the ions around the island separatrix implies, for island widths below or around the ion thermal banana width, density flattening for islands rotating at the ion diamagnetic frequency, while for islands rotating at the electron diamagnetic frequency the density is unperturbed and the only contribution to the neoclassical drive arises from electron temperature flattening. As for the polarization current, the full inclusion of finite orbit width effects in the calculation of the potential developing in a rotating island leads to a smoothing of the discontinuous derivatives exhibited by the analytic potential on which the polarization term used in the modeling is based. This leads to a reduction of the polarization-current contribution with respect to the analytic estimate, in line with other studies. Other contributions to the perpendicular ion current, related to the response of the particles around the island separatrix, are found to compete with or even dominate the polarization-current term for realistic island rotation frequencies.
Identity, Intersectionality, and Mixed-Methods Approaches
ERIC Educational Resources Information Center
Harper, Casandra E.
2011-01-01
In this article, the author argues that current strategies to study and understand students' identities fall short of fully capturing their complexity. A multi-dimensional perspective and a mixed-methods approach can reveal nuance that is missed with current approaches. The author offers an illustration of how mixed-methods research can promote a…
Maternal Depression and Trait Anger as Risk Factors for Escalated Physical Discipline
Shay, Nicole L.; Knutson, John F.
2008-01-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory–Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline. PMID:18174347
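The bootstrap test of indirect effects described above follows a standard recipe. The sketch below computes a percentile bootstrap confidence interval for the a×b indirect effect on simulated data; the variable names and effect sizes are illustrative assumptions, not the study's data.

```python
# Percentile bootstrap for an indirect (mediated) effect a*b (sketch).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 122                                            # matches the sample size above
depression = rng.normal(size=n)
anger = 0.5 * depression + rng.normal(size=n)      # path a (simulated)
discipline = 0.4 * anger + rng.normal(size=n)      # path b (simulated)

def indirect_effect(idx):
    d, m, y = depression[idx], anger[idx], discipline[idx]
    a = sm.OLS(m, sm.add_constant(d)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, d]))).fit().params[1]
    return a * b

boots = [indirect_effect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"a*b 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")  # excludes 0 => mediation
```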
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Xian-Qu; Zhang, Rui-Bin; Meng, Guo
2016-07-15
The destabilization of ideal internal kink modes by trapped fast particles in tokamak plasmas with a “shoulder”-like equilibrium current is investigated. It is found that energetic particle branch of the mode is unstable with the driving of fast-particle precession drifts and corresponds to a precessional fishbone. The mode with a low stability threshold is also more easily excited than the conventional precessional fishbone. This is different from earlier studies for the same equilibrium in which the magnetohydrodynamic (MHD) branch of the mode is stable. Furthermore, the stability and characteristic frequency of the mode are analyzed by solving the dispersion relationmore » and comparing with the conventional fishbone. The results suggest that an equilibrium with a locally flattened q-profile, may be modified by localized current drive (or bootstrap current, etc.), is prone to the onset of the precessional fishbone branch of the mode.« less
Joling, Karlijn J; Bosmans, Judith E; van Marwijk, Harm W J; van der Horst, Henriëtte E; Scheltens, Philip; MacNeil Vroomen, Janet L; van Hout, Hein P J
2013-09-22
Dementia imposes a heavy burden on health and social care systems as well as on family caregivers who provide a substantial portion of the care. Interventions that effectively support caregivers may prevent or delay patient institutionalization and hence be cost-effective. However, evidence about the cost-effectiveness of such interventions is scarce. The aim of this study was to evaluate the cost-effectiveness of a family meetings intervention for family caregivers of dementia patients in comparison with usual care over a period of 12 months. The economic evaluation was conducted from a societal perspective alongside a randomized trial of 192 primary caregivers with community-dwelling dementia patients. Outcome measures included the Quality Adjusted Life-Years (QALY) of caregivers and patients and the incidence of depression and anxiety disorders in caregivers. Missing cost and effect data were imputed using multiple imputation. Bootstrapping was used to estimate uncertainty around the cost-differences and the incremental cost-effectiveness ratio (ICER). The bootstrapped cost-effect pairs were plotted on a cost-effectiveness plane and used to estimate cost-effectiveness acceptability curves. No significant differences in costs and effects between the groups were found. At 12 months, total costs per patient and primary caregiver dyad were substantial: €77,832 for the intervention group and €75,201 for the usual care group (adjusted mean difference per dyad €4,149, 95% CI -13,371 to 21,956, ICER 157,534). The main cost driver was informal care (66% of total costs), followed by patients' day treatment and costs of hospital and long-term care facility admissions (23%). Based on the cost-effectiveness acceptability curves, the maximum probability that the intervention was considered cost-effective in comparison with usual care reached 0.4 for the outcome QALY per patient-caregiver dyad and 0.6 for the caregivers' incidence of depression and/or anxiety disorders, regardless of the willingness to pay. The annual costs of caring for a person with dementia were substantial, with informal care being by far the largest contributor to the total societal costs. Based on this study, family meetings cannot be considered a cost-effective intervention strategy in comparison with usual care. ISRCTN register, ISRCTN90163486.
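The bootstrap machinery described here (resampled cost-effect pairs and acceptability curves) follows a standard pattern. Below is a minimal sketch on simulated patient-level data; the per-arm sample size, variances, and willingness-to-pay grid are assumptions chosen only to make the example run.

```python
# Bootstrapped cost-effect pairs and a cost-effectiveness acceptability curve.
import numpy as np

rng = np.random.default_rng(2)
n = 96                                             # per-arm size (illustrative)
cost_int = rng.normal(77832, 20000, n)             # intervention arm costs (EUR)
qaly_int = rng.normal(0.60, 0.20, n)
cost_uc = rng.normal(75201, 20000, n)              # usual-care arm costs (EUR)
qaly_uc = rng.normal(0.58, 0.20, n)

pairs = np.empty((5000, 2))
for b in range(5000):                              # resample each arm separately
    i = rng.integers(0, n, n)
    j = rng.integers(0, n, n)
    pairs[b] = (cost_int[i].mean() - cost_uc[j].mean(),
                qaly_int[i].mean() - qaly_uc[j].mean())

# Acceptability curve: P(net monetary benefit > 0) at each willingness to pay.
for wtp in (0, 20_000, 50_000, 80_000):
    p = np.mean(wtp * pairs[:, 1] - pairs[:, 0] > 0)
    print(f"WTP EUR {wtp:>6}: P(cost-effective) = {p:.2f}")
```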
NASA Astrophysics Data System (ADS)
Iwashita, Fabio; Brooks, Andrew; Spencer, John; Borombovits, Daniel; Curwen, Graeme; Olley, Jon
2015-04-01
Assessing bank stability using geotechnical models traditionally involves the laborious collection of data on the bank and floodplain stratigraphy, as well as in-situ geotechnical data for each sedimentary unit within a river bank. The application of geotechnical bank stability models is limited to those sites where extensive field data have been collected, so their ability to provide predictions of bank erosion at the reach scale is limited without a very extensive and expensive field data collection program. Some challenges in the construction and application of riverbank erosion and hydraulic numerical models are their one-dimensionality, steady-state requirements, lack of calibration data, and nonuniqueness. Numerical models can also be too rigid to detect unexpected features such as the onset of trends, non-linear relations, or patterns restricted to sub-samples of a data set. These shortcomings create the need for an alternative modelling approach capable of using the available data. The Self-Organizing Maps (SOM) approach is well-suited to the analysis of noisy, sparse, nonlinear, multidimensional, and scale-dependent data. It is a type of unsupervised artificial neural network with hybrid competitive-cooperative learning. In this work we present a method that uses a database of geotechnical data collected at over 100 sites throughout Queensland, Australia, to develop a modelling approach that enables geotechnical parameters (soil effective cohesion, friction angle, soil erodibility and critical stress) to be derived from sediment particle size distribution (PSD) data. The model framework and predicted values were evaluated using two methods: splitting the dataset into training and validation sets, and a Bootstrap approach. The basis of Bootstrap cross-validation is a leave-one-out strategy: one data value is left out of the training set while a new SOM is created to estimate that missing value from the remaining data. As a new SOM is created up to 30 times for each value under scrutiny, this forms the basis for a stochastic framework from which residuals are used to evaluate error statistics and model bias. The proposed method is suitable for estimating soil geotechnical properties, revealing and quantifying relationships between geotechnical variables and particle size distribution that are not properly captured by linear multivariate statistical approaches.
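The leave-one-out bootstrap scaffolding described above can be illustrated compactly. In the sketch below a k-nearest-neighbours regressor stands in for the SOM (to keep the example dependency-light; the resampling loop, not the estimator, is the point), and the PSD fractions and cohesion target are simulated.

```python
# Leave-one-out bootstrap evaluation of a PSD -> geotechnical-property model.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(3)
n = 100                                              # ~100 sites, as in the study
psd = rng.dirichlet(np.ones(3), size=n)              # sand/silt/clay fractions (fake)
cohesion = 5 + 20 * psd[:, 2] + rng.normal(0, 1, n)  # synthetic target variable

residuals = []
for i in range(n):                                   # leave one site out at a time
    train = np.delete(np.arange(n), i)
    for _ in range(30):                              # up to 30 refits per held-out value
        boot = rng.choice(train, size=train.size, replace=True)
        model = KNeighborsRegressor(n_neighbors=5).fit(psd[boot], cohesion[boot])
        residuals.append(model.predict(psd[i:i + 1])[0] - cohesion[i])

residuals = np.asarray(residuals)
print(f"bias = {residuals.mean():.2f}, RMSE = {np.sqrt((residuals ** 2).mean()):.2f}")
```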
NATbox: a network analysis toolbox in R.
Chavan, Shweta S; Bauer, Michael A; Scutari, Marco; Nagarajan, Radhakrishnan
2009-10-08
There has been recent interest in capturing the functional relationships (FRs) from high-throughput assays using suitable computational techniques. FRs elucidate the working of genes in concert as a system, as opposed to independent entities, and hence may provide preliminary insights into biological pathways and signalling mechanisms. Bayesian structure learning (BSL) techniques and their extensions have been used successfully for modelling FRs from expression profiles. Such techniques are especially useful in discovering undocumented FRs, investigating non-canonical signalling mechanisms and cross-talk between pathways. The objective of the present study is to develop a graphical user interface (GUI), NATbox: Network Analysis Toolbox, in the language R that houses a battery of BSL algorithms in conjunction with suitable statistical tools for modelling FRs in the form of acyclic networks from gene expression profiles and their subsequent analysis. NATbox is a menu-driven open-source GUI implemented in the R statistical language for modelling and analysis of FRs from gene expression profiles. It provides options to (i) impute missing observations in the given data, (ii) model FRs and network structure from gene expression profiles using a battery of BSL algorithms and identify robust dependencies using a bootstrap procedure, (iii) present the FRs in the form of acyclic graphs for visualization and investigate their topological properties using network analysis metrics, (iv) retrieve FRs of interest from published literature and subsequently use them as structural priors in BSL, and (v) enhance scalability of BSL across high-dimensional data by parallelizing the bootstrap routines. NATbox provides a menu-driven GUI for modelling and analysis of FRs from gene expression profiles. By incorporating readily available functions from existing R-packages, it minimizes redundancy and improves reproducibility, transparency and sustainability, characteristics of open-source environments. NATbox is especially suited for interdisciplinary researchers and biologists with minimal programming experience who would like to use systems biology approaches without delving into the algorithmic aspects. The GUI provides appropriate parameter recommendations for the various menu options, including default parameter choices for the user. NATbox can also prove to be a useful demonstration and teaching tool in graduate and undergraduate courses in systems biology. It has been tested successfully under Windows and Linux operating systems. The source code along with installation instructions and an accompanying tutorial can be found at http://bioinformatics.ualr.edu/natboxWiki/index.php/Main_Page.
Concept Innateness, Concept Continuity, and Bootstrapping
Carey, Susan
2011-01-01
The commentators raised issues relevant to all three important theses of The Origin of Concepts (TOOC). Some questioned the very existence of innate representational primitives, and others questioned my claims about their richness and whether they should be thought of as concepts. Some questioned the existence of conceptual discontinuity in the course of knowledge acquisition and others argued that discontinuity is much more common than portrayed in TOOC. Some raised issues with my characterization of Quinian bootstrapping, and others questioned the dual factor theory of concepts motivated by my picture of conceptual development. PMID:23264705
Crossing symmetry in alpha space
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; van Rees, Balt C.
2017-11-01
We initiate the study of the conformal bootstrap using Sturm-Liouville theory, specializing to four-point functions in one-dimensional CFTs. We do so by decomposing conformal correlators using a basis of eigenfunctions of the Casimir which are labeled by a complex number α. This leads to a systematic method for computing conformal block decompositions. Analyzing bootstrap equations in alpha space turns crossing symmetry into an eigenvalue problem for an integral operator K. The operator K is closely related to the Wilson transform, and some of its eigenfunctions can be found in closed form.
Direct measurement of fast transients by using boot-strapped waveform averaging
NASA Astrophysics Data System (ADS)
Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung
2018-03-01
An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and the signal-to-noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime of Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with known values.
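The core statistical claim, that averaging N aligned repetitions of a signal suppresses noise roughly by 1/sqrt(N), is easy to verify numerically. The sketch below does so for a simulated 4 ns exponential decay; it does not model the paper's digital-cavity trigger condition, and the pulse shape, noise level, and fit window are assumptions.

```python
# Waveform averaging of a repetitive signal: noise drops ~1/sqrt(N).
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 20e-9, 200)                     # 20 ns acquisition window
pulse = np.exp(-t / 4e-9)                          # ideal 4 ns exponential decay
reps = pulse + rng.normal(0, 0.5, (1000, t.size))  # 1000 noisy, aligned repetitions

avg = reps.mean(axis=0)
print(f"noise std: single trace 0.5 -> averaged {np.std(avg - pulse):.3f} "
      f"(expected ~{0.5 / np.sqrt(1000):.3f})")

# Crude lifetime estimate from the averaged trace via a log-linear fit,
# restricted to early times where the signal dominates the residual noise.
m = t < 10e-9
tau = -1 / np.polyfit(t[m], np.log(avg[m]), 1)[0]
print(f"estimated lifetime: {tau * 1e9:.1f} ns")
```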
On the Counterfactual Nature of Gambling Near-misses: An Experimental Study.
Wu, Yin; van Dijk, Eric; Li, Hong; Aitken, Michael; Clark, Luke
2017-10-01
Research on gambling near-misses has shown that objectively equivalent outcomes can yield divergent emotional and motivational responses. The subjective processing of gambling outcomes is affected substantially by close but non-obtained outcomes (i.e. counterfactuals). In the current paper, we investigate how different types of near-misses influence self-perceived luck and subsequent betting behavior in a wheel-of-fortune task. We investigate the counterfactual mechanism of these effects by testing the relationship with a second task measuring regret/relief processing. Across two experiments (Experiment 1, n = 51; Experiment 2, n = 104), we demonstrate that near-wins (neutral outcomes that are close to a jackpot) decreased self-perceived luck, whereas near-losses (neutral outcomes that are close to a major penalty) increased luck ratings. The effects of near-misses varied by near-miss position (i.e. whether the spinner stopped just short of, or passed through, the counterfactual outcome), consistent with established distinctions between upward versus downward, and additive versus subtractive, counterfactual thinking. In Experiment 1, individuals who showed stronger counterfactual processing on the regret/relief task were more responsive to near-wins and near-losses on the wheel-of-fortune task. The effect of near-miss position was attenuated when the anticipatory phase (i.e. the spin and deceleration) was removed in Experiment 2. Further differences were observed within the objective gains and losses, between "clear" and "narrow" outcomes. Taken together, these results help substantiate the counterfactual mechanism of near-misses. © 2017 The Authors Journal of Behavioral Decision Making Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Stroeve, Julienne C.; Jenouvrier, Stephanie; Campbell, G. Garrett; Barbraud, Christophe; Delord, Karine
2016-08-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent as well as seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the distribution of MIZ, consolidated pack ice and coastal polynyas in the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent record for assessing the proportion of the sea ice cover that is covered by each of these ice categories. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, and applies the same thresholds to the sea ice concentrations to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the seasonal cycle in the MIZ and pack ice is generally similar between both algorithms, yet the NASA Team algorithm has on average twice the MIZ area and half the consolidated pack ice area of the Bootstrap algorithm. Trends also differ, with the Bootstrap algorithm suggesting statistically significant trends towards increased pack ice area and no statistically significant trends in the MIZ. The NASA Team algorithm on the other hand indicates statistically significant positive trends in the MIZ during spring. Potential coastal polynya area and amount of broken ice within the consolidated ice pack are also larger in the NASA Team algorithm. The timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
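Thresholding a concentration field into ice classes, as both algorithms' outputs are treated here, is a one-liner. The sketch below uses the conventional 15% and 80% ice-concentration cutoffs as an assumption (the paper's exact thresholds may differ) on a simulated grid.

```python
# Classify a sea-ice concentration grid into open water, MIZ and pack ice.
import numpy as np

rng = np.random.default_rng(5)
conc = np.clip(rng.normal(0.6, 0.3, (100, 100)), 0, 1)  # fake concentration field

open_water = conc < 0.15
miz = (conc >= 0.15) & (conc <= 0.80)                   # assumed MIZ band
pack = conc > 0.80                                      # consolidated pack ice

cell_area_km2 = 25 * 25          # e.g. a 25 km passive-microwave grid cell
for name, mask in [("MIZ", miz), ("pack", pack), ("open water", open_water)]:
    print(f"{name}: {mask.sum() * cell_area_km2:,.0f} km^2")
```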
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, J.; Tolson, B.
2017-12-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainty, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive in case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate its independence of the SA method, we applied the convergence testing to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluation is needed, which enables checking of already processed sensitivity results. This is one step towards reliable and transferable, published sensitivity results.
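For contrast with the bootstrapping baseline the abstract criticizes as expensive, the sketch below bootstraps the Morris elementary-effects measure mu* across trajectories on a toy three-parameter function. The model, trajectory count, and step size are illustrative; the MVA shortcut itself is not reproduced here.

```python
# Bootstrapping Morris elementary effects across trajectories (sketch).
import numpy as np

def model(x):                       # toy 3-parameter test function
    return x[0] + 2 * x[1] ** 2 + 0.1 * x[2]

rng = np.random.default_rng(6)
k, r, delta = 3, 50, 0.1
ee = np.empty((r, k))
for traj in range(r):               # r one-at-a-time trajectories
    base = rng.random(k)
    f0 = model(base)
    for i in range(k):
        stepped = base.copy()
        stepped[i] += delta
        ee[traj, i] = (model(stepped) - f0) / delta

mu_star = np.abs(ee).mean(axis=0)
boots = np.array([np.abs(ee[rng.integers(0, r, r)]).mean(axis=0)
                  for _ in range(1000)])
lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
for i in range(k):                  # narrow CIs indicate converged indexes
    print(f"x{i}: mu* = {mu_star[i]:.3f}, 95% CI [{lo[i]:.3f}, {hi[i]:.3f}]")
```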
Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas
2016-05-01
To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using a multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in Euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative (including surgical and anaesthetic time, overhead, disposable materials) and postoperative costs [including ward stay, high dependency unit (HDU) or intensive care unit (ICU) and variable costs associated with management of complications] were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total costs after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) <60% predicted value (P = 0.02, bootstrap 63%) and chronic obstructive pulmonary disease (COPD; P = 0.035, bootstrap 57%). The following model was developed to estimate the total cost: total cost (€) = 10 523 + 1894 × COPD + 2376 × (DLCO < 60%), where COPD and (DLCO < 60%) are 0/1 indicator variables. The comparison between predicted and observed costs was repeated in 1000 bootstrapped samples to verify the stability of the model. The two values were not different (P > 0.05) in 86% of the samples. A hypothetical patient with COPD and DLCO less than 60% would cost €4270 more than a patient without COPD and with higher DLCO values (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
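The fitted model quoted in the abstract is simple enough to state as a function. The sketch below encodes it directly; the coefficients come from the abstract, and the two predictors are 0/1 indicators.

```python
# The abstract's risk-adjusted cost model as a plain function.
def predicted_cost(copd: bool, dlco_below_60: bool) -> int:
    """Estimated total VATS lobectomy cost in euros (coefficients from the abstract)."""
    return 10_523 + 1_894 * copd + 2_376 * dlco_below_60

print(predicted_cost(False, False))   # 10523: neither risk factor
print(predicted_cost(True, True))     # 14793: both factors, i.e. 4270 euros more
```

The two printed values reproduce the abstract's €10 523 versus €14 793 comparison.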
Kahana, Shoshana Y.; Jenkins, Richard A.; Bruce, Douglas; Fernandez, Maria I.; Hightow-Weidman, Lisa B.; Bauermeister, Jose A.
2016-01-01
Background The authors examined associations between structural characteristics and HIV disease management among a geographically diverse sample of behaviorally and perinatally HIV-infected adolescents and young adults in the United States. Methods The sample included 1891 adolescents and young adults living with HIV (27.8% perinatally infected; 72.2% behaviorally infected) who were linked to care through 20 Adolescent Medicine Trials Network for HIV/AIDS Interventions Units. All completed audio computer–assisted self-interview surveys. Chart abstraction or blood draw provided viral load data. Geographic-level variables were extracted from the United States Census Bureau (e.g., socioeconomic disadvantage, percent of Black and Latino households, percent rural) and Esri Crime (e.g., global crime index) databases as Zip Code Tabulation Areas. AIDSVu data (e.g., prevalence of HIV among youth) were extracted at the county-level. Using HLM v.7, the authors conducted means-as-outcomes random effects multi-level models to examine the association between structural-level and individual-level factors and (1) being on antiretroviral therapy (ART) currently; (2) being on ART for at least 6 months; (3) missed HIV care appointments (not having missed any vs. having missed one or more appointments) over the past 12 months; and (4) viral suppression (defined by the corresponding assay cutoff for the lower limit of viral load at each participating site which denoted nondetectability vs. detectability). Results Frequencies for the 4 primary outcomes were as follows: current ART use (n = 1120, 59.23%); ART use for ≥6 months (n = 861, 45.53%); at least one missed HIV care appointment (n = 936, 49.50); and viral suppression (n = 577, 30.51%). After adjusting for individual-level factors, youth living in more disadvantaged areas (defined by a composite score derived from 2010 Census indicators including percent poverty, percent receiving public assistance, percent of female, single-headed households, percent unemployment, and percent of people with less than a high school degree) were less likely to report current ART use (OR: 0.85, 95% CI: 0.72–1.00, p = .05). Among current ART users, living in more disadvantaged areas was associated with greater likelihood of having used ART for ≥6 months. Participants living in counties with greater HIV prevalence among 13–24 year olds were more likely to report current ART use (OR: 1.32, 95% CI: 1.05–1.65, p = .02), ≥6 months ART use (OR: 1.32, 95% CI: 1.05–1.65, p = .02), and to be virally suppressed (OR: 1.50, 95% CI: 1.20–1.87, p = .001); however, youth in these areas were also more likely to report missed medical appointments (OR: 1.32, 95% CI: 1.07–1.63, p = .008). Conclusions The findings underscore the multi-level and structural factors associated with ART use, missed HIV care appointments, and viral suppression for adolescents and young adults in the United States. Consideration of these factors is strongly recommended in future intervention, clinical practice, and policy research that seek to understand the contextual influences on individuals’ health behaviors. PMID:27035905
Roche, Anne I; Kroska, Emily B; Miller, Michelle L; Kroska, Sydney K; O'Hara, Michael W
2018-03-22
Childhood trauma is associated with a variety of risky, unhealthy, or problem behaviors. The current study aimed to explore experiential avoidance and mindfulness processes as mechanisms through which childhood trauma and problem behavior are linked in a college sample. The sample consisted of college-aged young adults recruited November-December, 2016 (N = 414). Participants completed self-report measures of childhood trauma, current problem behavior, experiential avoidance, and mindfulness processes. Bootstrapped mediation analyses examined the mechanistic associations of interest. Mediation analyses indicated that experiential avoidance was a significant mediator of the association between childhood trauma and problem behavior. Additionally, multiple mediation analyses indicated that specific mindfulness facets (acting with awareness and nonjudgment of inner experience) significantly mediated the same association. Interventions for college students who have experienced childhood trauma might profitably target mechanisms such as avoidance and mindfulness in order to minimize engagement in problem behavior.
Anomalous dimensions of spinning operators from conformal symmetry
NASA Astrophysics Data System (ADS)
Gliozzi, Ferdinando
2018-01-01
We compute, to the first non-trivial order in the ɛ-expansion of a perturbed scalar field theory, the anomalous dimensions of an infinite class of primary operators with arbitrary spin ℓ = 0, 1, . . . , including as a particular case the weakly broken higher-spin currents, using only constraints from conformal symmetry. Following the bootstrap philosophy, no reference is made to any Lagrangian, equations of motion or coupling constants. Even the space dimensions d are left free. The interaction is implicitly turned on through the local operators by letting them acquire anomalous dimensions. When matching certain four-point and five-point functions with the corresponding quantities of the free field theory in the ɛ → 0 limit, no free parameter remains. It turns out that only the expected discrete d values are permitted and the ensuing anomalous dimensions reproduce known results for the weakly broken higher-spin currents and provide new results for the other spinning operators.
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
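A parametric bootstrap test of this kind can be sketched in a few lines. The version below combines trimmed means with crude precision weights and simulates the null from fitted normals; it is a simplified illustration in the spirit of the modified procedure, not the authors' exact statistic (which uses Winsorized variances).

```python
# Simplified parametric bootstrap test of equal group means with trimmed means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
groups = [rng.lognormal(0, s, n) for s, n in [(0.5, 20), (1.0, 25), (1.5, 30)]]

def test_stat(gs):
    tm = np.array([stats.trim_mean(g, 0.2) for g in gs])   # 20% trimmed means
    w = np.array([len(g) / g.var(ddof=1) for g in gs])     # crude precision weights
    grand = (w * tm).sum() / w.sum()
    return (w * (tm - grand) ** 2).sum()

obs = test_stat(groups)
null = []
for _ in range(2000):   # parametric bootstrap: normals with estimated variances, common mean
    sim = [rng.normal(0, g.std(ddof=1), len(g)) for g in groups]
    null.append(test_stat(sim))

p = np.mean(np.asarray(null) >= obs)
print(f"parametric bootstrap p-value: {p:.3f}")
```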
Chen, P; Yu, S; Zhu, G
2012-12-01
The aim of the current study was to investigate the psychosocial impact of dental aesthetics among patients who received anterior implant-supported prostheses. The current study is a cross-sectional evaluation involving 115 individuals who had gone through treatment at the dental clinics of general hospitals. Participants completed the Chinese version of the psychosocial impact of dental aesthetics questionnaire (PIDAQ) before implantation and six months after crown restoration. Basic demographic information was recorded. Six months after implant crown restoration, participants were asked to self-assess their own oral aesthetics compared to before implantation. A total of 106 patients completed the study. PIDAQ scores correlated significantly with the self-assessment of the degree of oral aesthetics. Six months after crown restoration, scores on two factors (social impact and aesthetic attitude) decreased and the dental self-confidence score increased significantly compared to pre-implantation scores. Gender and education level significantly affected PIDAQ. Anterior implant-supported prostheses significantly affected the patients' psychosocial perception. Implant treatment for missing anterior teeth can significantly reduce the negative psychosocial impact of dental aesthetics. Gender and education level are correlated with the degree of improvement. The PIDAQ can be used in assessing the psychosocial effects of implantation for missing anterior teeth.
Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E
2014-06-01
Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to develop the statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs.
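The central move, turning design weights into a posterior draw over the finite population, can be sketched with a weighted Bayesian bootstrap. The code below is a deliberate simplification of the authors' procedure: it ignores strata and clusters and treats the design weights as exchangeable Dirichlet parameters.

```python
# Weighted Bayesian bootstrap draw of a synthetic population (sketch).
import numpy as np

rng = np.random.default_rng(8)
y = rng.normal(size=500)                 # observed sample values
w = rng.uniform(1, 10, size=500)         # design weights (inverse selection prob.)
N = int(w.sum())                         # implied population size

# A Dirichlet draw with the weights as concentration parameters approximates
# the posterior predictive distribution over the finite population.
probs = rng.dirichlet(w)
synthetic_pop = rng.choice(y, size=N, replace=True, p=probs)

print(f"weighted sample mean: {np.average(y, weights=w):.3f}, "
      f"synthetic-population mean: {synthetic_pop.mean():.3f}")
```

The synthetic population can then be analyzed with ordinary IID methods, which is the point of the construction.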
Spring, Stefan; Scheuner, Carmen; Göker, Markus; Klenk, Hans-Peter
2015-01-01
In recent years a large number of isolates were obtained from saline environments that are phylogenetically related to distinct clades of oligotrophic marine gammaproteobacteria, which were originally identified in seawater samples using cultivation independent methods and are characterized by high seasonal abundances in coastal environments. To date a sound taxonomic framework for the classification of these ecologically important isolates and related species in accordance with their evolutionary relationships is missing. In this study we demonstrate that a reliable allocation of members of the oligotrophic marine gammaproteobacteria (OMG) group and related species to higher taxonomic ranks is possible by phylogenetic analyses not only of whole proteomes but also of the RNA polymerase beta subunit, whereas phylogenetic reconstructions based on 16S rRNA genes alone resulted in unstable tree topologies with only insignificant bootstrap support. The identified clades could be correlated with distinct phenotypic traits illustrating an adaptation to common environmental factors in their evolutionary history. Genome wide gene-content analyses revealed the existence of two distinct ecological guilds within the analyzed lineage of marine gammaproteobacteria which can be distinguished by their trophic strategies. Based on our results a novel order within the class Gammaproteobacteria is proposed, which is designated Cellvibrionales ord. nov. and comprises the five novel families Cellvibrionaceae fam. nov., Halieaceae fam. nov., Microbulbiferaceae fam. nov., Porticoccaceae fam. nov., and Spongiibacteraceae fam. nov. PMID:25914684
Frey, H Christopher; Zhao, Yuchao
2004-11-15
Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using MLE estimators for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges span from as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
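The censored-data MLE plus bootstrap workflow described above can be condensed as follows. The sketch fits a lognormal to data with a single detection limit and bootstraps the fitted mean; the detection limit, sample size, and use of one common limit are simplifying assumptions (the paper samples detection limits per nondetect).

```python
# Censored-data MLE for a lognormal emission factor, with a bootstrap CI.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(9)
true = rng.lognormal(0.0, 1.0, 80)              # latent emission factors
dl = 0.5                                        # single detection limit (assumed)
obs = np.where(true >= dl, true, dl)            # nondetects reported at the DL
cens = true < dl

def negloglik(theta, det, n_cens):
    mu, sig = theta[0], np.exp(theta[1])        # log-sigma keeps sig > 0
    ll = stats.norm.logpdf(np.log(det), mu, sig).sum()
    ll += n_cens * stats.norm.logcdf(np.log(dl), mu, sig)   # censored contribution
    return -ll

def fit_mean(det, n_cens):
    res = optimize.minimize(negloglik, [0.0, 0.0], args=(det, n_cens))
    mu, sig = res.x[0], np.exp(res.x[1])
    return np.exp(mu + sig ** 2 / 2)            # mean of the fitted lognormal

boots = []
for _ in range(300):                            # empirical bootstrap of the sample
    idx = rng.integers(0, obs.size, obs.size)
    o, c = obs[idx], cens[idx]
    boots.append(fit_mean(o[~c], int(c.sum())))

point = fit_mean(obs[~cens], int(cens.sum()))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"mean EF: {point:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```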
Bootstrapping non-commutative gauge theories from L∞ algebras
NASA Astrophysics Data System (ADS)
Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter
2018-05-01
Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, that governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on its appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to the second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is discussed, as well.
A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon
Drummond, J. M.; Papathanasiou, G.; Spradlin, M.
2015-03-16
Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4, 7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes, we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7 ∥ 6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n > 6.
NASA Technical Reports Server (NTRS)
Yoshikawa, H. H.; Madison, I. B.
1971-01-01
This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.
Probing light nonthermal dark matter at the LHC
NASA Astrophysics Data System (ADS)
Dutta, Bhaskar; Gao, Yu; Kamon, Teruki
2014-05-01
This paper investigates the collider phenomenology of a minimal nonthermal dark matter model with a 1-GeV dark matter candidate, which naturally explains baryogenesis. Since the light dark matter is not parity protected, it can be singly produced at the LHC. This leads to large missing energy associated with an energetic jet whose transverse momentum distribution features a Jacobian-like shape. The monojet, dijet, paired dijet, and two jets + missing energy channels are studied. Existing data from the Tevatron and LHC already place significant bounds on our model.
Optimal simultaneous superpositioning of multiple structures with missing data.
Theobald, Douglas L; Steindel, Phillip A
2012-08-01
Superpositioning is an essential technique in structural biology that facilitates the comparison and analysis of conformational differences among topologically similar structures. Performing a superposition requires a one-to-one correspondence, or alignment, of the point sets in the different structures. However, in practice, some points are usually 'missing' from several structures, for example, when the alignment contains gaps. Current superposition methods deal with missing data simply by superpositioning a subset of points that are shared among all the structures. This practice is inefficient, as it ignores important data, and it fails to satisfy the common least-squares criterion. In the extreme, disregarding missing positions prohibits the calculation of a superposition altogether. Here, we present a general solution for determining an optimal superposition when some of the data are missing. We use the expectation-maximization algorithm, a classic statistical technique for dealing with incomplete data, to find both maximum-likelihood solutions and the optimal least-squares solution as a special case. The methods presented here are implemented in THESEUS 2.0, a program for superpositioning macromolecular structures. ANSI C source code and selected compiled binaries for various computing platforms are freely available under the GNU open source license from http://www.theseus3d.org. dtheobald@brandeis.edu Supplementary data are available at Bioinformatics online.
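The EM idea is compact: fill in missing points from the current mean structure, re-superpose, and iterate. The sketch below implements that loop with an ordinary Kabsch rotation and unweighted least squares; THESEUS's maximum-likelihood covariance weighting is deliberately omitted, and the structures are simulated.

```python
# Stripped-down EM superpositioning with missing points (unweighted sketch).
import numpy as np

rng = np.random.default_rng(10)
n_struct, n_pts = 5, 30
mean_true = rng.normal(size=(n_pts, 3))
structs = [mean_true + rng.normal(0, 0.1, (n_pts, 3)) for _ in range(n_struct)]
masks = [rng.random(n_pts) > 0.2 for _ in range(n_struct)]   # ~20% points missing

def kabsch(P, Q):
    """Rotation R such that P @ R best matches Q; both point sets centered."""
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))           # guard against reflections
    return U @ np.diag([1.0, 1.0, d]) @ Vt

mean = structs[0].copy()                         # initial guess for the mean structure
for _ in range(20):                              # EM iterations
    filled = []
    for X, m in zip(structs, masks):
        Xf = np.where(m[:, None], X, mean)       # E-step: impute missing points
        Xc = Xf - Xf.mean(0)
        Mc = mean - mean.mean(0)
        filled.append(Xc @ kabsch(Xc, Mc))       # M-step: superpose onto the mean
    mean = np.mean(filled, axis=0)               # then update the mean structure

rmsd = np.sqrt(((filled[0] - mean) ** 2).sum(1).mean())
print(f"RMSD of structure 0 to the mean: {rmsd:.3f}")
```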
Colour and spatial cueing in low-prevalence visual search.
Russell, Nicholas C C; Kunar, Melina A
2012-01-01
In visual search, 30-40% of targets with a prevalence rate of 2% are missed, compared to 7% of targets with a prevalence rate of 50% (Wolfe, Horowitz, & Kenner, 2005). This "low-prevalence" (LP) effect is thought to occur because participants make motor errors, change their response criteria, and/or quit their search too soon. We investigate whether colour and spatial cues, known to improve visual search when the target has a high prevalence (HP), benefit search when the target is rare. Experiments 1 and 2 showed that although knowledge of the target's colour reduces miss errors overall, it does not eliminate the LP effect, as more targets were missed at LP than at HP. Furthermore, detection of a rare target is significantly impaired if it appears in an unexpected colour, more so than if the prevalence of the target is high (Experiment 2). Experiment 3 showed that, if a rare target is exogenously cued, target detection is improved but still impaired relative to high-prevalence conditions. Furthermore, if the cue is absent or invalid, the percentage of missed targets increases. Participants were given the option to correct motor errors in all three experiments, which reduced but did not eliminate the LP effect. The results suggest that although valid colour and spatial cues improve target detection, participants still miss more targets at LP than at HP. Furthermore, invalid cues at LP are very costly in terms of miss errors. We discuss our findings in relation to current theories and applications of LP search.
Ellison, Kevin S.; Hofmeister, Erik K.; Ribic, Christine A.; Sample, David W.
2014-01-01
Globally, Avipoxvirus species affect over 230 species of wild birds and can significantly impair survival. During banding of nine grassland songbird species (n = 346 individuals) in southwestern Wisconsin, USA, we noted species with a 2–6% prevalence of pox-like lesions (possible evidence of current infection) and 4–10% missing digits (potential evidence of past infection). These prevalences approach those recorded among island endemic birds (4–9% and 9–20% for the Galapagos and Hawaii, respectively) for which Avipoxvirus species have been implicated as contributing to dramatic population declines. Henslow's Sparrow Ammodramus henslowii (n = 165 individuals) had the highest prevalence of lesions (6.1%) and missing digits (9.7%). Among a subset of 26 Henslow's Sparrows from which blood samples were obtained, none had detectable antibody reactive to fowlpox virus antigen. However, four samples (18%) had antibody to canarypox virus antigen with test sample and negative control ratios (P/N values) ranging from 2.4 to 6.5 (median 4.3). Of four antibody-positive birds, two had lesions recorded (one was also missing a digit), one had digits missing, and one had no signs. Additionally, the birds with lesions or missing digits had higher P/N values than did the antibody-positive bird without missing digits or recorded lesions. This study represents an impetus for considering the impacts and dynamics of disease caused by Avipoxvirus among North American grassland bird species.
Li, Yumei; Xiang, Yang; Xu, Chao; Shen, Hui; Deng, Hongwen
2018-01-15
The development of next-generation sequencing technologies has facilitated the identification of rare variants. Family-based designs are commonly used to effectively control for population admixture and substructure, which are more prominent for rare variants. Case-parents studies, as typical strategies in family-based design, are widely used in rare variant-disease association analysis. Current methods in case-parents studies are based on complete case-parents data; however, parental genotypes may be missing in case-parents trios, and removing these data may lead to a loss in statistical power. The present study focuses on testing for rare variant-disease association in case-parents studies by allowing for missing parental genotypes. In this report, we extended the collapsing method for rare variant association analysis in case-parents studies to allow for missing parental genotypes, and investigated the performance of two methods that use the difference of genotypes between affected offspring and their corresponding "complements" in case-parent trios within the TDT framework. Using simulations, we showed that, compared with methods using only complete case-parents data, the proposed strategy allowing for missing parental genotypes, or even adding unrelated affected individuals, can greatly improve the statistical power and meanwhile is not affected by population stratification. We conclude that adding case-parents data with missing parental genotypes to the complete case-parents data set can greatly improve the power of our strategy for rare variant-disease association.
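The collapsing comparison at the heart of this design, affected offspring versus their genotype "complements", can be sketched on simulated trios. The code below is a bare-bones illustration using complete trios only; the paper's handling of missing parental genotypes is not reproduced, and the transmission model and allele frequencies are assumptions.

```python
# Collapsed rare-variant burden: affected offspring vs. untransmitted complements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_trios, n_vars = 200, 50
maf = rng.uniform(0.005, 0.02, n_vars)             # rare-variant allele frequencies

father = rng.binomial(2, maf, (n_trios, n_vars))   # parental allele counts (0/1/2)
mother = rng.binomial(2, maf, (n_trios, n_vars))
# One allele transmitted per parent (crude Mendelian transmission model).
child = rng.binomial(1, father / 2) + rng.binomial(1, mother / 2)
complement = father + mother - child               # the untransmitted alleles

burden_child = (child > 0).sum(axis=1)             # collapsed burden per trio
burden_comp = (complement > 0).sum(axis=1)
print(stats.wilcoxon(burden_child, burden_comp, zero_method="zsplit"))
```

Under this null simulation (no variant affects disease status) the paired test should be non-significant; an excess burden in affected offspring over complements is the association signal.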
Corticosteroid and Anesthetic Injections for Muscle Strains and Ligament Sprains in the NFL.
Drakos, Mark; Birmingham, Patrick; Delos, Demetris; Barnes, Ronnie; Murphy, Conor; Weiss, Leigh; Warren, Russell
2014-07-01
Administering local anesthetic or corticosteroid injections in professional athletes to allow return to play is common but has traditionally been viewed as suspect and taboo. The skepticism surrounding therapeutic injections stems predominantly from anecdotal experience as opposed to scientific data. The purpose of this paper is to evaluate the current use of corticosteroid injections for muscle strains and ligament sprains in the National Football League, to document players' ability to return to play, and to identify possible adverse effects. Athletes from a single National Football League team who received at least one corticosteroid or anesthetic injection for either a muscle strain or ligament sprain during three consecutive seasons were retrospectively reviewed. Thirty-seven injections were given over the three seasons. Injections were either performed blindly or by using ultrasound guidance. Twice as many defensive players were injected as offensive players. The average number of days of conservative treatment before injection was 6.5 days. All players returned to play after injection. There were no complications from any of the injections. Seventeen (55%) players did not miss a single game, and nine (30%) did not miss a single day. Quadriceps strains were associated with the most missed games (four) and the most missed days (36.5). Proximal hamstring strains were second with an average of three missed games and 28 missed days. Corticosteroid injections are a safe and effective therapeutic intervention for treating muscle strains and ligament sprains in order to enable athletes to return to competition earlier.
Dmitriev, Egor V; Khomenko, Georges; Chami, Malik; Sokolov, Anton A; Churilova, Tatyana Y; Korotaev, Gennady K
2009-03-01
The absorption of sunlight by oceanic constituents significantly contributes to the spectral distribution of the water-leaving radiance. Here it is shown that current parameterizations of absorption coefficients do not apply to the optically complex waters of the Crimea Peninsula. Based on in situ measurements, parameterizations of phytoplankton, nonalgal, and total particulate absorption coefficients are proposed. Their performance is evaluated using a log-log regression combined with a low-pass filter and the nonlinear least-squares method. Statistical significance of the estimated parameters is verified using the bootstrap method. The parameterizations are relevant for chlorophyll a concentrations ranging from 0.45 to 2 mg/m³.
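The bootstrap verification of parameter significance can be illustrated with a small sketch: a power-law parameterization of the form a_ph = A·chl^B (an illustrative assumption, not the paper's exact functional form) is fit in log-log space, and case resampling yields empirical confidence intervals for A and B.

import numpy as np

rng = np.random.default_rng(0)

def fit_powerlaw(chl, a_ph):
    """Fit a_ph = A * chl**B by least squares in log-log space."""
    B, logA = np.polyfit(np.log(chl), np.log(a_ph), 1)
    return np.exp(logA), B

def bootstrap_ci(chl, a_ph, n_boot=2000, alpha=0.05):
    """Case-resampling bootstrap confidence intervals for (A, B).
    chl, a_ph: paired 1D numpy arrays of in situ measurements."""
    n = len(chl)
    params = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample measurements
        params.append(fit_powerlaw(chl[idx], a_ph[idx]))
    lo, hi = np.percentile(params, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return lo, hi                            # an interval for B excluding 0 indicates significance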
Use of high order, periodic orbits in the PIES code
NASA Astrophysics Data System (ADS)
Monticello, Donald; Reiman, Allan
2010-11-01
We have implemented a version of the PIES code (Princeton Iterative Equilibrium Solver) [A. Reiman et al., 2007 Nucl. Fusion 47 572] that uses high order periodic orbits to select the surfaces on which straight magnetic field line coordinates will be calculated. The use of high order periodic orbits has increased the robustness and speed of the PIES code. We now have more uniform treatment of in-phase and out-of-phase islands. This new version has better convergence properties and works well with a full Newton scheme. We now have the ability to shrink islands using a bootstrap-like current, and this includes the m=1 island in tokamaks.
Exploration of high harmonic fast wave heating on the National Spherical Torus Experiment
NASA Astrophysics Data System (ADS)
Wilson, J. R.; Bell, R. E.; Bernabei, S.; Bitter, M.; Bonoli, P.; Gates, D.; Hosea, J.; LeBlanc, B.; Mau, T. K.; Medley, S.; Menard, J.; Mueller, D.; Ono, M.; Phillips, C. K.; Pinsker, R. I.; Raman, R.; Rosenberg, A.; Ryan, P.; Sabbagh, S.; Stutman, D.; Swain, D.; Takase, Y.; Wilgen, J.
2003-05-01
High harmonic fast wave (HHFW) heating has been proposed as a particularly attractive means of plasma heating and current drive in the high beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [M. Ono, S. M. Kaye, S. Neumeyer et al., in Proceedings of the 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. An rf heating system has been installed on NSTX to explore the physics of HHFW heating and current drive via rf waves, and to serve as a tool for demonstrating the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition, HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode discharges with a large fraction of bootstrap current, and to control the plasma current profile during the early stages of the discharge.
Improvement of Current Drive Efficiency in Projected FNSF Discharges
NASA Astrophysics Data System (ADS)
Prater, R.; Chan, V.; Garofalo, A.
2012-10-01
The Fusion Nuclear Science Facility - Advanced Tokamak (FNSF-AT) is envisioned as a facility that uses the tokamak approach to address the development of the AT path to fusion and fusion's energy objectives. It uses copper coils for a compact device with high βN and moderate power gain. The major radius is 2.7 m and the central toroidal field is 5.44 T. Achieving the required confinement and stability at βN ~ 3.7 requires a current profile with negative central shear and qmin > 1. Off-axis Electron Cyclotron Current Drive (ECCD), in addition to a high bootstrap current fraction, can help support this current profile. Using the applied EC frequency and launch location as free parameters, a systematic study has been carried out to optimize the ECCD in the range ρ = 0.5-0.7. Using a top launch, making use of a large toroidal component to the launch direction, adjusting the vertical launch angle so that the rays propagate nearly parallel to the resonance, and adjusting the frequency for optimum total current give a high dimensionless efficiency of 0.44 for a broad ECCD profile peaked at ρ = 0.7, and the driven current is 17 kA/MW for n20 = 2.1 and Te = 10.3 keV locally.
Tarkan, Sureyya; Plaisant, Catherine; Shneiderman, Ben; Hettinger, A. Zachary
2011-01-01
Researchers have conducted numerous case studies reporting the details on how laboratory test results of patients were missed by the ordering medical providers. Given the importance of timely test results in an outpatient setting, there is limited discussion of electronic versions of test result management tools to help clinicians and medical staff with this complex process. This paper presents three ideas to reduce missed results with a system that facilitates tracking laboratory tests from order to completion as well as during follow-up: (1) define a workflow management model that clarifies responsible agents and associated time frame, (2) generate a user interface for tracking that could eventually be integrated into current electronic health record (EHR) systems, (3) help identify common problems in past orders through retrospective analyses. PMID:22195201
Daley, Alison Moriarty
2012-01-01
This article examines school-based health centers (SBHCs) as complex adaptive systems, the current gaps that exist in contraceptive access, and the potential to maximize this community resource in teen pregnancy and sexually transmitted infection (STI) prevention efforts. Adolescent pregnancy is a major public health challenge for the United States. Existing community resources need to be considered for their potential to impact teen pregnancy and STI prevention efforts. SBHCs are one such community resource to be leveraged in these efforts. They offer adolescent-friendly primary care services and are responsive to the diverse needs of the adolescents utilizing them. However, current restrictions on contraceptive availability limit the ability of SBHCs to maximize opportunities for comprehensive reproductive care and create missed opportunities for pregnancy and STI prevention. A clinical case explores the current models of health care services related to contraceptive care provided in SBHCs and the ability to meet or miss the needs of an adolescent seeking reproductive care in a SBHC.
Estimation of Missing Water-Level Data for the Everglades Depth Estimation Network (EDEN)
Conrads, Paul; Petkewich, Matthew D.
2009-01-01
The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level gaging stations, ground-elevation models, and water-surface elevation models designed to provide scientists, engineers, and water-resource managers with current (2000-2009) water-depth information for the entire freshwater portion of the greater Everglades. The U.S. Geological Survey Greater Everglades Priority Ecosystems Science provides support for EDEN and its goal of providing quality-assured monitoring data for the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan. To increase the accuracy of the daily water-surface elevation model, water-level estimation equations were developed to fill missing data. To minimize cases in which no estimate can be made because data for an input station are also missing, a minimum of three linear regression equations were developed for each station, each using different input stations. Of the 726 water-level estimation equations developed to fill missing data at 239 stations, more than 60 percent have coefficients of determination greater than 0.90, and 92 percent have a coefficient of determination greater than 0.70.
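The fallback logic described above can be sketched as follows, assuming the regression coefficients have already been estimated for one target station and the equations are ordered best-first; the simple intercept-slope form and all names are illustrative.

import numpy as np

def fill_missing(target, inputs, equations):
    """Fill gaps in `target` (1D array, NaN = missing) using a
    prioritized list of pre-fitted linear equations.
    inputs: dict name -> array of input-station water levels
    equations: list of (input_name, intercept, slope), best model first."""
    out = target.copy()
    for t in np.flatnonzero(np.isnan(target)):
        for name, b0, b1 in equations:
            x = inputs[name][t]
            if not np.isnan(x):              # first available input wins
                out[t] = b0 + b1 * x
                break
    return out

With at least three equations per station, a gap remains unfilled only when every input station is missing at the same time.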
Consent process for US-based family reference DNA samples.
Katsanis, Sara H; Snyder, Lindsey; Arnholt, Kelly; Mundorff, Amy Z
2018-01-01
DNA collection from family members of the missing is a tenet of missing persons' and mass fatality investigations. Procedures for consenting family members are disparate, depending on the context supporting the reason for sample collection. While guidelines and best practices have been developed for handling mass fatalities and for identification of the missing, these guidelines do not address standard consent practices for living family members of potential victims. We examined the relevant U.S. laws, international guidelines, and best practices, sampled consent forms currently used for DNA collection from family members, and drafted model language for a consent form to communicate the required and recommended information. We modeled the consent form on biobank consenting practices and tested the consent language among students and the general population for constructive feedback and readability. We also asked respondents to consider the options for DNA collection and either hypothetically agree or disagree. The model language presented here highlights information important to relay in consent processes and can serve as a foundation for future consent practices in mass fatality and missing persons' investigations. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hasegawa, Chika; Nakayama, Yu
2018-03-01
In this paper, we solve for the two-point function of the lowest dimensional scalar operator in the critical ϕ4 theory on 4 - ε dimensional real projective space using three different methods. The first uses conventional perturbation theory; the second imposes the cross-cap bootstrap equation; the third solves the Schwinger-Dyson equation under the assumption of conformal invariance. We find that the three methods lead to mutually consistent results, but each has its own advantage.
On critical exponents without Feynman diagrams
NASA Astrophysics Data System (ADS)
Sen, Kallol; Sinha, Aninda
2016-11-01
In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend Polyakov's pioneering 1974 work, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4 - ε dimensions up to O(ε²). AS dedicates this work to the loving memory of his mother.
Iliesiu, Luca; Kos, Filip; Poland, David; ...
2016-03-17
We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. Finally, we also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
New Methods for Estimating Seasonal Potential Climate Predictability
NASA Astrophysics Data System (ADS)
Feng, Xia
This study develops two new statistical approaches to assess the seasonal potential predictability of observed climate variables. One is the univariate analysis of covariance (ANOCOVA) model, a combination of an autoregressive (AR) model and analysis of variance (ANOVA). It has the advantage of accounting for the uncertainty of the estimated parameters due to sampling errors in statistical tests, which is often neglected in AR-based methods, and of accounting for daily autocorrelation, which is not considered in traditional ANOVA. In the ANOCOVA model, the seasonal signals arising from external forcing are tested for equality across years to assess whether any existing interannual variability is potentially predictable. The bootstrap is an attractive alternative that requires no model assumptions and is applicable no matter how mathematically complicated the parameter estimator is. This method builds up the empirical distribution of the interannual variance from resamplings drawn with replacement from the given sample, in which the only predictability in seasonal means arises from weather noise. These two methods are applied to temperature and to water cycle components, including precipitation and evaporation, to measure the extent to which the interannual variance of seasonal means exceeds the unpredictable weather noise, and are compared with previous methods, including Leith-Shukla-Gutzler (LSG), Madden, and Katz. The potential predictability of temperature from the ANOCOVA model, bootstrap, LSG, and Madden exhibits a pronounced tropical-extratropical contrast, with much larger predictability in the tropics, dominated by El Niño/Southern Oscillation (ENSO), than in higher latitudes, where strong internal variability lowers predictability. The bootstrap tends to display the highest predictability of the four methods, ANOCOVA lies in the middle, while LSG and Madden generate lower predictability. Seasonal precipitation predictability from ANOCOVA, bootstrap, and Katz, resembling that for temperature, is larger over the tropical regions and smaller in the extratropics. Bootstrap and ANOCOVA are in good agreement with each other, both generating larger predictability than Katz. The seasonal predictability of evaporation over land bears considerable similarity to that of temperature using ANOCOVA, bootstrap, LSG, and Madden. Remote SST forcing and soil moisture reveal substantial seasonality in their relations with the potentially predictable seasonal signals. For selected regions, SST, soil moisture, or both show significant relationships with the predictable signals, hence providing indirect insight into the slowly varying boundary processes that enable useful seasonal climate prediction. A multivariate analysis of covariance (MANOCOVA) model is established to identify distinctive predictable patterns that are uncorrelated with each other. Generally speaking, the seasonal predictability from the multivariate model is consistent with that from ANOCOVA. Besides unveiling the spatial variability of predictability, the MANOCOVA model also reveals the temporal variability of each predictable pattern, which could be linked to periodic oscillations.
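A minimal sketch of the bootstrap approach described above, under illustrative assumptions: within-season daily anomalies are pooled and resampled with replacement to synthesize years containing weather noise only, and the observed interannual variance of seasonal means is compared against that null distribution. As written, the pooling destroys daily autocorrelation; a moving-block bootstrap would preserve it.

import numpy as np

rng = np.random.default_rng(1)

def potential_predictability(x, n_boot=5000):
    """x: (n_years, n_days) array of within-season daily anomalies.
    Returns the observed interannual variance of seasonal means and
    a p-value against the weather-noise-only null."""
    n_years, n_days = x.shape
    v_obs = x.mean(axis=1).var(ddof=1)
    pool = x.ravel()                          # pooled 'weather noise' sample
    v_null = np.empty(n_boot)
    for b in range(n_boot):
        fake = rng.choice(pool, size=(n_years, n_days), replace=True)
        v_null[b] = fake.mean(axis=1).var(ddof=1)
    return v_obs, (v_null >= v_obs).mean()    # small p => potentially predictable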
Missed surgical intensive care unit billing: potential financial impact of 24/7 faculty presence.
Hendershot, Kimberly M; Bollins, John P; Armen, Scott B; Thomas, Yalaunda M; Steinberg, Steven M; Cook, Charles H
2009-07-01
To efficiently capture evaluation and management (E&M) and procedural billing in our surgical intensive care unit (SICU), we have developed an electronic billing system that links to the electronic medical record (EMR). In this system, only notes electronically signed and coded by an attending generate billing charges. We hypothesized that capture of missed billing during nighttime and weekends might be sufficient to subsidize 24/7 in-house attending coverage. A retrospective review of the EMRs for all SICU patients during a 2-month period was performed. Note type, date, time, attending signature, and coding were analyzed. Notes without attending signature, diagnosis, or current procedural terminology (CPT) code were considered incomplete and identified as "missed billing." Four hundred forty-three patients had 465 admissions, generating 2,896 notes. Overall, 76% of notes were signed and coded by an attending and billed. Incomplete (not billed) notes represented an overall missed billing opportunity of $159,138 for the 2-month period (approximately $954,000 annually). Unbilled E&M encounters during weekdays totaled $54,758, whereas unbilled E&M and procedures from weeknights and weekends totaled $88,408 ($44,566 and $43,842, respectively). Missed billing after hours thus represents approximately $530K annually, extrapolating to approximately $220K in collections from our payer mix. Surprisingly, missed E&M and procedural billing during weekdays totaled $70,730 (approximately $425K billing, approximately $170K collections annually), and typically represented patients seen, but transferred from the SICU before attending documentation was completed. Capture of nighttime and weekend ICU collections alone may be insufficient to add faculty or incentivize in-house coverage, but could certainly complement other in-house derived revenues to such ends. In addition, missed daytime billing in busy modern ICUs can be substantial, and use of an EMR to identify missed billing opportunities can help create solutions to recover these revenues.
NASA Astrophysics Data System (ADS)
Gebhardt, Katharina; Knebelsberger, Thomas
2015-09-01
We morphologically analyzed 79 cephalopod specimens from the North and Baltic Seas belonging to 13 separate species. Another 29 specimens showed morphological features of either Alloteuthis media or Alloteuthis subulata, or were found to be in between. Reliable identification features to distinguish between A. media and A. subulata are currently not available. The analysis of the DNA barcoding region of the COI gene revealed intraspecific distances (uncorrected p) ranging from 0 to 2.13% (average 0.1%) and interspecific distances between 3.31 and 22% (average 15.52%). All species formed monophyletic clusters in a neighbor-joining analysis and were supported by bootstrap values of ≥99%. All COI haplotypes belonging to the 29 Alloteuthis specimens were grouped in one cluster. Neither COI nor 18S rDNA sequences helped to distinguish between the different Alloteuthis morphotypes. For species identification purposes, we recommend the use of COI, as it showed higher bootstrap support of species clusters and less amplification and sequencing failure compared to 18S. Our data strongly support the assumption that the genus Alloteuthis is represented by only a single species, at least in the North Sea. It remained unclear whether this species is A. subulata or A. media. All COI sequences, including important metadata, were uploaded to the Barcode of Life Data Systems and can be used as a reference library for the molecular identification of more than 50% of the cephalopod fauna known from the North and Baltic Seas.
Benedetto, Umberto; Raja, Shahzad G
2014-11-01
The effectiveness of the routine retrosternal placement of a gentamicin-impregnated collagen sponge (GICS) implant before sternotomy closure is currently a matter of some controversy. We aimed to develop a scoring system to guide decision making on the use of GICS to prevent deep sternal wound infection. Fast backward elimination on predictors, including GICS, was performed using the Lawless and Singhal method. The scoring system was reported as a partial nomogram that can be used to manually obtain the predicted individual risk of deep sternal wound infection from the regression model. Bootstrapping validation of the regression models was performed. The final population consisted of 8750 adult patients undergoing cardiac surgery through full sternotomy during the study period. A total of 329 patients (3.8%) received a GICS implant. The overall incidence of deep sternal wound infection was lower among patients who received a GICS implant (0.6%) than among patients who did not (2.01%) (P=.02). A nomogram to predict the individual risk of deep sternal wound infection was developed that included the use of GICS. Bootstrapping validation confirmed a good discriminative power of the models. The scoring system provides an impartial assessment of the decision-making process for clinicians to establish whether a GICS implant is effective in reducing the risk of deep sternal wound infection in individual patients undergoing cardiac surgery through full sternotomy. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
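Bootstrapping validation of such a risk model is commonly done with an optimism correction: refit the model in each bootstrap sample and subtract the average gap between its apparent performance and its performance on the original data. The sketch below uses a logistic model and the ROC AUC as the discrimination index; the Lawless-Singhal elimination step of the paper is not reproduced.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

def optimism_corrected_auc(X, y, n_boot=200):
    """Harrell-style bootstrap validation of discrimination."""
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
    optimism = []
    n = len(y)
    while len(optimism) < n_boot:
        idx = rng.integers(0, n, n)
        if len(np.unique(y[idx])) < 2:        # resample lost one class; redraw
            continue
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)
    return apparent - np.mean(optimism)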
NASA Astrophysics Data System (ADS)
Qian, J. P.; Garofalo, A. M.; Gong, X. Z.; Ren, Q. L.; Ding, S. Y.; Solomon, W. M.; Xu, G. S.; Grierson, B. A.; Guo, W. F.; Holcomb, C. T.; McClenaghan, J.; McKee, G. R.; Pan, C. K.; Huang, J.; Staebler, G. M.; Wan, B. N.
2017-05-01
Recent EAST/DIII-D joint experiments on the high poloidal beta (βP) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H98y2 ~ 1.6) to higher plasma current, for lower q95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results (Garofalo et al, IAEA 2014, Gong et al 2014 IAEA Int. Conf. on Fusion Energy). Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high βP discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at βN ~ 2.9 and q95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Results reported in this paper suggest that the DIII-D high βP scenario could be a candidate for ITER steady state operation.
NASA Astrophysics Data System (ADS)
Stroeve, Julienne; Jenouvrier, Stephanie
2016-04-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role in phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent and seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the contributions of different ice types to the total Antarctic sea ice cover may also help to shed light on the factors contributing to the recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent record for assessing different ice types. However, estimates of the amount of MIZ, consolidated pack ice, and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, to evaluate the distribution and variability of the MIZ, the consolidated pack ice, and coastal polynyas. Results reveal that the NASA Team algorithm has on average twice the MIZ and half the consolidated pack ice area of the Bootstrap algorithm. Polynya area is also larger in the NASA Team algorithm, and the timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Kim, Kwang-Youl; Kwon, Kyung-Hoon; Yoo, Jong Shin; Omenn, Gilbert S; Baker, Mark S; Hancock, William S; Paik, Young-Ki
2015-12-04
The approximately 2.9 billion base pairs of the human reference genome are known to encode some 20 000 representative proteins. However, 3000 proteins, that is, ~15% of all proteins, have no or only very weak proteomic evidence and are still missing. Missing proteins may be present in rare samples in very low abundance or be only temporarily expressed, causing problems in their detection and protein profiling. In particular, some technical limitations cause missing proteins to remain unassigned. For example, current mass spectrometry techniques have high detection limits and error rates for complex biological samples. Insufficient proteome coverage in reference sequence databases and spectral libraries also raises major issues. Thus, the development of a better strategy that yields greater sensitivity and accuracy in the search for missing proteins is necessary. To this end, we used a new strategy, which combines a reference spectral library search and a simulated spectral library search, to identify missing proteins. We built the human iRefSPL, which contains the original human reference spectral library and additional peptide sequence-spectrum match entries from other species. We also constructed the human simSPL, which contains the simulated spectra of 173 907 human tryptic peptides determined by MassAnalyzer (version 2.3.1). To prove the enhanced analytical performance of the combined human iRefSPL and simSPL methods for the identification of missing proteins, we reanalyzed the placental tissue data set (PXD000754). The data from each experiment were analyzed using PeptideProphet, and the results were combined using iProphet. For quality control, we applied the class-specific false-discovery rate filtering method. All of the results were filtered at a false-discovery rate of <1% at the peptide and protein levels. The quality-controlled results were then cross-checked with the neXtProt DB (2014-09-19 release). The two spectral libraries, iRefSPL and simSPL, were designed so that their proteome coverage does not overlap. They proved complementary in spectral library searching and significantly increased the number of matches. From this trial, 12 new missing proteins were identified that passed the following criterion: at least two peptides of 7 or more amino acids in length, or one of 9 or more amino acids in length, with one or more unique sequences. Thus, the iRefSPL and simSPL combination can be used to help identify peptides that are not detected by conventional sequence database searches, with improved sensitivity and a low error rate.
Liu, Juan; Qi, Zhe-Chen; Zhao, Yun-Peng; Fu, Cheng-Xin; Jenny Xiang, Qiu-Yun
2012-09-01
The complete nucleotide sequence of the chloroplast genome (cpDNA) of Smilax china L. (Smilacaceae) is reported. It is the first complete cp genome sequence in Liliales. Genomic analyses were conducted to examine the rate and pattern of cpDNA genome evolution in Smilax relative to other major lineages of monocots. The cpDNA genomic sequences were combined with those available for Lilium to evaluate the phylogenetic position of Liliales and to investigate the influence of taxon sampling, gene sampling, gene function, natural selection, and substitution rate on phylogenetic inference in monocots. Phylogenetic analyses using sequence data of gene groups partitioned according to gene function, selection force, and total substitution rate demonstrated evident impacts of these factors on phylogenetic inference of monocots and the placement of Liliales, suggesting potential evolutionary convergence or adaptation of some cpDNA genes in monocots. Our study also demonstrated that reduced taxon sampling lowered the bootstrap support for the placement of Liliales in the cpDNA phylogenomic analysis. Analyses of sequences of 77 protein genes with some missing data, and of 81 genes (all protein genes plus the rRNA genes), support a sister relationship of Liliales to the commelinids-Asparagales clade, consistent with the APG III system. Analyses of 63 cpDNA protein genes for 32 taxa with few missing data, however, support a sister relationship of Liliales (represented by Smilax and Lilium) to Dioscoreales-Pandanales. Topology tests indicated that these two placements do not differ significantly under any of the three cpDNA genomic sequence data sets. Furthermore, we found no saturation effect in the data, suggesting that the cpDNA genomic sequence data used in the study are appropriate for monocot phylogenetic study and that long-branch attraction is unlikely to explain the two well-supported but conflicting placements of Liliales. Further analyses using sufficient nuclear data remain necessary to evaluate these two phylogenetic hypotheses regarding the position of Liliales and to address the causes of signal conflict among genes and partitions. Copyright © 2012 Elsevier Inc. All rights reserved.
Petrie, Joshua G.; Cheng, Caroline; Malosh, Ryan E.; VanWormer, Jeffrey J.; Flannery, Brendan; Zimmerman, Richard K.; Gaglani, Manjusha; Jackson, Michael L.; King, Jennifer P.; Nowalk, Mary Patricia; Benoit, Joyce; Robertson, Anne; Thaker, Swathi N.; Monto, Arnold S.; Ohmit, Suzanne E.
2016-01-01
Background. Influenza causes significant morbidity and mortality, with considerable economic costs, including lost work productivity. Influenza vaccines may reduce the economic burden through primary prevention of influenza and reduction in illness severity. Methods. We examined illness severity and work productivity loss among working adults with medically attended acute respiratory illnesses and compared outcomes for subjects with and without laboratory-confirmed influenza and by influenza vaccination status among subjects with influenza during the 2012–2013 influenza season. Results. Illnesses laboratory-confirmed as influenza (ie, cases) were subjectively assessed as more severe than illnesses not caused by influenza (ie, noncases) based on multiple measures, including current health status at study enrollment (≤7 days from illness onset) and current activity and sleep quality status relative to usual. Influenza cases reported missing 45% more work hours (20.5 vs 15.0; P < .001) than noncases and subjectively assessed their work productivity as impeded to a greater degree (6.0 vs 5.4; P < .001). Current health status and current activity relative to usual were subjectively assessed as modestly but significantly better for vaccinated cases compared with unvaccinated cases; however, no significant modifications of sleep quality, missed work hours, or work productivity loss were noted for vaccinated subjects. Conclusions. Influenza illnesses were more severe and resulted in more missed work hours and productivity loss than illnesses not confirmed as influenza. Modest reductions in illness severity for vaccinated cases were observed. These findings highlight the burden of influenza illnesses and illustrate the importance of laboratory confirmation of influenza outcomes in evaluations of vaccine effectiveness. PMID:26565004
Patient satisfaction after pulmonary resection for lung cancer: a multicenter comparative analysis.
Pompili, Cecilia; Brunelli, Alessandro; Rocco, Gaetano; Salvi, Rosario; Xiumé, Francesco; La Rocca, Antonello; Sabbatini, Armando; Martucci, Nicola
2013-01-01
Patient satisfaction reflects the perception of the customer about the level of quality of care received during the episode of hospitalization. The objective was to compare the levels of satisfaction of patients undergoing lung resection in two different thoracic surgical units. Prospective analysis of 280 consecutive patients who underwent pulmonary resection for neoplastic disease in two centers (center A: 139 patients; center B: 141 patients; 2009-2010). Patient satisfaction was assessed at discharge through the EORTC-InPatSat32 module, a 32-item, multi-scale, self-administered anonymous questionnaire. Each scale (ranging from 0 to 100 in score) was compared between the two units. Multivariable regression and the bootstrap were used to identify factors associated with the patients' general satisfaction (dependent variable). Patients from unit B reported a higher general satisfaction (91.5 vs. 88.3, p = 0.04), mainly due to significantly higher satisfaction on the doctor-related scales (doctors' technical skill: p = 0.001; doctors' interpersonal skill: p = 0.008; doctors' availability: p = 0.005; and doctors' information provision: p = 0.0006). Multivariable regression analysis and the bootstrap confirmed that level of care in unit B (p = 0.006, bootstrap frequency 60%), along with a lower level of education of the patient population (p = 0.02, bootstrap frequency 62%), were independent factors associated with a higher general patient satisfaction. We were able to show a different level of patient satisfaction in patients operated on in two different thoracic surgery units. A reduced level of patient satisfaction may trigger changes in the management policy of individual units in order to meet patients' expectations and improve organizational efficiency. Copyright © 2012 S. Karger AG, Basel.
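The "bootstrap frequency" reported above is the proportion of bootstrap resamples in which a predictor survives the selection procedure. A generic sketch with backward elimination on a linear model follows; the selection rule, threshold, and names are illustrative rather than the study's exact procedure.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

def inclusion_frequencies(X, y, names, n_boot=500, alpha=0.05):
    """Refit backward elimination on bootstrap resamples and record
    how often each candidate predictor is retained."""
    n, p = X.shape
    counts = dict.fromkeys(names, 0)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        keep = list(range(p))
        while keep:
            fit = sm.OLS(y[idx], sm.add_constant(X[idx][:, keep])).fit()
            pvals = np.asarray(fit.pvalues)[1:]   # skip the intercept
            worst = int(np.argmax(pvals))
            if pvals[worst] <= alpha:             # all survivors significant
                break
            keep.pop(worst)
        for j in keep:
            counts[names[j]] += 1
    return {k: v / n_boot for k, v in counts.items()}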
Wilcox, Thomas P; Zwickl, Derrick J; Heath, Tracy A; Hillis, David M
2002-11-01
Four New World genera of dwarf boas (Exiliboa, Trachyboa, Tropidophis, and Ungaliophis) have been placed by many systematists in a single group (traditionally called Tropidophiidae). However, the monophyly of this group has been questioned in several studies. Moreover, the overall relationships among basal snake lineages, including the placement of the dwarf boas, are poorly understood. We obtained mtDNA sequence data for 12S, 16S, and intervening tRNA-val genes from 23 species of snakes representing most major snake lineages, including all four genera of New World dwarf boas. We then examined the phylogenetic position of these species by estimating the phylogeny of the basal snakes. Our phylogenetic analysis suggests that New World dwarf boas are not monophyletic. Instead, we find Exiliboa and Ungaliophis to be most closely related to sand boas (Erycinae), boas (Boinae), and advanced snakes (Caenophidia), whereas Tropidophis and Trachyboa form an independent clade that separated relatively early in the snake radiation. Our estimate of snake phylogeny also differs in other ways from some previous estimates. For instance, pythons do not cluster with boas and sand boas, but instead show a strong relationship with Loxocemus and Xenopeltis. Additionally, uropeltids cluster strongly with Cylindrophis, and together are embedded in what has previously been considered the macrostomatan radiation. These relationships are supported by both bootstrapping (parametric and nonparametric approaches) and Bayesian analysis, although Bayesian support values are consistently higher than those obtained from nonparametric bootstrapping. Simulations show that Bayesian support values represent much better estimates of phylogenetic accuracy than do nonparametric bootstrap support values, at least under the conditions of our study. Copyright 2002 Elsevier Science (USA)
Explanation of Two Anomalous Results in Statistical Mediation Analysis.
Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P
2012-01-01
Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M, a, increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y, b, was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a. Implications of these findings are discussed.
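For reference, a sketch of the bias-corrected (BC) bootstrap test discussed above, for a simple mediation model; the accelerated variant adds a jackknife-based acceleration constant, omitted here. The bias-correction term z0 is the ingredient the authors implicate in the elevated Type I error rates.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def indirect_effect(x, m, y):
    """a*b from the two mediation regressions M ~ X and Y ~ X + M."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a * b

def bc_bootstrap_ci(x, m, y, n_boot=5000, alpha=0.05):
    """Bias-corrected percentile bootstrap CI for the indirect effect."""
    est = indirect_effect(x, m, y)
    n = len(x)
    boot = np.array([indirect_effect(x[i], m[i], y[i])
                     for i in (rng.integers(0, n, n) for _ in range(n_boot))])
    z0 = stats.norm.ppf((boot < est).mean())          # bias correction; assumes the
    za = stats.norm.ppf([alpha / 2, 1 - alpha / 2])   # bootstrap distribution straddles est
    return np.percentile(boot, 100 * stats.norm.cdf(2 * z0 + za))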
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required for grant applications and peer-reviewed publications but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power, obtained by simulation under various sizes of the mediation effect, within-subject correlations, and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution-of-the-product method, and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution-of-the-product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation); a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the most commonly encountered scenarios in practice have also been published for convenient use. The extensive simulation study showed that the distribution-of-the-product and bootstrapping methods outperform Sobel's method, but the product method is recommended in practice because it requires less computation time than bootstrapping. An R package has been developed for the product method of sample size determination in longitudinal mediation study design.
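The simulation logic behind such sample size tables can be sketched for the single-level case (the article's model is multilevel, so the within-subject correlation is deliberately omitted here; paths a and b and all names are illustrative). Scanning n until the empirical power reaches 0.80 reproduces the planning procedure.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def _coef_se(X, y):
    """OLS coefficients and their standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

def sobel_power(n, a=0.3, b=0.3, n_sim=1000, alpha=0.05):
    """Empirical power of the two-sided Sobel test at sample size n."""
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        (_, a_hat), (_, sa) = _coef_se(np.column_stack([np.ones(n), x]), m)
        beta, se = _coef_se(np.column_stack([np.ones(n), x, m]), y)
        b_hat, sb = beta[2], se[2]
        z = a_hat * b_hat / np.sqrt(a_hat**2 * sb**2 + b_hat**2 * sa**2)
        hits += abs(z) > stats.norm.ppf(1 - alpha / 2)
    return hits / n_sim

# e.g. smallest n reaching 80% power:
# next(n for n in range(50, 2000, 25) if sobel_power(n) >= 0.80)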
Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe
2015-07-15
Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Pankin, A. Y.; Rafiq, T.; Kritz, A. H.; Park, G. Y.; Snyder, P. B.; Chang, C. S.
2017-06-01
The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. The neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] is used in carrying out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. This provides a new mechanism of controlling the pedestal bootstrap current and the pedestal stability.
High-beta, steady-state hybrid scenario on DIII-D
Petty, C. C.; Kinsey, J. E.; Holcomb, C. T.; ...
2015-12-17
Here, the potential of the hybrid scenario (first developed as an advanced inductive scenario for high fluence) as a regime for high-beta, steady-state plasmas is demonstrated on the DIII-D tokamak. These experiments show that the beneficial characteristics of hybrids, namely safety factor ≥1 with low central magnetic shear, high stability limits, and excellent confinement, are maintained when strong central current drive (electron cyclotron and neutral beam) is applied to increase the calculated non-inductive fraction to ≈100% (≈50% bootstrap current). The best discharges achieve a normalized beta of 3.4, an IPB98(y,2) confinement factor of 1.4, a surface loop voltage of 0.01 V, and nearly equal electron and ion temperatures at low collisionality. A zero-dimensional physics model shows that steady-state hybrid operation with Qfus ~ 5 is feasible in FDF and ITER. The advantage of the hybrid scenario as an Advanced Tokamak regime is that the external current drive can be deposited near the plasma axis where the efficiency is high; additionally, good alignment between the current drive and plasma current profiles is not necessary, as the poloidal magnetic flux pumping self-organizes the current density profile in hybrids with an m/n=3/2 tearing mode.
Missing pulse detector for a variable frequency source
Ingram, Charles B.; Lawhorn, John H.
1979-01-01
A missing pulse detector is provided which has the capability of monitoring a varying frequency pulse source to detect the loss of a single pulse or total loss of signal from the source. A frequency-to-current converter is used to program the output pulse width of a variable period retriggerable one-shot to maintain a pulse width slightly longer than one-half the present monitored pulse period. The retriggerable one-shot is triggered at twice the input pulse rate by employing a frequency doubler circuit connected between the one-shot input and the variable frequency source being monitored. The one-shot remains in the triggered or unstable state under normal conditions even though the source period is varying. A loss of an input pulse or single period of a fluctuating signal input will cause the one-shot to revert to its stable state, changing the output signal level to indicate a missing pulse or signal.
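A behavioral sketch of this logic (a discrete-event simulation, not a circuit model): the frequency doubler produces an edge at each pulse and one midway between pulses, and the one-shot times out slightly after half the currently measured input period. The 0.55 factor below is an assumption standing in for "slightly longer than one-half."

import numpy as np

def missing_pulse_alarms(pulse_times):
    """Return alarm times for a (possibly drifting) pulse train.
    pulse_times: sorted 1D array of received pulse times."""
    pulse_times = np.asarray(pulse_times, float)
    # frequency doubler: edges at pulses and at the midpoints between them
    edges = np.sort(np.concatenate(
        [pulse_times, pulse_times[:-1] + np.diff(pulse_times) / 2]))
    alarms = []
    for i in range(1, len(edges) - 1):
        period = 2 * (edges[i] - edges[i - 1])   # reconstructed input period
        window = 0.55 * period                   # programmed one-shot pulse width
        if edges[i + 1] - edges[i] > window:     # one-shot fell back to its stable state
            alarms.append(edges[i] + window)
    return alarms

# e.g. pulses at 0, 1, 2, 4 (pulse at t=3 missing) raise an alarm near t=2.55.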
Implementation of a tree algorithm in MCNP code for nuclear well logging applications.
Li, Fusheng; Han, Xiaogang
2012-07-01
The goal of this paper is to develop some modeling capabilities that are missing from the current MCNP code. These missing capabilities can greatly help with certain nuclear tool designs, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities developed in this paper include the following: zone tally, neutron interaction tally, gamma-ray index tally, and enhanced pulse-height tally. The patched MCNP code can also be used to compute neutron slowing-down length and thermal neutron diffusion length. Copyright © 2011 Elsevier Ltd. All rights reserved.
Lenferink, Lonneke I M; van Denderen, Mariëtte Y; de Keijser, Jos; Wessel, Ineke; Boelen, Paul A
2017-02-01
Traumatic loss (e.g., homicide) is associated with elevated prolonged grief disorder (PGD) and posttraumatic stress disorder (PTSD). Several studies comparing relatives of missing persons with homicidally bereaved individuals have shown inconsistent results regarding differences in PGD and PTSD levels between the groups. These studies were conducted in the context of armed conflict, which may confound the results. The current study aims to compare PGD and PTSD levels between the groups outside the context of armed conflict. Relatives of long-term missing persons (n=134) and homicidally bereaved individuals (n=331) completed self-report measures of PGD and PTSD. Multilevel regression modelling was used to compare symptom scores between the groups. Homicidally bereaved individuals reported significantly higher levels of PGD (d=0.86) and PTSD (d=0.28) than relatives of missing persons when relevant covariates (i.e., gender, time since loss, and kinship to the disappeared/deceased person) were taken into account. A limitation of this study is the use of self-report measures instead of clinical interviews. Prior studies of relatives of missing persons and homicidally bereaved individuals in the context of armed conflict may not generalize to similar samples outside these contexts. Future research is needed to further explore differences in bereavement-related psychopathology between different groups, as well as the correlates and treatment of this psychopathology. Copyright © 2016 Elsevier B.V. All rights reserved.
Design and Performance of a Spectrometer for Deployment on MISSE 7
NASA Technical Reports Server (NTRS)
Pippin, Gary; Beymer, Jim; Robb, Andrew; Longino, James; Perry, George; Stewart, Alan; Finkenor, Miria
2009-01-01
A spectrometer for reflectance and transmission measurements of samples exposed to the space environment has been developed for deployment on the Materials on the International Space Station Experiment (MISSE) 7. The instrument incorporates a miniature commercial fiber-optic-coupled spectrometer with a computer control system for detector operation, sample motion, and illumination. A set of three spectrometers was recently integrated on the MISSE 7 platform, with launch and deployment on the International Space Station scheduled for summer of this year. The instrument is one of many active experiments on the platform. The performance of the instrument prior to launch will be discussed. Data from samples measured in the laboratory will be compared to those from the instrument prior to launch. These comparisons will illustrate the capabilities of the current design. The space environment challenges many materials. When in operation on the MISSE 7 platform, the new spectrometer will provide real-time data on how the space environment affects the optical properties of thermal control paints and optical coatings. Data obtained from comparisons of pre- and post-flight measurements on hundreds of samples exposed on previous MISSE platforms have been reported at these meetings. With the new spectrometer and the ability to correlate measured changes with time on orbit and with the occurrence of both natural events and human activities, a better understanding of the processes responsible for degradation of materials in space will be possible.
Why do workaholics experience depression? A study with Chinese University teachers.
Nie, Yingzhi; Sun, Haitao
2016-10-01
This study focuses on the relationships of workaholism to job burnout and depression among university teachers. The direct and indirect (via job burnout) effects of workaholism on depression were investigated in 412 Chinese university teachers. Structural equation modeling and the bootstrap method were used. Results revealed that workaholism, job burnout, and depression significantly correlated with each other. Structural equation modeling and the bootstrap test indicated a partial mediating role of job burnout in the relationship between workaholism and depression. The findings shed some light on how workaholism influences depression and provide valuable evidence for the prevention of depression at work. © The Author(s) 2015.
Blank, Jos L T; van Hulst, Bart Laurents
2011-10-01
This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on relating cost inefficiency measures to hospital corporate governance. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that higher remuneration of the board, as well as higher remuneration of the supervisory board, does not imply better performance.
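The first stage of such an analysis, the DEA efficiency score itself, is one linear program per hospital. A sketch of the input-oriented, constant-returns (CCR) variant is below; the Simar-Wilson procedure then bootstraps these scores and regresses them on governance variables, which is beyond this sketch.

import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency for every decision-making unit.
    X: (n, k) inputs, Y: (n, r) outputs.
    Decision variables: [theta, lambda_1, ..., lambda_n]."""
    n, k = X.shape
    r = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        c = np.zeros(1 + n)
        c[0] = 1.0                                    # minimize theta
        # inputs:  sum_j lam_j X[j,i] - theta * X[o,i] <= 0
        A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
        # outputs: -sum_j lam_j Y[j,s] <= -Y[o,s]
        A_out = np.hstack([np.zeros((r, 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(k), -Y[o]]),
                      bounds=[(None, None)] + [(0, None)] * n)
        scores[o] = res.fun                           # 1.0 = efficient
    return scores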
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
NASA Astrophysics Data System (ADS)
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-month), and mid-term (six-month) drought observations. The effects of the North Atlantic and Arctic Oscillation indices on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from the Konya closed basin, located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
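A minimal sketch of a residual-based bootstrap prediction interval, assuming for illustration an AR(1) model for the PDSI series (the study's exact model and the oscillation-index covariates are not reproduced): centered residuals are resampled and propagated through the fitted recursion, and the interval is read off the simulated paths. Refitting the model on each resampled series would additionally propagate parameter uncertainty.

import numpy as np

rng = np.random.default_rng(6)

def ar1_bootstrap_pi(x, h=3, n_boot=2000, alpha=0.05):
    """h-step-ahead bootstrap prediction intervals for an AR(1) fit."""
    x = np.asarray(x, float)
    phi, c = np.polyfit(x[:-1], x[1:], 1)    # x_t = c + phi*x_{t-1} + e_t
    resid = x[1:] - (c + phi * x[:-1])
    resid -= resid.mean()                    # center the residuals
    paths = np.empty((n_boot, h))
    for b in range(n_boot):
        last = x[-1]
        for t in range(h):
            last = c + phi * last + rng.choice(resid)
            paths[b, t] = last
    lo, hi = np.percentile(paths, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return lo, hi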
Spotorno O, Angel E; Córdova, Luis; Solari I, Aldo
2008-12-01
To identify and characterize Chilean samples of Trypanosoma cruzi and their association with hosts, the first 516 bp of the mitochondrial cytochrome b gene were sequenced from eight biological samples and phylogenetically compared with 20 other known American sequences. The molecular characterization of these 28 sequences in a maximum likelihood phylogram (-lnL = 1255.12, tree length = 180, consistency index = 0.79) allowed the robust identification (bootstrap % > 99) of three previously known discrete typing units (DTU): DTU IIb, IIa, and I. A previously undescribed sequence found in four new Chilean samples was detected and designated DTU Ib; these were separated by 24.7 differences from DTU I, but robustly related to it (bootstrap % = 97 in 500 replicates) by sharing 12 substitutions, among which four were nonsynonymous. The new DTU Ib was also robust (bootstrap % = 100), and characterized by 10 unambiguous substitutions, with a single nonsynonymous G to T change at site 409. The fact that two of these new sequences were found in parasites from a Chilean endemic caviomorph rodent, Octodon degus, and that they were closely related to the ancient DTU I, suggests old origins and a long association with caviomorph hosts.
Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico
2018-02-01
To show how to generate a consensus sequence from the massive parallel sequencing data obtained in routine HIV antiretroviral resistance studies, suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GS Junior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, applying 10%, 15%, and 20% thresholds. Molecular Evolutionary Genetics Analysis (MEGA) was used for phylogenetic studies. At the 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap value of 88% (IQR 83.5-95.5). Association increased to 36/62 sequences, with a median bootstrap value of 94% (IQR 85.5-98), using the 15% threshold. Maximum association was obtained at the 20% threshold, with 61/62 sequences associated and a median bootstrap value of 99% (IQR 98-100). A safe method is presented to generate consensus sequences from HIV NGS data at the 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
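The thresholded-consensus idea can be illustrated compactly: a base is retained if its within-sample frequency reaches the chosen threshold, and mixtures are encoded as IUPAC ambiguity codes. The sketch below illustrates the general approach, not Mesquite's implementation; reads is a hypothetical list of aligned, equal-length read strings.

from collections import Counter

def consensus(reads, threshold=0.20):
    # Column-wise consensus: keep bases at or above the threshold,
    # encode two-base mixtures with IUPAC codes, otherwise emit N.
    iupac = {frozenset('AG'): 'R', frozenset('CT'): 'Y', frozenset('GC'): 'S',
             frozenset('AT'): 'W', frozenset('GT'): 'K', frozenset('AC'): 'M'}
    out = []
    for col in zip(*reads):
        freq = Counter(col)
        total = sum(freq.values())
        kept = frozenset(b for b, n in freq.items() if n / total >= threshold)
        if len(kept) == 1:
            out.append(next(iter(kept)))
        else:
            out.append(iupac.get(kept, 'N'))
    return ''.join(out)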
Highton, R
1993-12-01
An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets: Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. Statistical support for the topology at each node of the UPGMA trees was determined by both the bootstrap and jackknife methods as the number of loci was increased. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes increase as the number of loci utilized in a study is increased, as expected for nodes whose groupings reflect phylogenetic relationships. The pattern of increase varies and is especially rapid for groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
Comparison of mode estimation methods and application in molecular clock analysis
NASA Technical Reports Server (NTRS)
Hedges, S. Blair; Shah, Prachi
2003-01-01
BACKGROUND: Distributions of time estimates in molecular clock studies are sometimes skewed or contain outliers. In those cases, the mode is a better estimator of the overall time of divergence than the mean or median. However, different methods are available for estimating the mode. We compared these methods in simulations to determine their strengths and weaknesses and further assessed their performance when applied to real data sets from a molecular clock study. RESULTS: We found that the half-range mode and robust parametric mode methods have a lower bias than other mode methods under a diversity of conditions. However, the half-range mode suffers from a relatively high variance and the robust parametric mode is more susceptible to bias by outliers. We determined that bootstrapping reduces the variance of both mode estimators. Application of the different methods to real data sets yielded results that were concordant with the simulations. CONCLUSION: Because the half-range mode is a simple and fast method, and produced less bias overall in our simulations, we recommend the bootstrapped version of it as a general-purpose mode estimator and suggest a bootstrap method for obtaining the standard error and 95% confidence interval of the mode.
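A sketch of the recommended estimator, assuming the usual formulation of the half-range mode (repeatedly keep the densest half of the current range) together with a simple percentile bootstrap for the standard error and 95% confidence interval:

import numpy as np

def half_range_mode(x):
    # Repeatedly keep the half-range window containing the most points;
    # the mean of the last one or two points estimates the mode.
    x = np.sort(np.asarray(x, dtype=float))
    while x.size > 2 and x[-1] > x[0]:
        w = (x[-1] - x[0]) / 2.0
        # number of points in [x[i], x[i] + w] for each start i
        counts = np.searchsorted(x, x + w, side='right') - np.arange(x.size)
        i = int(np.argmax(counts))
        x = x[i:i + counts[i]]
    return x.mean()

def bootstrapped_mode(x, n_boot=2000, rng=None):
    # Bootstrapped half-range mode: point estimate, standard error, 95% CI.
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    boots = np.array([half_range_mode(rng.choice(x, size=x.size, replace=True))
                      for _ in range(n_boot)])
    return boots.mean(), boots.std(ddof=1), np.percentile(boots, [2.5, 97.5])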
Bootstrap percolation on spatial networks
NASA Astrophysics Data System (ADS)
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of networked activation processes and has found applications in explaining important social phenomena such as the propagation of information. Inspired by recent findings on the spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks in which the probability density function of long-range link lengths follows a power law with tunable exponent. Taking the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, a mixture of a second-order phase transition and a hybrid phase transition with two varying critical points; otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points of the double phase transition are almost constant. Surprisingly, this critical value -1 is equal or very close to the exponents measured for many real online social networks, including LiveJournal, the HP Labs email network, and the Belgian mobile phone network. This work helps us better understand the self-organized spatial structure of online social networks in terms of its effectiveness for information spreading.
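The activation process itself is easy to state: seed a set of active nodes, then repeatedly activate any node with at least k active neighbours until nothing changes. A minimal sketch on an arbitrary undirected graph follows; networkx and the threshold k are illustrative choices, and the paper's spatial-network construction with power-law link lengths is not reproduced.

import networkx as nx

def bootstrap_percolation(G, seeds, k=2):
    # Repeatedly activate nodes with >= k active neighbours until
    # no new activations occur; return the final active set.
    active = set(seeds)
    while True:
        newly = {v for v in G if v not in active
                 and sum(u in active for u in G[v]) >= k}
        if not newly:
            return active
        active |= newly

For example, bootstrap_percolation(nx.erdos_renyi_graph(10000, 0.0005), seeds=range(100)) returns the final active set, whose relative size serves as the order parameter.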
NASA Astrophysics Data System (ADS)
Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore
2018-01-01
In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both good redundancy and low contamination to be effective. In this paper a resampling strategy based on the bootstrap is proposed as an alternative to RAIM, in order to estimate position accurately in the case of low redundancy and multiple blunders: starting from the pseudorange measurement model, at each epoch the available measurements are bootstrapped, that is, randomly sampled with replacement, and the generated a posteriori empirical distribution is exploited to derive the final position. In contrast to the standard bootstrap, the sampling probabilities are not uniform but vary according to an indicator of measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
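The quality-weighted resampling step can be sketched as follows for a linearized pseudorange model y ≈ Hx; the geometry matrix H, the residual vector y and the quality weights w are assumed inputs, and the paper's particular quality indicator is not reproduced.

import numpy as np

def weighted_bootstrap_position(H, y, w, n_boot=500, rng=None):
    # Resample measurements with replacement, with probabilities
    # proportional to a quality weight, solve least squares on each
    # replicate, and take the empirical mean as the final estimate.
    rng = np.random.default_rng(rng)
    m = len(y)
    p = np.asarray(w, dtype=float) / np.sum(w)   # non-uniform sampling
    sols = np.empty((n_boot, H.shape[1]))
    for b in range(n_boot):
        idx = rng.choice(m, size=m, replace=True, p=p)
        sols[b], *_ = np.linalg.lstsq(H[idx], y[idx], rcond=None)
    return sols.mean(axis=0)                     # a posteriori empirical mean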
2009-01-01
Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0-20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency and allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected and accelerated (BCa) methods, together with simulation of the likelihood ratio test, provide confidence intervals for the excess relative risk (ERR) and tests against the linear model. Results The linear model shows significant, large, positive values of ERR for liver and urinary cancers at latencies of 37-43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86), and across broad latency ranges. Confidence intervals for ERR are comparable using bootstrap and likelihood ratio test methods, and BCa 95% confidence intervals are strictly positive across latency ranges for all five cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0-20 mSv and 5-500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter three cancers is significantly non-linear in the 5-500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and likelihood-based confidence intervals are broadly comparable, and ERR is strictly positive by bootstrap methods for all five cancers. Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0-20 mSv and 5-500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
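For reference, a generic percentile/BCa construction (not the authors' code) is sketched below; SciPy also provides scipy.stats.bootstrap with method='BCa' directly.

import numpy as np
from scipy import stats

def bca_interval(x, stat, n_boot=2000, alpha=0.05, rng=None):
    # Bias-corrected and accelerated (BCa) bootstrap interval for stat(x).
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    theta = stat(x)
    boots = np.array([stat(rng.choice(x, size=x.size, replace=True))
                      for _ in range(n_boot)])
    z0 = stats.norm.ppf(np.mean(boots < theta))        # bias correction
    jack = np.array([stat(np.delete(x, i)) for i in range(x.size)])
    d = jack.mean() - jack
    a = (d**3).sum() / (6.0 * ((d**2).sum())**1.5)     # acceleration (jackknife skewness)
    z = stats.norm.ppf([alpha / 2, 1 - alpha / 2])
    adj = stats.norm.cdf(z0 + (z0 + z) / (1 - a * (z0 + z)))
    return np.quantile(boots, adj)                     # adjusted percentiles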
Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh
2014-10-01
Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. Copyright © 2014 Elsevier Ltd. All rights reserved.
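The bootstrap mediation test reported here follows the standard indirect-effect recipe: resample cases with replacement, re-estimate the a path (abuse to dysregulation) and b path (dysregulation to depression, controlling for abuse), and check whether the percentile interval of a·b excludes zero. A sketch with statsmodels; the variable names are hypothetical and the study's covariates are omitted.

import numpy as np
import statsmodels.api as sm

def indirect_effect(x, m, y):
    # a path: predictor -> mediator
    a = sm.OLS(m, sm.add_constant(x)).fit().params[-1]
    # b path: mediator -> outcome, controlling for the predictor
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[-1]
    return a * b

def bootstrap_mediation(x, m, y, n_boot=5000, rng=None):
    rng = np.random.default_rng(rng)
    n = len(x)
    ab = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)         # resample cases with replacement
        ab[i] = indirect_effect(x[idx], m[idx], y[idx])
    return np.percentile(ab, [2.5, 97.5])   # CI excluding 0 -> mediation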
Suppressing magnetic island growth by resonant magnetic perturbation
NASA Astrophysics Data System (ADS)
Yu, Q.; Günter, S.; Lackner, K.
2018-05-01
The effect of externally applied resonant magnetic perturbations (RMPs) on the growth of magnetic islands is investigated based on two-fluid equations. It is found that if the local bi-normal electron fluid velocity at the resonant surface is sufficiently large, static RMPs of the same helicity and of moderate amplitude can suppress the growth of magnetic islands in high-temperature plasmas. These islands would otherwise grow, driven by an unfavorable plasma current density profile and the bootstrap current perturbation. These results indicate that the error field can stabilize island growth if the error field amplitude is not too large and the local bi-normal electron fluid velocity is not too low. They also indicate that applied rotating RMPs with an appropriate frequency can be utilized to suppress island growth in high-temperature plasmas, even for a low bi-normal electron fluid velocity. A significant change in the local equilibrium plasma current density gradient by small-amplitude RMPs is found for realistic plasma parameters; this change is important for island stability and is expected to be even more important for fusion reactors with low plasma resistivity.
Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)
NASA Astrophysics Data System (ADS)
Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.
2017-10-01
Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking into account, alongside real-time motional Stark effect (MSE) and magnetics data, real-time Thomson scattering (TS) and real-time charge exchange recombination (CER, still in development) data. Electron density and temperature are determined by TS, while ion density and pressure are determined using CER. Together with the temperature and density of neutrals, these form the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostics. The pedestal current density is estimated using the Sauter formula for the bootstrap current density. By comparing the behaviour of the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') from CAKE against magnetics-only reconstructions, it can be seen that using these diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.
Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas
NASA Astrophysics Data System (ADS)
Callen, J. D.; Hegna, C. C.; Beidler, M. T.
2017-10-01
Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.
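For reference, a schematic modified Rutherford equation of the kind such models comprehensively extend can be written as follows; this is an illustrative textbook-style template, not the specific equation derived in this work, and the coefficients and retained terms vary across references:

\frac{\tau_R}{r_s^2}\,\frac{dw}{dt} \simeq r_s\,\Delta'(w) + r_s\,\sqrt{\varepsilon}\,\frac{L_q}{L_p}\,\beta_\theta \left( \frac{w}{w^2 + w_d^2} - \frac{w_{\mathrm{pol}}^2}{w^3} \right) + r_s\,\Delta'_{\mathrm{ext}},

where w is the island width, τ_R the resistive time, r_s the rational-surface radius, Δ'(w) the classical tearing drive, the β_θ term contains the bootstrap drive (with small-island cutoff w_d) and the neoclassical polarization-current contribution (w_pol), and Δ'_ext represents externally applied field and current drives.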
Improved Design of Stellarator Coils for Current Carrying Plasmas
NASA Astrophysics Data System (ADS)
Drevlak, M.; Strumberger, E.; Hirshman, S.; Boozer, A.; Brooks, A.; Valanju, P.
1998-11-01
The method of automatic optimization (P. Merkel, Nucl. Fusion 27 (1987) 867; P. Merkel, M. Drevlak, Proc. 25th EPS Conf. on Contr. Fusion and Plasma Phys., Prague, in print) for the design of stellarator coils consists essentially of determining filaments such that the average relative field error $\int dS\,[(\mathbf{B}_{\mathrm{coil}} + \mathbf{B}_J)\cdot\mathbf{n}]^2 / B_{\mathrm{coil}}^2$ is minimized on the prescribed plasma boundary. Here $\mathbf{B}_J$ is the magnetic field produced by the plasma currents of the given finite-β fixed-boundary equilibrium. For equilibria of the W7-X type, $\mathbf{B}_J$ can be neglected because of the reduced parallel plasma currents. This is not true for quasi-axisymmetric stellarator (QAS) configurations (A. Reiman, et al., to be published) with large equilibrium and net plasma (bootstrap) currents. Although the coils for QAS exhibit low values of the field error, free-boundary calculations indicate that the shape of the plasma is usually not accurately reproduced, particularly when saddle coils are used. We investigate whether the surface reconstruction can be improved by introducing a modified measure of the field error based on the resonant components of the normal field.
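On a discretized boundary the figure of merit reduces to a short computation, sketched below; the array names are assumptions, and the modified, resonance-weighted measure proposed above would replace the plain quadratic form.

import numpy as np

def average_relative_field_error(B_coil, B_J, n_hat, dS):
    # Discretized surface-averaged relative normal-field error:
    # sum over boundary points of dS * [(B_coil + B_J) . n]^2 / |B_coil|^2,
    # normalized by the total area. All field arrays are (N, 3); dS is (N,).
    Bn = np.einsum('ij,ij->i', B_coil + B_J, n_hat)   # normal component
    B2 = np.einsum('ij,ij->i', B_coil, B_coil)        # |B_coil|^2
    return np.sum(dS * Bn**2 / B2) / np.sum(dS)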
Current/Pressure Profile Effects on Tearing Mode Stability in DIII-D Hybrid Discharges
NASA Astrophysics Data System (ADS)
Kim, K.; Park, J. M.; Murakami, M.; La Haye, R. J.; Na, Yong-Su
2015-11-01
It is important to understand the onset threshold and the evolution of tearing modes (TMs) for developing a high-performance steady-state fusion reactor. As an initial, basic comparison to determine TM onset, measured plasma profiles (such as temperature, density, and rotation) were compared with calculated current profiles for a pair of discharges with and without an n=1 mode, drawn from the database of DIII-D hybrid plasmas. The profiles were not very different, but their details were analyzed to determine their characteristics, especially near the rational surface. The tearing stability index Δ', calculated with PEST3, tends to increase rapidly just before the n=1 mode onset for these cases. Model equilibria with parametrically varied pressure or current profiles, based on the reference discharge, are reconstructed to check the dependence of the onset on Δ' or on neoclassical effects such as the bootstrap current. Simulations of TMs with the modeled equilibria using resistive MHD codes will also be presented and compared with experiments to assess the sensitivity of predicting TM onset. Work supported by US DOE under DE-FC02-04ER54698 and DE-AC52-07NA27344.
Imputing missing data via sparse reconstruction techniques.
DOT National Transportation Integrated Search
2017-06-01
The State of Texas does not currently have an automated approach for estimating volumes for links without counts. This research project proposes the development of an automated system to efficiently estimate the traffic volumes on uncounted links, in...
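A generic sparse-reconstruction imputation step in the compressed-sensing spirit of the title might look as follows; this is a sketch under assumed names (a dictionary D in which link volumes are assumed sparse, and a boolean mask of counted links), not the system developed in the project.

import numpy as np
from sklearn.linear_model import Lasso

def impute_link_volumes(D, y_obs, observed, alpha=0.1):
    # D: (n_links, n_atoms) dictionary; y_obs: volumes on counted links;
    # observed: boolean mask selecting the counted rows of D.
    model = Lasso(alpha=alpha, fit_intercept=False)
    model.fit(D[observed], y_obs)          # y ~ D @ coef with coef sparse
    return D @ model.coef_                 # estimated volumes on all links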
Active Traffic Capture for Network Forensics
NASA Astrophysics Data System (ADS)
Slaviero, Marco; Granova, Anna; Olivier, Martin
Network traffic capture is an integral part of network forensics, but current traffic capture techniques are typically passive in nature. Under heavy loads, it is possible for a sniffer to miss packets, which affects the quality of forensic evidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Peng, Yueng Kay Martin
Transport theory for potato orbits in the region near the magnetic axis in an axisymmetric torus, such as tokamaks and spherical tori, is extended to the situation where the toroidal flow speed is of the order of the sonic speed, as observed in the National Spherical Torus Experiment [E. J. Synakowski, M. G. Bell, R. E. Bell et al., Nucl. Fusion 43, 1653 (2003)]. It is found that transport fluxes, such as the ion radial heat flux and the bootstrap current density, are modified by a factor of the order of the square of the toroidal Mach number. The consequences of the orbit squeezing are also presented. The theory is developed for parabolic (in radius r) plasma profiles. A method to apply the results of the theory for transport modeling is discussed.