Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.
Wade, M R; Murakami, M; Politzer, P A
2004-06-11
Analysis of the parallel electric field E(parallel) evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E(parallel) evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E(parallel) evolution and that expected from neoclassical theory predictions of the bootstrap current.
Non-inductive current generation in fusion plasmas with turbulence
NASA Astrophysics Data System (ADS)
Wang, Weixing; Ethier, S.; Startsev, E.; Chen, J.; Hahm, T. S.; Yoo, M. G.
2017-10-01
It is found that plasma turbulence may strongly influence non-inductive current generation, which may have a profound impact on various aspects of tokamak physics. Our simulation study employs a global gyrokinetic model coupling self-consistent neoclassical and turbulent dynamics, with a focus on the electron current. Distinct phases in electron current generation are illustrated in the initial-value simulation. In the early phase, before turbulence develops, the electron bootstrap current is established on a time scale of a few electron collision times, in close agreement with the neoclassical prediction. The second phase follows when turbulence begins to saturate, during which turbulent fluctuations are found to strongly affect the electron current. The profile structure, amplitude, and phase-space structure of the electron current density are all significantly modified relative to the neoclassical bootstrap current by the presence of turbulence. Both electron parallel acceleration and the parallel residual stress drive are shown to play important roles in turbulence-induced current generation. The current density profile is modified in a way that correlates with the fluctuation intensity gradient through its effect on k//-symmetry breaking in the fluctuation spectrum. Turbulence is shown to reduce (enhance) the plasma self-generated current in the low (high) collisionality regime, and the reduction of the total electron current relative to the neoclassical bootstrap current increases as collisionality decreases. The implication of this result for fully non-inductive current operation in the steady-state burning plasma regime should be investigated. Finally, significant non-inductive current is observed in the flat-pressure region, a nonlocal effect resulting from current diffusion induced by turbulence spreading. Work supported by U.S. DOE Contract DE-AC02-09-CH11466.
Innovation cascades: artefacts, organization and attributions
2016-01-01
Innovation cascades inextricably link the introduction of new artefacts, transformations in social organization, and the emergence of new functionalities and new needs. This paper describes a positive feedback dynamic, exaptive bootstrapping, through which these cascades proceed, and the characteristics of the relationships in which the new attributions that drive this dynamic are generated. It concludes by arguing that the exaptive bootstrapping dynamic is the principal driver of our current Innovation Society. PMID:26926284
Effects of magnetic islands on bootstrap current in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, G.; Lin, Z.
2016-12-19
The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change in the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, a small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
Reduced ion bootstrap current drive on NTM instability
NASA Astrophysics Data System (ADS)
Qu, Hongpeng; Wang, Feng; Wang, Aike; Peng, Xiaodong; Li, Jiquan
2018-05-01
The loss of bootstrap current inside a magnetic island plays a dominant role in driving the neoclassical tearing mode (NTM) instability in tokamak plasmas. In this work, we investigate the finite-banana-width (FBW) effect on the profile of the ion bootstrap current in the island vicinity via an analytical approach. The results show that even if the pressure gradient vanishes inside the island, the ion bootstrap current can partly survive due to the FBW effect. The FBW effect is more efficient when the island width is smaller. Nevertheless, even when the island width is comparable to the ion banana width, the unperturbed ion bootstrap current inside the island cannot be largely recovered by the FBW effect, and thus the current loss persists. This suggests that the FBW effect alone cannot dramatically reduce the ion bootstrap current drive of NTMs.
NASA Astrophysics Data System (ADS)
Monticello, D. A.; Reiman, A. H.; Watanabe, K. Y.; Nakajima, N.; Okamoto, M.
1997-11-01
The existence of bootstrap currents in both tokamaks and stellarators was confirmed experimentally more than ten years ago. Such currents can have significant effects on the equilibrium and stability of these MHD devices. In addition, stellarators, with the notable exception of W7-X, are predicted to have such large bootstrap currents that reliable equilibrium calculations require the self-consistent evaluation of bootstrap currents. Modeling of discharges which contain islands requires an algorithm that does not assume good surfaces. Only one of the two 3-D equilibrium codes that exist, PIES (Reiman, A. H., Greenside, H. S., Comput. Phys. Commun. 43 (1986)), can easily be modified to handle bootstrap current. Here we report on the coupling of the PIES 3-D equilibrium code and the NIFS bootstrap code (Watanabe, K., et al., Nuclear Fusion 35 (1995) 335).
Test of bootstrap current models using high- β p EAST-demonstration plasmas on DIII-D
Ren, Qilong; Lao, Lang L.; Garofalo, Andrea M.; ...
2015-01-12
Magnetic measurements together with kinetic profile and motional Stark effect measurements are used in full kinetic equilibrium reconstructions to test the Sauter and NEO bootstrap current models in a DIII-D high-βp EAST-demonstration experiment. This aims at developing on DIII-D a high bootstrap current scenario to be extended on EAST for a demonstration of true steady state at high performance, and uses EAST-similar operational conditions: plasma shape, plasma current, toroidal magnetic field, total heating power, and current ramp-up rate. It is found that the large edge bootstrap current in these high-βp plasmas allows the use of magnetic measurements to clearly distinguish the two bootstrap current models. In these high-collisionality, high-βp plasmas, the Sauter model overpredicts the peak of the edge current density by about 30%, while the first-principles kinetic NEO model is in close agreement with the edge current density of the reconstructed equilibrium. Furthermore, these results are consistent with recent work showing that the Sauter model largely overestimates the edge bootstrap current at high collisionality.
Bootstrap and fast wave current drive for tokamak reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehst, D.A.
1991-09-01
Using the multi-species neoclassical treatment of Hirshman and Sigmar, we study steady-state bootstrap equilibria with seed currents provided by low-frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (< 20 MW). However, for larger total currents considerable driving power is required (for ITER: I_o = 18 MA needs P_FW = 15 MW, P_LH = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs.
Bootstrap current in a tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessel, C.E.
1994-03-01
The bootstrap current in a tokamak is examined by implementing the Hirshman-Sigmar model and comparing the predicted current profiles with those from two popular approximations. The dependences of the bootstrap current profile on the plasma properties are illustrated. The implications for steady-state tokamaks are presented through two constraints: the pressure profile must be peaked, and βp must be kept below a critical value.
ERIC Educational Resources Information Center
Fan, Xitao
This paper empirically and systematically assessed the performance of the bootstrap resampling procedure as applied to a regression model. Parameter estimates from Monte Carlo experiments (repeated sampling from the population) and bootstrap experiments (repeated resampling from one original sample) were generated and compared. Sample…
Control of bootstrap current in the pedestal region of tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C. (Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796); Lai, A. L.
2013-12-15
The high-confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_p,m that consists of the poloidal components of the E×B flow and of the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit where ‖U_p,m‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_p,m and the gradient of the radial electric field. This, in turn, can control plasma stability, such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite as ‖U_p,m‖ approaches infinity and to indicate how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join the results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most often used statistical methods in climate sciences. Various methods are used to estimate a confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), whose main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique which essentially performs a second bootstrap loop, resampling from the bootstrap resamples. Like the non-calibrated bootstrap confidence intervals, it offers robustness against the data distribution. A pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with that of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20.
One form of climate time series is the output of numerical models which simulate the climate system. The method is applied to data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show a significant correlation between the two variables at a 10-year lag, which is roughly the time it takes Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
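The pairwise moving block bootstrap at the core of this approach can be sketched in a few lines. The following is a minimal illustration only, not the PearsonT3 implementation itself (which adds Student's t intervals and calibration); the function name, block length, and percentile interval are illustrative choices:

```python
import numpy as np

def moving_block_bootstrap_corr(x, y, block_len=5, n_boot=2000, seed=0):
    """Percentile CI for Pearson's r via a pairwise moving block bootstrap (sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    reps = np.empty(n_boot)
    for b in range(n_boot):
        # Draw block start indices; concatenating whole blocks of (x, y) PAIRS
        # preserves both the serial correlation and the cross-correlation.
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        reps[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.percentile(reps, [2.5, 97.5])

# Synthetic serially correlated pair sharing a common signal
rng = np.random.default_rng(1)
z = np.cumsum(rng.normal(size=200)) * 0.1
x = z + rng.normal(size=200)
y = z + rng.normal(size=200)
lo, hi = moving_block_bootstrap_corr(x, y)
print(lo, hi)
```

Resampling blocks of pairs (rather than single pairs) is what keeps the serial dependence of both series intact within each replication.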
Electron transport fluxes in potato plateau regime
NASA Astrophysics Data System (ADS)
Shaing, K. C.; Hazeltine, R. D.
1997-12-01
Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current.
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation on real and simulated data sets, bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to the increased time spent generating weight matrices via multinomial sampling. PMID:26125965
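The multinomial-weighting formulation described above can be sketched as follows for Pearson's correlation. This is a minimal illustration under the same idea (the paper's own implementation is in R and differs in detail); the function name is hypothetical:

```python
import numpy as np

def vectorized_bootstrap_corr(x, y, n_boot=1000, seed=0):
    """Bootstrap replications of Pearson's r via multinomial weights and weighted moments."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Each row of W holds one replication's resampling counts, normalized to weights.
    W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
    mx, my = W @ x, W @ y                  # weighted means, one per replication
    sxx = W @ (x * x) - mx**2              # weighted (plug-in) variances
    syy = W @ (y * y) - my**2
    sxy = W @ (x * y) - mx * my            # weighted covariance
    return sxy / np.sqrt(sxx * syy)        # all n_boot replications at once

rng = np.random.default_rng(2)
x = rng.normal(size=100)
y = 0.6 * x + rng.normal(size=100)
reps = vectorized_bootstrap_corr(x, y)
print(reps.shape)  # (1000,)
```

No data are ever physically resampled: weighting the observed values by counts/n is equivalent to evaluating the moment statistic on the resampled data, which is what makes the whole vector of replications a handful of matrix products.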
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data they require. This study instead generates expected values for local hotspots from past occurrences rather than from population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, which does not require population-at-risk data, (1) is free from administrative-area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
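The significance-testing step, resampling past occurrence counts to judge whether a current count is unusually high, can be sketched as follows. This is a toy one-cell illustration under simplifying assumptions, not the paper's full space-time model; the function name and threshold are hypothetical:

```python
import numpy as np

def hotspot_pvalue(current_count, past_counts, n_boot=5000, seed=0):
    """One-sided bootstrap test: is the current count unusually high vs. history?"""
    rng = np.random.default_rng(seed)
    # Resample the historical counts with replacement to build a null distribution.
    null = rng.choice(past_counts, size=n_boot, replace=True)
    # Add-one correction keeps the p-value strictly positive.
    return (1 + np.sum(null >= current_count)) / (n_boot + 1)

past = np.array([3, 5, 4, 2, 6, 4, 3, 5, 4, 3])  # counts in earlier time windows
print(hotspot_pvalue(12, past) < 0.05)  # a count of 12 is flagged as an emerging hotspot
```

Because the expected values come from the cell's own history rather than from population-at-risk data, the same test applies to any spatial unit for which past counts are recorded.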
Bootstrapping Confidence Intervals for Robust Measures of Association.
ERIC Educational Resources Information Center
King, Jason E.
A Monte Carlo simulation study was conducted to determine the bootstrap correction formula yielding the most accurate confidence intervals for robust measures of association. Confidence intervals were generated via the percentile, adjusted, BC, and BC(a) bootstrap procedures and applied to the Winsorized, percentage bend, and Pearson correlation…
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
NASA Astrophysics Data System (ADS)
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady-state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long-pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTf). A large bootstrap fraction (fBS ~ 80%) has been obtained at high βp. ITBs have been shown to be compatible with steady-state operation. Because of the unusually large ITB radius, the normalized pressure is not limited to low βN values by internal ITB-driven modes; βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. The latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has posed new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows that the first-principles NEO model is in good agreement with experiments for the bootstrap current calculation, and that ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport.
Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC02-09CH11466, and the NMCFP of China under 2015GB110000 and 2015GB102000.
Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.
ERIC Educational Resources Information Center
Thompson, Bruce; Fan, Xitao
This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…
Unconventional Expressions: Productive Syntax in the L2 Acquisition of Formulaic Language
ERIC Educational Resources Information Center
Bardovi-Harlig, Kathleen; Stringer, David
2017-01-01
This article presents a generative analysis of the acquisition of formulaic language as an alternative to current usage-based proposals. One influential view of the role of formulaic expressions in second language (L2) development is that they are a bootstrapping mechanism into the L2 grammar; an initial repertoire of constructions allows for…
NASA Astrophysics Data System (ADS)
Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Sanchez, R.; Tribaldos, V.; Geiger, J.
2018-02-01
The recently developed free-plasma-boundary version of the SIESTA MHD equilibrium code (Hirshman et al 2011 Phys. Plasmas 18 062504; Peraza-Rodriguez et al 2017 Phys. Plasmas 24 082516) is used for the first time to study scenarios with considerable bootstrap currents for the Wendelstein 7-X (W7-X) stellarator. Bootstrap currents in the range of tens of kA can lead to the formation of unwanted magnetic island chains or stochastic regions within the plasma and alter the boundary rotational transform due to the small shear in W7-X. The latter issue is of relevance since the island divertor operation of W7-X relies on a proper positioning of magnetic island chains at the plasma edge to control the particle and energy exhaust towards the divertor plates. Two scenarios are examined with the new free-plasma-boundary capabilities of SIESTA: a freely evolving bootstrap current scenario that illustrates the difficulties arising from the dislocation of the boundary islands, and a second one in which off-axis electron cyclotron current drive (ECCD) is applied to compensate the effects of the bootstrap current and keep the island divertor configuration intact. SIESTA finds that off-axis ECCD is indeed able to keep the location and phase of the edge magnetic island chain unchanged, but it may also lead to an undesired stochastization of parts of the confined plasma if the EC deposition radial profile becomes too narrow.
ERIC Educational Resources Information Center
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.
Chung, SungWon; Lu, Ying; Henry, Roland G
2006-11-01
Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method (repetition bootstrap) used for DTI analysis performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, the wild bootstrap was proposed, which can be applied without multiple acquisitions. In this paper, two new approaches are introduced, called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and, therefore, better estimates the standard errors. Like the wild bootstrap, the residual bootstrap is applicable to a single-acquisition scheme, and both are based on regression residuals (so-called model-based resampling). The residual bootstrap is based on the assumption that the non-constant variance of the measured diffusion-attenuated signals can be modeled, which is in fact the assumption behind the widely used weighted least squares solution of the diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of the bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that the residual bootstrap has smaller biases and overall errors, which enables estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help us choose the optimal approach for estimating uncertainties, which can benefit hypothesis testing based on DTI parameters, probabilistic fiber tracking, and optimization of DTI methods.
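The residual bootstrap idea, resampling regression residuals rather than raw observations, can be sketched for an ordinary least-squares fit. This is a generic illustration of model-based resampling, not the DTI-specific weighted least-squares machinery of the paper; all names are hypothetical:

```python
import numpy as np

def residual_bootstrap_se(X, y, n_boot=2000, seed=0):
    """Residual bootstrap standard errors for OLS coefficients (sketch)."""
    rng = np.random.default_rng(seed)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    resid = y - fitted
    boot = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        # Add resampled residuals back onto the fitted values and refit:
        # the design matrix stays fixed, only the noise is resampled.
        y_star = fitted + rng.choice(resid, size=len(y), replace=True)
        boot[b], *_ = np.linalg.lstsq(X, y_star, rcond=None)
    return boot.std(axis=0, ddof=1)

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=50)
se = residual_bootstrap_se(X, y)
print(se.shape)  # (2,)
```

Note that this plain version assumes homoscedastic residuals; handling signal-dependent variance (as in diffusion-attenuated data) would require weighting the fit and scaling the residuals accordingly.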
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
Schneider, Kevin; Koblmüller, Stephan; Sefc, Kristina M
2015-11-11
The homoplasy excess test (HET) is a tree-based screen for hybrid taxa in multilocus nuclear phylogenies. Homoplasy between a hybrid taxon and the clades containing the parental taxa reduces bootstrap support in the tree. The HET is based on the expectation that excluding the hybrid taxon from the data set increases the bootstrap support for the parental clades, whereas excluding non-hybrid taxa has little effect on statistical node support. To carry out a HET, bootstrap trees are calculated with taxon-jackknife data sets, that is, excluding one taxon (species, population) at a time. An excess increase in bootstrap support for certain nodes upon exclusion of a particular taxon indicates the hybrid (the excluded taxon) and its parents (the clades with increased support). We introduce a new software program, hext, which generates the taxon-jackknife data sets, runs the bootstrap tree calculations, and identifies excess bootstrap increases as outlier values in boxplot graphs. hext is written in the R language and accepts binary data (0/1; e.g. AFLP) as well as co-dominant SNP and genotype data. We demonstrate the usefulness of hext on large SNP data sets containing putative hybrids and their parents. For instance, using published data for the genus Vitis (~6,000 SNP loci), hext output supports V. × champinii as a hybrid between V. rupestris and V. mustangensis. With simulated SNP and AFLP data sets, excess increases in bootstrap support were not always connected with the hybrid taxon (false positives), whereas the expected bootstrap signal failed to appear on several occasions (false negatives). Potential causes for both types of spurious results are discussed. With both empirical and simulated data sets, the taxon-jackknife output generated by hext provided additional signatures of hybrid taxa, including changes in tree topology across trees, consistent effects of exclusions of the hybrid and the parent taxa, and moderate (rather than excessive) increases in bootstrap support.
hext significantly facilitates the taxon-jackknife approach to hybrid taxon detection, even though the simple test for excess bootstrap increase may not reliably identify hybrid taxa in all applications.
A neural network based reputation bootstrapping approach for service selection
NASA Astrophysics Data System (ADS)
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; however, such methods favour either newcomers or existing services. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping when available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented.
Bootstrap Estimation and Testing for Variance Equality.
ERIC Educational Resources Information Center
Olejnik, Stephen; Algina, James
The purpose of this study was to develop a single procedure for comparing population variances that could be used across distributional forms. Bootstrap methodology was used to estimate the variability of the sample variance statistic when the population distribution was normal, platykurtic, and leptokurtic. The data for the study were generated and…
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water-use requirements, water managers can use such a model to forecast the likelihood, a few months ahead, of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows, conditioned on the current reservoir levels and streamflows. The flow sequences are generated using a stochastic streamflow model with random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and that it allows incorporation of long-range weather forecasts into the analysis.
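The trace-generation step can be sketched as follows. This is a minimal illustration assuming a simple AR(1) monthly flow model, not the actual stochastic streamflow model used in the study; the function name, model form, and thresholds are all hypothetical:

```python
import numpy as np

def bootstrap_position_traces(flows, current_flow, horizon=12, n_traces=1000, seed=0):
    """Generate flow traces by resampling AR(1) innovations (bootstrap position analysis sketch)."""
    rng = np.random.default_rng(seed)
    mu = flows.mean()
    phi = np.corrcoef(flows[:-1], flows[1:])[0, 1]        # lag-1 autocorrelation
    innov = flows[1:] - (mu + phi * (flows[:-1] - mu))    # historical model residuals
    traces = np.empty((n_traces, horizon))
    for t in range(n_traces):
        f = current_flow                                   # condition on current conditions
        for m in range(horizon):
            # Resample a historical innovation instead of drawing a normal deviate,
            # avoiding the normality assumption entirely.
            f = mu + phi * (f - mu) + rng.choice(innov)
            traces[t, m] = f
    return traces

hist = np.random.default_rng(4).gamma(5.0, 20.0, size=240)   # synthetic monthly flows
traces = bootstrap_position_traces(hist, current_flow=60.0)
risk = (traces.min(axis=1) < 50.0).mean()  # fraction of traces breaching a 50-unit passing flow
```

Feeding each trace through the reservoir simulation and tabulating outcomes like `risk` is what turns the ensemble into a probabilistic drought forecast.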
Bootstrap position analysis for forecasting low flow frequency
Tasker, Gary D.; Dunne, P.
1997-01-01
A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff to be used in a position analysis model for a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and accounting for parameter uncertainty is easily done. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules. © ASCE.
Impact of bootstrap current and Landau-fluid closure on ELM crashes and transport
NASA Astrophysics Data System (ADS)
Chen, J. G.; Xu, X. Q.; Ma, C. H.; Lei, Y. A.
2018-05-01
Results presented here are from 6-field Landau-fluid simulations using shifted circular cross-section tokamak equilibria within the BOUT++ framework. Linear benchmark results imply that the collisional and collisionless Landau resonance closures make little difference to the linear growth rate spectra, which are quite close to the results with the flux-limited Spitzer-Härm parallel flux. Both linear and nonlinear simulations show that the plasma current profile plays a dual role in the peeling-ballooning modes: it can drive the low-n peeling modes and stabilize the high-n ballooning modes. For fixed total pressure and current, as the pedestal current decreases due to the bootstrap current becoming smaller when the density (collisionality) increases, the operational point is shifted downwards vertically in the Jped - α diagram, resulting in threshold changes of the different modes. The bootstrap current can slightly increase the radial turbulence spreading range and enhance the energy and particle transport by increasing the perturbed amplitude and broadening the cross-phase frequency distribution.
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, comprehensive assessment of the parameter uncertainties of drought copulas has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods for producing the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett, are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists.
Where an informative prior is unavailable, for small sample sizes (~50) both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100) bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC, regardless of whether an informative prior exists.
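A percentile bootstrap CI of the kind compared here can be sketched as follows (a generic illustration using the sample mean as the estimator; the study's actual copula-parameter estimators are not reproduced):

```python
import random
import statistics

def percentile_ci(sample, estimator, n_boot=2000, alpha=0.10, seed=7):
    """Percentile bootstrap CI (90% when alpha=0.10) for any point estimator."""
    rng = random.Random(seed)
    n = len(sample)
    stats = sorted(
        estimator([rng.choice(sample) for _ in range(n)])
        for _ in range(n_boot)
    )
    return stats[int(n_boot * alpha / 2)], stats[int(n_boot * (1 - alpha / 2)) - 1]

rng = random.Random(3)
data = [rng.lognormvariate(0.0, 0.5) for _ in range(100)]  # skewed, drought-like
lo, hi = percentile_ci(data, statistics.mean)
print(lo, hi)
```

Passing a function that refits a marginal distribution or copula to each resample, in place of `statistics.mean`, gives the parameter CIs the study compares against MCMC.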
Donald B.K. English
2000-01-01
In this paper I use bootstrap procedures to develop confidence intervals for estimates of total industrial output generated per thousand tourist visits. Mean expenditures from replicated visitor expenditure data included weights to correct for response bias. Impacts were estimated with IMPLAN. Ninety percent interval endpoints were 6 to 16 percent above or below the...
NASA Astrophysics Data System (ADS)
Poli, Francesca M.; Kessel, Charles E.
2013-05-01
Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved confinement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current profiles and negative magnetic shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. The time-dependent transport simulations indicate that, with a trade-off of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating configuration.
Some Aspects of Advanced Tokamak Modeling in DIII-D
NASA Astrophysics Data System (ADS)
St John, H. E.; Petty, C. C.; Murakami, M.; Kinsey, J. E.
2000-10-01
We extend previous work (M. Murakami, et al., General Atomics Report GA-A23310 (1999)) on time-dependent DIII-D advanced tokamak simulations by introducing theoretical confinement models rather than relying on power-balance-derived transport coefficients. We explore using NBCD and off-axis ECCD together with a self-consistent, aligned bootstrap current, driven by the internal transport barrier dynamics generated with the GLF23 confinement model, to shape the hollow current profile and to maintain MHD-stable conditions. Our theoretical modeling approach uses measured DIII-D initial conditions to start the simulations in a smooth, consistent manner. This mitigates the troublesome long-lived perturbations in the ohmic current profile that are normally caused by inconsistent initial data. To achieve this goal, our simulation uses a sequence of time-dependent eqdsks generated autonomously by the EFIT MHD equilibrium code in analyzing experimental data to supply the history for the simulation.
Multi-baseline bootstrapping at the Navy precision optical interferometer
NASA Astrophysics Data System (ADS)
Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.
2014-07-01
The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally-spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico
2018-02-01
To show how to generate a consensus sequence from the massive parallel sequencing data obtained in routine HIV anti-retroviral resistance studies, suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GS Junior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, at 10%, 15%, and 20% thresholds. Molecular evolutionary genetics analysis (MEGA) was used for phylogenetic studies. At the 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap value of 88% (IQR 83.5-95.5). Association increased to 36/62 sequences, with a median bootstrap value of 94% (IQR 85.5-98), at the 15% threshold. Maximum association was at the 20% threshold, with 61/62 sequences associated and a median bootstrap value of 99% (IQR 98-100). A safe method is presented to generate consensus sequences from HIV NGS data at the 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
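The thresholding idea can be sketched as follows (a simplified stand-in for the Mesquite workflow: a majority base is kept only when minor variants stay below the threshold, and 'N' is emitted otherwise; real pipelines use IUPAC ambiguity codes and handle indels):

```python
from collections import Counter

def consensus(aligned_reads, threshold=0.20):
    """Call one consensus base per alignment column; columns where minor
    variants reach the threshold frequency are masked with 'N'."""
    n_reads = len(aligned_reads)
    out = []
    for column in zip(*aligned_reads):
        base, count = Counter(column).most_common(1)[0]
        out.append(base if count / n_reads >= 1.0 - threshold else "N")
    return "".join(out)

reads = ["ACGT", "ACGA", "ACGT", "ACGT", "ACGT"]  # minor 'A' at 20% frequency
print(consensus(reads, threshold=0.20))  # 'ACGT': the 80% majority passes
print(consensus(reads, threshold=0.15))  # 'ACGN': 80% < the 85% required
```

At a looser threshold more positions resolve to a definite base, which is consistent with the abstract's observation that phylogenetic association with Sanger sequences was maximal at 20%.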
Bootstrap Current for the Edge Pedestal Plasma in a Diverted Tokamak Geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, S.; Chang, C. S.; Ku, S.
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property compared with the core plasma: it has a narrow passing-particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale lengths are comparable to the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code, XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current over the ion contribution and from a reasonable separation of the trapped-passing dynamics without strong collisional mixing. However, when the pedestal electrons are in the plateau-collisional regime, there is significant deviation of the numerical results from the existing analytic formulas, mainly due to the large effective collisionality of the passing and boundary-layer trapped particles in the edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high.
On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high-field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, a simple modification of the Sauter formula, is obtained to bring the analytic expression into better agreement with the edge kinetic simulation results.
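For orientation, the leading-order, large-aspect-ratio scaling that detailed expressions such as the Sauter formula refine is, schematically (order-unity coefficients and collisionality corrections omitted):

```latex
j_{\mathrm{bs}} \;\sim\; -\,\frac{\sqrt{\epsilon}}{B_\theta}\,\frac{dp}{dr},
```

where $\epsilon = r/R$ is the inverse aspect ratio, $B_\theta$ the poloidal magnetic field, and $dp/dr$ the pressure gradient; the steep pedestal pressure gradient is what makes the edge contribution large.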
ERIC Educational Resources Information Center
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements
NASA Technical Reports Server (NTRS)
Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry;
2013-01-01
The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.
The economics of bootstrapping space industries - Development of an analytic computer model
NASA Technical Reports Server (NTRS)
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of off-Earth transport and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Transport in the plateau regime in a tokamak pedestal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seol, J.; Shaing, K. C.
In a tokamak H-mode, a strong E × B flow shear is generated during the L-H transition. Turbulence in a pedestal is suppressed significantly by this E × B flow shear. In this case, neoclassical transport may become important. The neoclassical fluxes are calculated in the plateau regime with the parallel plasma flow using their kinetic definitions. In an axisymmetric tokamak, the neoclassical particle fluxes can be decomposed into the banana-plateau flux and the Pfirsch-Schlüter flux. The banana-plateau particle flux is driven by the parallel viscous force and the Pfirsch-Schlüter flux by the poloidal variation of the friction force. The combined quantity of the radial electric field and the parallel flow is determined by the flux-surface-averaged parallel momentum balance equation rather than by requiring ambipolarity of the total particle fluxes. In this process, the Pfirsch-Schlüter flux does not appear in the flux-surface-averaged parallel momentum equation. Only the banana-plateau flux is used to determine the parallel flow in the form of the flux-surface-averaged parallel viscosity. The heat flux, obtained using the solution of the parallel momentum balance equation, decreases exponentially in the presence of sonic M_p without any enhancement over that in the standard neoclassical theory. Here, M_p is a combination of the poloidal E × B flow and the parallel mass flow. The neoclassical bootstrap current in the plateau regime is presented. It indicates that the neoclassical bootstrap current is also related only to the banana-plateau fluxes. Finally, transport fluxes are calculated when M_p is large enough to make the parallel electron viscosity comparable to the parallel ion viscosity. It is found that the bootstrap current has a finite value regardless of the magnitude of M_p.
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more successful kriging predictions (9/10) than GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than that of GBM. SBM was tested on datasets from IsoMAP with different numbers of observation sites. We determined that predictions from datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of waters to migratory animals, food products, and humans.
Overview of physics research on the TCV tokamak
NASA Astrophysics Data System (ADS)
Fasoli, A.; TCV Team
2009-10-01
The Tokamak à Configuration Variable (TCV) tokamak is equipped with high-power (4.5 MW), real-time-controllable EC systems and flexible shaping, and plays an important role in fusion research by broadening the parameter range of reactor relevant regimes, by investigating tokamak physics questions and by developing new control tools. Steady-state discharges are achieved, in which the current is entirely self-generated through the bootstrap mechanism, a fundamental ingredient for ITER steady-state operation. The discharge remains quiescent over several current redistribution times, demonstrating that a self-consistent, 'bootstrap-aligned' equilibrium state is possible. Electron internal transport barrier regimes sustained by EC current drive have also been explored. MHD activity is shown to be crucial in scenarios characterized by large and slow oscillations in plasma confinement, which in turn can be modified by small Ohmic current perturbations altering the barrier strength. In studies of the relation between anomalous transport and plasma shape, the observed dependences of the electron thermal diffusivity on triangularity (direct) and collisionality (inverse) are qualitatively reproduced by non-linear gyro-kinetic simulations and shown to be governed by TEM turbulence. Parallel SOL flows are studied for their importance for material migration. Flow profiles are measured using a reciprocating Mach probe by changing from lower to upper single-null diverted equilibria and shifting the plasmas vertically. The dominant, field-direction-dependent Pfirsch-Schlüter component is found to be in good agreement with theoretical predictions. A field-direction-independent component is identified and is consistent with flows generated by transient over-pressure due to ballooning-like interchange turbulence. 
Initial high-resolution infrared images confirm that ELMs have a filamentary structure, while fast, localized radiation measurements reveal that ELM activity first appears in the X-point region. Real time control techniques are currently being applied to EC multiple independent power supplies and beam launchers, e.g. to control the plasma current in fully non-inductive conditions, and the plasma elongation through current broadening by far-off-axis heating at constant shaping field.
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
Hager, Robert; Chang, C. S.
2016-04-08
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator—that conserves mass, momentum, and energy—is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
Investigation of the n = 1 resistive wall modes in the ITER high-mode confinement
NASA Astrophysics Data System (ADS)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2017-06-01
The n = 1 resistive wall mode (RWM) stability of the ITER high-mode confinement is investigated with the bootstrap current included in the equilibrium, together with rotation and diamagnetic drift effects in the stability analysis. Here, n is the toroidal mode number. We use the CORSICA code to compute the free boundary equilibrium and the AEGIS code for stability. We find that the inclusion of the bootstrap current in the equilibrium is critical. It can reduce the local magnetic shear in the pedestal, so that the infernal mode branches can develop. Consequently, the n = 1 modes become unstable without a stabilizing wall at a considerably lower beta limit, driven by the steep pressure gradient in the pedestal. Typical values of the wall position stabilize the ideal mode, but give rise to the 'pedestal' resistive wall modes. We find that the rotation can contribute a stabilizing effect on RWMs and the diamagnetic drift effects can further improve the stability in the co-current rotation case. But, generally speaking, the rotation stabilization effects are not as effective as in the case without the bootstrap current effects on the equilibrium. We also find that the diamagnetic drift effects are actually destabilizing when there is a counter-current rotation.
A condition for small bootstrap current in three-dimensional toroidal configurations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru; Nührenberg, J.; Zille, R.
2016-11-15
It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller than that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna-Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice for those data to contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations arising from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. The methods are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
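The imputation-within-resampling idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' R routines: it uses a naive uniform draw below the detection limit in place of their model-based, log-ratio-aware imputation, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def nondetect_bootstrap_mean(x, detect_limit, n_boot=1000):
    """Bootstrap the mean of a concentration vector containing nondetects.

    Nondetects are encoded as np.nan and, inside every resample, are
    imputed by a random draw below the detection limit, so that their
    uncertainty propagates into the bootstrap distribution.
    (Illustrative uniform imputation only.)
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    means = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(x, size=n, replace=True)           # resampling step
        nd = np.isnan(sample)                                  # nondetect positions
        sample[nd] = rng.uniform(0.0, detect_limit, nd.sum())  # imputation step
        means[b] = sample.mean()
    return means.mean(), np.percentile(means, [2.5, 97.5])

# concentrations (ppm) with two values below a detection limit of 0.5
data = [1.2, 0.8, np.nan, 2.1, 1.5, np.nan, 0.9, 1.1]
est, ci = nondetect_bootstrap_mean(data, detect_limit=0.5)
```

Because the imputation is redone inside every resample, the width of the resulting interval reflects both sampling variability and nondetect uncertainty.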
NASA Astrophysics Data System (ADS)
Cesario, R. C.; Castaldo, C.; Fonseca, A.; De Angelis, R.; Parail, V.; Smeulders, P.; Beurskens, M.; Brix, M.; Calabrò, G.; De Vries, P.; Mailloux, J.; Pericoli, V.; Ravera, G.; Zagorski, R.
2007-09-01
LHCD has been used in JET experiments aimed at producing internal transport barriers (ITBs) in highly triangular plasmas (δ≈0.4) at high βN (up to 3) for steady-state application. LHCD is a potentially valuable tool for (i) modifying the target q-profile, which can help avoid deleterious MHD modes and favour the formation of ITBs, and (ii) contributing to the non-inductive current drive required to prolong such plasma regimes. The q-profile evolution has been simulated during the current ramp-up phase for such a discharge (B0 = 2.3 T, IP = 1.5 MA) in which 2 MW of LHCD was coupled. The JETTO code was used, taking measured plasma profiles and the LHCD profile modelled by the LHstar code. The results are in agreement with MSE measurements and indicate the importance of the elevated electron temperature due to LHCD, as well as the driven current. During main heating with 18 MW of NBI and 3 MW of ICRH the bootstrap current density at the edge also becomes large, consistent with the observed reduction of the local turbulence and of the MHD activity. JETTO modelling suggests that the bootstrap current can reduce the magnetic shear (sh) at large radius, potentially affecting the MHD stability and turbulence behaviour in this region. Keywords: lower hybrid current drive (LHCD), bootstrap current, q (safety factor) and shear (sh) profile evolutions.
Carving out the end of the world or (superconformal bootstrap in six dimensions)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Chi-Ming; Lin, Ying-Hsuan
2017-08-29
We bootstrap N=(1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E8 flavor group, we present universal bounds on the central charge C_T and the flavor central charge C_J. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on C_J, and we numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS_7 × S^4/Z_2.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
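The core pseudo-predator construction referenced above (bootstrap-sample prey signatures, average within prey type, mix by a known diet) can be sketched as follows. This is a simplified illustration under stated assumptions: the prey names and `n_boot` choice are hypothetical, and real QFASA additionally applies calibration coefficients, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

def pseudo_predator(prey_sigs, diet, n_boot):
    """Construct one pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict mapping prey type -> (n_i, k) array of signatures
               (rows sum to 1); diet: dict mapping prey type -> proportion.
    For each prey type, bootstrap-sample n_boot signatures, average them,
    then mix the prey-type means with the known diet proportions.
    """
    k = next(iter(prey_sigs.values())).shape[1]
    mix = np.zeros(k)
    for prey, sigs in prey_sigs.items():
        idx = rng.integers(0, len(sigs), size=n_boot)  # bootstrap sample
        mix += diet[prey] * sigs[idx].mean(axis=0)     # prey-type mean
    return mix / mix.sum()                             # renormalise to a signature

# two hypothetical prey types, four fatty acids each
prey = {
    "seal": rng.dirichlet(np.ones(4), size=30),
    "fish": rng.dirichlet(np.ones(4), size=30),
}
sig = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3}, n_boot=20)
```

The paper's contribution is precisely the objective selection of `n_boot`: too large a value makes pseudo-predators unrealistically smooth, which this sketch leaves as a free parameter.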
Breaking down Barriers: Academic Obstacles of First-Generation Students at Research Universities
ERIC Educational Resources Information Center
Stebleton, Michael J.; Soria, Krista M.
2012-01-01
The purpose of this study was to examine the perceived academic obstacles of first-generation students in comparison to non-first-generation students. Using the Student Experience in the Research University (SERU) completed by approximately 58,000 students from six research universities, the researchers used nonparametric bootstrapping to analyze…
Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors
NASA Astrophysics Data System (ADS)
Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.
2012-12-01
Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for this estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem because of its near-intractability in theory. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving-block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
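The pairwise moving-block bootstrap for correlation mentioned above can be sketched as follows. This is a minimal illustration only: the block length is chosen by hand, and the interval calibration and timescale simulation described in the abstract are not included.

```python
import numpy as np

rng = np.random.default_rng(2)

def pairwise_mbb_corr(x, y, block_len, n_boot=2000):
    """Pairwise moving-block bootstrap CI for the correlation of two
    equally sampled series.

    Blocks of consecutive (x, y) PAIRS are resampled, which preserves
    both the autocorrelation within each series and the cross-correlation
    between them.
    """
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts_max = n - block_len + 1                  # admissible block starts
    r = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, starts_max, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        r[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.percentile(r, [2.5, 97.5])

# two series sharing a slowly varying common component (autocorrelated)
n = 300
z = np.cumsum(rng.normal(size=n)) * 0.1
x = z + rng.normal(size=n)
y = z + rng.normal(size=n)
lo, hi = pairwise_mbb_corr(x, y, block_len=10)
```

Resampling single pairs instead of blocks would ignore serial dependence and typically yield intervals that are too narrow.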
NASA Astrophysics Data System (ADS)
Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore
2018-01-01
In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and the accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both good redundancy and low contamination to be effective. In this paper a resampling strategy based on the bootstrap is proposed as an alternative to RAIM, in order to accurately estimate position in cases of low redundancy and multiple blunders: starting with the pseudorange measurement model, at each epoch the available measurements are bootstrapped, that is, randomly sampled with replacement, and the generated a posteriori empirical distribution is exploited to derive the final position. Compared to the standard bootstrap, in this paper the sampling probabilities are not uniform, but vary according to an indicator of the measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
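The quality-weighted resampling idea can be sketched on a toy linearised measurement model. This is an illustrative stand-in, not the paper's GPS-specific implementation: the geometry, the quality indicator, and the use of a median over resample solutions are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

def weighted_bootstrap_position(A, y, quality, n_boot=500):
    """Non-uniform bootstrap position estimate from a linearised
    measurement model y = A @ x + e.

    Measurements are resampled with replacement, with probabilities
    proportional to a quality indicator, and the final position is the
    median of the per-resample least-squares solutions, so resamples
    containing the blunder have little influence.
    """
    m, k = A.shape
    p = quality / quality.sum()          # non-uniform sampling probabilities
    sols = np.empty((n_boot, k))
    for b in range(n_boot):
        idx = rng.choice(m, size=m, replace=True, p=p)
        sols[b], *_ = np.linalg.lstsq(A[idx], y[idx], rcond=None)
    return np.median(sols, axis=0)       # robust a posteriori estimate

# toy geometry: 8 measurements of a 2-D position, one large blunder
x_true = np.array([3.0, -2.0])
A = rng.normal(size=(8, 2))
y = A @ x_true + rng.normal(scale=0.05, size=8)
y[0] += 50.0                             # multipath-like blunder
quality = np.ones(8)
quality[0] = 0.05                        # downweight the suspect measurement
x_hat = weighted_bootstrap_position(A, y, quality)
```

With the blunder's sampling probability suppressed, most resamples exclude it entirely, which is the mechanism the abstract contrasts with RAIM's exclusion tests.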
Flynn-Evans, Erin E.; Lockley, Steven W.
2016-01-01
Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the contribution of 40 variables in their ability to predict N24HSWD among 127 blind women, classified using urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey, but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%, the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% with some degree of light perception, would be referred for further screening for N24HSWD.
Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421
Finite Beta Boundary Magnetic Fields of NCSX
NASA Astrophysics Data System (ADS)
Grossman, A.; Kaiser, T.; Mioduszewski, P.
2004-11-01
The magnetic field between the plasma surface and the wall of the National Compact Stellarator Experiment (NCSX), which uses quasi-symmetry to combine the best features of the tokamak and stellarator in a low-aspect-ratio configuration, is mapped via field line tracing over a range of finite beta in which part of the rotational transform is generated by the bootstrap current. We adopt the methodology developed for W7-X, in which an equilibrium solution is computed by an inverse equilibrium solver based on an energy minimizing variational moments code, VMEC2000 [1], which solves directly for the shape of the flux surfaces given the external coils and their currents as well as a bootstrap current provided by a separate transport calculation. The VMEC solution and the Biot-Savart vacuum fields are coupled to the magnetic field solver for finite-beta equilibrium (MFBE2001) [2] code to determine the magnetic field on a 3D grid over a computational domain. It is found that the edge plasma is more stellarator-like, with a complex 3D structure, and less like the ordered 2D symmetric structure of a tokamak. The field lines make a transition from ergodically covering a surface to ergodically covering a volume as the distance from the last closed magnetic surface is increased. The results are compared with PIES [3] calculations. [1] S.P. Hirshman et al. Comput. Phys. Commun. 43 (1986) 143. [2] E. Strumberger, et al. Nucl. Fusion 42 (2002) 827. [3] A.H. Reiman and H.S. Greenside, Comput. Phys. Commun. 43, 157 (1986).
Modeling the operating history of turbine-generator units
NASA Astrophysics Data System (ADS)
Szczota, Mickael
Because of their ageing fleet, utility managers increasingly need tools that can help them plan maintenance operations efficiently. Hydro-Québec started a project that aims to predict the degradation of their hydroelectric runners and to use that information to classify the generating units. That classification will help identify which generating units are most at risk of undergoing a major failure. Cracking linked to the fatigue phenomenon is a predominant degradation mode, and the loading sequences applied to the runner are a parameter affecting crack growth. The aim of this thesis is therefore to create a generator able to produce synthetic loading sequences that are statistically equivalent to the observed history. Those simulated sequences will be used as input to a life assessment model. First, we describe how the generating units are operated by Hydro-Québec and analyse the available data; the analysis shows that the data are non-stationary. Then, we review modeling and validation methods. In the following chapter, particular attention is given to a precise description of the validation and comparison procedure. We then present a comparison of three kinds of models: discrete-time Markov chains, discrete-time semi-Markov chains and the moving-block bootstrap. For the first two models, we describe how to account for the non-stationarity. Finally, we show that the Markov chain is not suited to our case, and that semi-Markov chains perform better when they include the non-stationarity. The final choice between semi-Markov chains and the moving-block bootstrap depends on the user, but with a long-term vision we recommend the use of semi-Markov chains for their flexibility. Keywords: Stochastic models, Model validation, Reliability, Semi-Markov Chains, Markov Chains, Bootstrap
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R; Fedonkin, M A
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build-up of the energy flow follows a bootstrapping process similar to that found in the development of computers, the first generation making possible the calculations necessary for constructing the second one, and so on. In the biogenetic start-up of an energy flow, non-metals in the lower periods of the Periodic Table of Elements would have constituted the most primitive systems, their operation being enhanced and later supplanted by elements in the higher periods that demand more energy. This bootstrapping process would place the development of the metabolisms based on the second-period elements carbon, nitrogen and oxygen at the end of the evolutionary process rather than at, or even before, the biogenetic event.
Reference interval computation: which method (not) to choose?
Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C
2012-07-11
When different methods are applied to reference interval (RI) calculation the results can sometimes be substantially different, especially for small reference groups. If no reliable RI data are available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples obtained from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox-transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way of calculating RIs, provided it satisfies a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method.
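The nonparametric bootstrap RI computation compared in the study can be sketched as follows. This is one common bootstrap RI recipe (means of bootstrapped percentiles), shown for illustration only; the paper's exact calculation and the synthetic marker distribution here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def bootstrap_reference_interval(values, n_boot=2000, lo=2.5, hi=97.5):
    """Nonparametric bootstrap estimate of a 95% reference interval.

    Each resample's 2.5th and 97.5th percentiles are computed, and the
    RI bounds are taken as the means of those bootstrap percentiles,
    requiring no distributional assumption on the marker.
    """
    values = np.asarray(values, dtype=float)
    lows = np.empty(n_boot)
    highs = np.empty(n_boot)
    for b in range(n_boot):
        s = rng.choice(values, size=len(values), replace=True)
        lows[b], highs[b] = np.percentile(s, [lo, hi])
    return lows.mean(), highs.mean()

# skewed synthetic "marker" sample of 120 reference subjects
sample = rng.lognormal(mean=0.0, sigma=0.4, size=120)
ri_lo, ri_hi = bootstrap_reference_interval(sample)
```

The skewed example shows why the study favours this route when normality fails: the bootstrap bounds track the empirical tails rather than a fitted Gaussian.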
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution
Imai, Mutsumi; Kita, Sotaro
2014-01-01
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666
Impurities in a non-axisymmetric plasma. Transport and effect on bootstrap current
Mollén, A.; Landreman, M.; Smith, H. M.; ...
2015-11-20
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21 (2014) 042503], which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/nu scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z_eff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
The prospects for magnetohydrodynamic stability in advanced tokamak regimes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manickam, J.; Chance, M.S.; Jardin, S.C.
1994-05-01
Stability analysis of advanced regime tokamaks is presented. Here advanced regimes are defined to include configurations where the ratio of the bootstrap current, I_BS, to the total plasma current, I_p
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also synthetic but follows a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
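The LU-decomposition step for generating spatially correlated resamples can be sketched as follows. This is only the unconditional-simulation kernel under an assumed exponential covariance model (here via the Cholesky factor, the symmetric case of LU); the paper's generalized bootstrap wraps this in normal-score transforms and semivariogram re-estimation on each resample, which are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

def correlated_resamples(coords, sill, range_par, n_sims):
    """Generate Gaussian resamples honouring an exponential covariance
    model C(h) = sill * exp(-3h / range_par).

    If C = L L^T (Cholesky/LU decomposition) and z ~ N(0, I), then
    L @ z has covariance C, i.e. each simulated field reproduces the
    target spatial correlation.
    """
    h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = sill * np.exp(-3.0 * h / range_par)          # covariance from semivariogram
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(coords)))  # jitter for stability
    z = rng.standard_normal((len(coords), n_sims))
    return (L @ z).T                                 # each row: one correlated field

coords = rng.uniform(0, 10, size=(50, 2))            # 50 random sample locations
sims = correlated_resamples(coords, sill=2.0, range_par=4.0, n_sims=100)
```

Re-estimating the semivariogram on each simulated field is what yields the (typically skewed) bootstrap distributions of semivariogram values the abstract describes.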
A bootstrapping method for development of Treebank
NASA Astrophysics Data System (ADS)
Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.
2017-01-01
Using statistical approaches alongside the traditional methods of natural language processing can significantly improve both the quality and the performance of several natural language processing (NLP) tasks. The effective usage of these approaches is subject to the availability of informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and rich linguistically motivated elementary structure called a supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) corpus in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method can automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated from the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].
Porter, Teresita M; Gibson, Joel F; Shokralla, Shadi; Baird, Donald J; Golding, G Brian; Hajibabaei, Mehrdad
2014-01-01
Current methods to identify unknown insect (class Insecta) cytochrome c oxidase (COI barcode) sequences often rely on thresholds of distances that can be difficult to define, sequence similarity cut-offs, or monophyly. Some of the most commonly used metagenomic classification methods do not provide a measure of confidence for the taxonomic assignments they provide. The aim of this study was to use a naïve Bayesian classifier (Wang et al. Applied and Environmental Microbiology, 2007; 73: 5261) to automate taxonomic assignments for large batches of insect COI sequences, such as data obtained from high-throughput environmental sequencing. This method provides rank-flexible taxonomic assignments with an associated bootstrap support value, and it is faster than the BLAST-based methods commonly used in environmental sequence surveys. We have developed and rigorously tested the performance of three different training sets using leave-one-out cross-validation, two field data sets, and targeted testing of Lepidoptera, Diptera and Mantodea sequences obtained from the Barcode of Life Data system. We found that type I error rates, incorrect taxonomic assignments with high bootstrap support, were already relatively low but could be lowered further by ensuring that all query taxa are actually present in the reference database. Choosing bootstrap support cut-offs according to query length and summarizing taxonomic assignments to more inclusive ranks can also help to reduce error while retaining the maximum number of assignments. Additionally, we highlight gaps in the taxonomic and geographic representation of insects in public sequence databases that will require further work by taxonomists to improve the quality of assignments generated using any method.
NASA Astrophysics Data System (ADS)
Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; CFETR Physics Team
2018-04-01
Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyro-kinetic simulation. This model is demonstrated to accurately model low-Ip discharges from the EAST tokamak. Time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from the equilibrium reconstruction with Motional Stark effect constraints. The predicted evolution of βN, li, and βP also agrees well with the experiments. For the ITER-like cases, the predicted electron and ion temperature profiles using TGLF_Sat0 agree closely with the experimentally measured profiles, and are demonstrably better than other proposed transport models. For the high bootstrap current case, the predicted electron and ion temperature profiles perform better with the VX model. It is found that the SAT0 model works well at high IP (>0.76 MA) while the VX model covers a wider range of plasma current (IP > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.
Namazi-Rad, Mohammad-Reza; Mokhtarian, Payam; Perez, Pascal
2014-01-01
Generating a reliable computer-simulated synthetic population is necessary for knowledge processing and decision-making analysis in agent-based systems in order to measure, interpret and describe each target area and the human activity patterns within it. In this paper, both synthetic reconstruction (SR) and combinatorial optimisation (CO) techniques are discussed for generating a reliable synthetic population for a certain geographic region (in Australia) using aggregated- and disaggregated-level information available for such an area. A CO algorithm using the quadratic function of population estimators is presented in this paper in order to generate a synthetic population while considering a two-fold nested structure for the individuals and households within the target areas. The baseline population in this study is generated from the confidentialised unit record files (CURFs) and 2006 Australian census tables. The dynamics of the created population is then projected over five years using a dynamic micro-simulation model for individual- and household-level demographic transitions. This projection is then compared with the 2011 Australian census. A prediction interval is provided for the population estimates obtained by the bootstrapping method, by which the variability structure of a predictor can be replicated in a bootstrap distribution. PMID:24733522
2014-01-01
Background Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
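The Bayesian interpretation of the bootstrap referenced above is Rubin's Bayesian bootstrap, in which multinomial resampling counts are replaced by Dirichlet(1, ..., 1) weights over the observed patients. A minimal numpy sketch under invented data (the per-patient effect values and all names are hypothetical, and no external evidence is incorporated here):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical per-patient incremental effectiveness from an RCT arm comparison.
delta_effect = rng.normal(0.3, 1.0, size=200)

def bayesian_bootstrap_mean(x, n_draws=5000, rng=None):
    """Rubin's Bayesian bootstrap: Dirichlet(1,...,1) weights instead of counts."""
    rng = rng or np.random.default_rng()
    w = rng.dirichlet(np.ones(x.size), size=n_draws)  # posterior weight draws
    return w @ x  # posterior draws of the weighted mean

posterior = bayesian_bootstrap_mean(delta_effect, rng=np.random.default_rng(7))
```

In the paper's extension, external evidence would reweight or reject these draws; the sketch shows only the base sampler.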
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture.
Bootstrapping the (A1, A2) Argyres-Douglas theory
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro
2018-03-01
We apply bootstrap techniques in order to constrain the CFT data of the (A1, A2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
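The bias-corrected bootstrap interval used in the study above can be sketched as follows. This is a generic illustration, not the authors' procedure: it implements only the bias correction (omitting the acceleration term of the full BCa interval), and the skewed sample standing in for the non-normal KAI scores is invented.

```python
import numpy as np
from statistics import NormalDist

def bc_bootstrap_ci(x, stat, n_boot=4000, alpha=0.05, rng=None):
    """Bias-corrected percentile bootstrap CI (BCa without the acceleration term)."""
    nd = NormalDist()
    rng = rng or np.random.default_rng()
    theta = stat(x)
    reps = np.array([stat(rng.choice(x, x.size, replace=True))
                     for _ in range(n_boot)])
    # Bias-correction factor z0 from the fraction of replicates below the estimate.
    p = min(max(float(np.mean(reps < theta)), 1e-6), 1 - 1e-6)
    z0 = nd.inv_cdf(p)
    za = nd.inv_cdf(alpha / 2)
    lo_q = nd.cdf(2 * z0 + za)   # adjusted lower quantile
    hi_q = nd.cdf(2 * z0 - za)   # adjusted upper quantile
    return np.quantile(reps, [lo_q, hi_q])

rng = np.random.default_rng(3)
scores = rng.lognormal(mean=0.0, sigma=0.6, size=356)  # skewed, non-normal sample
lo, hi = bc_bootstrap_ci(scores, np.mean, rng=np.random.default_rng(4))
```

The bias correction shifts the percentile endpoints when the bootstrap distribution is not centered on the estimate, which is the relevant robustness property under violated normality assumptions.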
Edge Currents and Stability in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, D M; Fenstermacher, M E; Finkenthal, D K
2004-12-01
Understanding the stability physics of the H-mode pedestal in tokamak devices requires an accurate measurement of plasma current in the pedestal region with good spatial resolution. Theoretically, the high pressure gradients achieved in the edge of H-mode plasmas should lead to generation of a significant edge current density peak through bootstrap and Pfirsch-Schlüter effects. This edge current is important for the achievement of second stability in the context of coupled magnetohydrodynamic (MHD) modes which are both pressure (ballooning) and current (peeling) driven. Many aspects of edge localized mode (ELM) behavior can be accounted for in terms of an edge current density peak, with the identification of Type 1 ELMs as intermediate-n toroidal mode number MHD modes being a natural feature of this model. The development of the edge localized instabilities in tokamak experiments code (ELITE) based on this model allows one to efficiently calculate the stability and growth of the relevant modes for a broad range of plasma parameters and thus provides a framework for understanding the limits on pedestal height. This, however, requires an accurate assessment of the edge current. While estimates of j_edge can be made based on specific bootstrap models, their validity may be limited in the edge (gradient scale lengths comparable to orbit size, large changes in collisionality, etc.). Therefore it is highly desirable to have an actual measurement. Such measurements have been made on the DIII-D tokamak using combined polarimetry and spectroscopy of an injected lithium beam. By analyzing one of the Zeeman-split 2S-2P lithium resonance line components, one can obtain direct information on the local magnetic field components. These values allow one to infer details of the edge current density.
Because of the negligible Stark mixing of the relevant atomic levels in lithium, this method of determining j(r) is insensitive to the large local electric fields typically found in enhanced confinement (H-mode) edges, and thus avoids an ambiguity common to MSE measurements of B_pol.
Causality constraints in conformal field theory
Hartman, Thomas; Jain, Sachin; Kundu, Sandipan
2016-05-17
Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d-dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well-known sign constraint on the (∂φ)^4 coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.
Conformal Bootstrap in Mellin Space
NASA Astrophysics Data System (ADS)
Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda
2017-02-01
We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.
McClenaghan, Joseph; Garofalo, Andrea M.; Meneghini, Orso; ...
2017-08-03
In this study, transport modeling of a proposed ITER steady-state scenario based on DIII-D high poloidal-beta (β_p) discharges finds that ITB formation can occur with either sufficient rotation or a negative central shear q-profile. The high-β_p scenario is characterized by a large bootstrap current fraction (80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier which is associated with excellent normalized confinement. Modeling predictions of the electron transport in the high-β_p scenario improve as q_95 approaches levels similar to typical existing models of ITER steady-state, and the ion transport is turbulence dominated. Typical temperature and density profiles from the non-inductive high-β_p scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving a Q = 5 steady-state fusion gain in ITER with 'day one' heating and current drive capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. The high bootstrap fraction, high-β_p scenario is found to be near an ITB formation threshold, and either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to achieve Q = 5.
Transport Barriers in Bootstrap Driven Tokamaks
NASA Astrophysics Data System (ADS)
Staebler, Gary
2017-10-01
Maximizing the bootstrap current in a tokamak, so that it drives a high fraction of the total current, reduces the external power required to drive current by other means. Improved energy confinement, relative to empirical scaling laws, enables a reactor to more fully take advantage of the bootstrap driven tokamak. Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is due to the suppression of turbulence primarily by the large Shafranov shift. ExB velocity shear does not play a significant role in the transport barrier due to the high safety factor. It will be shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift driven barrier formation. The ion energy transport is reduced to neoclassical, and electron energy and particle transport is reduced, but still turbulent, within the barrier. Deeper into the plasma, very large levels of electron transport are observed. The observed electron temperature profile is shown to be close to the threshold for the electron temperature gradient (ETG) mode. A large ETG driven energy transport is qualitatively consistent with recent multi-scale gyrokinetic simulations showing that reducing the ion scale turbulence can lead to a large increase in the electron scale transport. A new saturation model for the quasilinear TGLF transport code, that fits these multi-scale gyrokinetic simulations, can match the data if the impact of zonal flow mixing on the ETG modes is reduced at high safety factor. This work was supported by the U.S. Department of Energy under DE-FG02-95ER54309 and DE-FC02-04ER54698.
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
NASA Technical Reports Server (NTRS)
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-01-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify, and in many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method, commonly called the bootstrap method in the literature, because it requires repeating the original data analysis procedure several times using replicate datasets. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are in the vast majority small and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs, they are larger than the acceleration itself.
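The procedure described above, refitting resampled height-time measurements and taking the spread of the fitted coefficients as the error estimate, can be sketched as follows. The height-time points, the quadratic model, and the function names are illustrative assumptions, not the SOHO/LASCO catalog analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical height-time points for one CME (height in solar radii, time in hours).
t = np.linspace(0, 6, 15)
h = 3.0 + 1.4 * t + 0.05 * t**2 + rng.normal(0, 0.15, t.size)

def bootstrap_fit_errors(t, h, n_boot=2000, rng=None):
    """Refit a quadratic height-time model to resampled points; the spread of the
    linear and quadratic coefficients estimates velocity/acceleration errors."""
    rng = rng or np.random.default_rng()
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, t.size, t.size)  # resample measurements with replacement
        coefs.append(np.polyfit(t[idx], h[idx], deg=2))
    coefs = np.array(coefs)
    accel_sd = 2 * coefs[:, 0].std()  # h = c0*t^2 + c1*t + c2, so a = 2*c0
    veloc_sd = coefs[:, 1].std()
    return veloc_sd, accel_sd

v_err, a_err = bootstrap_fit_errors(t, h, rng=np.random.default_rng(1))
```

With few height-time points, the resampled fits scatter widely, which is the mechanism behind the abstract's finding that acceleration errors often exceed the acceleration itself.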
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
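A hedged sketch of the proposed test: compute a distance statistic (here the Euclidean distance between normalized summary histograms) and obtain a significance level by resampling from the pooled data under the null of no difference. The gamma-distributed stand-ins for two cloud-object size categories are invented for illustration.

```python
import numpy as np

def hist_euclidean(a, b, bins):
    """Euclidean distance between two normalized histograms of samples a and b."""
    pa, _ = np.histogram(a, bins=bins)
    pb, _ = np.histogram(b, bins=bins)
    pa = pa / pa.sum()
    pb = pb / pb.sum()
    return np.sqrt(((pa - pb) ** 2).sum())

def bootstrap_pvalue(a, b, bins, n_boot=1000, rng=None):
    """Null hypothesis: both summary histograms come from one pooled population."""
    rng = rng or np.random.default_rng()
    observed = hist_euclidean(a, b, bins)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_boot):
        ra = rng.choice(pooled, a.size, replace=True)
        rb = rng.choice(pooled, b.size, replace=True)
        count += hist_euclidean(ra, rb, bins) >= observed
    return count / n_boot

rng = np.random.default_rng(5)
small = rng.gamma(2.0, 1.0, 400)  # stand-ins for two cloud-object size categories
large = rng.gamma(2.5, 1.0, 400)
bins = np.linspace(0, 12, 25)
p = bootstrap_pvalue(small, large, bins, rng=np.random.default_rng(6))
```

Swapping `hist_euclidean` for a Jeffries-Matusita or Kuiper distance changes only the statistic, not the resampling logic, which matches the abstract's observation that the three choices behave similarly.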
Image analysis of representative food structures: application of the bootstrap method.
Ramírez, Cristian; Germain, Juan C; Aguilera, José M
2009-08-01
Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of the bootstrap, taking sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between the CV(image) and the CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of the whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of the whole image) became necessary. This suggests that a large dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
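The dependence of CV(Bn) on sampling-area size can be reproduced on a synthetic binary image. This sketch uses random pixels rather than circles or apple-tissue cavities, so only the qualitative trend (larger sub-areas give a smaller coefficient of variation) carries over; every parameter is invented.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 1000 x 1000 binary "microstructure": 1 = element, 0 = background.
image = (rng.random((1000, 1000)) < 0.15).astype(np.uint8)

def subarea_cv(image, size, n_samples=300, rng=None):
    """CV of the covered-area fraction over randomly placed size x size windows."""
    rng = rng or np.random.default_rng()
    h, w = image.shape
    fracs = np.empty(n_samples)
    for i in range(n_samples):
        r = rng.integers(0, h - size + 1)
        c = rng.integers(0, w - size + 1)
        fracs[i] = image[r:r + size, c:c + size].mean()
    return fracs.std() / fracs.mean()

cv_small = subarea_cv(image, 50, rng=np.random.default_rng(8))
cv_large = subarea_cv(image, 400, rng=np.random.default_rng(9))
```

Plotting the CV against window size for a real micrograph would give the sampling-area curve the paper uses to choose a sufficient field of view.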
Reliability of dose volume constraint inference from clinical data.
Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M
2017-04-21
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort sizes 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
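A minimal sketch of the inference-frequency idea: resample a cohort, refit a one-covariate logistic model for each candidate DVHP, and record which candidate is most predictive in each resample. The V20/V30/V40 covariates, the postulated model, and the Newton fitter are simplified, invented stand-ins for the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 500
# Hypothetical per-patient lung volume fractions receiving >= 20, 30, 40 Gy.
V = np.clip(rng.normal([0.35, 0.25, 0.15], 0.1, (n, 3)), 0.01, 0.9)
# Postulated model: pneumonitis risk driven by V30 (column 1).
p_true = 1 / (1 + np.exp(-(-2.0 + 4.0 * V[:, 1])))
y = rng.random(n) < p_true

def loglik(x, y):
    """Max log-likelihood of a 1-covariate logistic model via Newton iterations."""
    X = np.column_stack([np.ones_like(x), x])
    b = np.zeros(2)
    for _ in range(25):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)
        b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    p = 1 / (1 + np.exp(-X @ b))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def selection_frequency(V, y, n_boot=200, rng=None):
    """How often each candidate DVHP is the most predictive one across resamples."""
    rng = rng or np.random.default_rng()
    wins = np.zeros(V.shape[1])
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        lls = [loglik(V[idx, j], y[idx]) for j in range(V.shape[1])]
        wins[int(np.argmax(lls))] += 1
    return wins / n_boot

freq = selection_frequency(V, y, rng=np.random.default_rng(11))
```

In this easy synthetic case the postulated DVHP wins almost every resample; with the correlated dose metrics of real cohorts the frequencies spread out, which is the instability the paper quantifies.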
Limitations of bootstrap current models
Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...
2014-03-27
We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans as well as core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling, an approximation inherent to both analytic models, is quantified. Moreover, the implications of the corrections from kinetic NEO simulations for MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.
Joint DIII-D/EAST Experiments Toward Steady State AT Demonstration
NASA Astrophysics Data System (ADS)
Garofalo, A. M.; Meneghini, O.; Staebler, G. M.; van Zeeland, M. A.; Gong, X.; Ding, S.; Qian, J.; Ren, Q.; Xu, G.; Grierson, B. A.; Solomon, W. M.; Holcomb, C. T.
2015-11-01
Joint DIII-D/EAST experiments on fully noninductive operation at high poloidal beta have demonstrated several attractive features of this regime for a steady-state fusion reactor. Very large bootstrap fraction (>80 %) is desirable because it reduces the demands on external noninductive current drive. High bootstrap fraction with an H-mode edge results in a broad current profile and internal transport barriers (ITBs) at large minor radius, leading to high normalized energy confinement and high MHD stability limits. The ITB radius expands with higher normalized beta, further improving both stability and confinement. Electron density ITB and large Shafranov shift lead to low AE activity in the plasma core and low anomalous fast ion losses. Both the ITB and the current profile show remarkable robustness against perturbations, without external control. Supported by US DOE under DE-FC02-04ER54698, DE-AC02-09CH11466 & DE-AC52-07NA27344 & by NMCFSP under contracts 2015GB102000 and 2015GB110001.
Transport barriers in bootstrap-driven tokamaks
NASA Astrophysics Data System (ADS)
Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.
2018-05-01
Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.
New Methods for Estimating Seasonal Potential Climate Predictability
NASA Astrophysics Data System (ADS)
Feng, Xia
This study develops two new statistical approaches to assess the seasonal potential predictability of the observed climate variables. One is the univariate analysis of covariance (ANOCOVA) model, a combination of autoregressive (AR) model and analysis of variance (ANOVA). It has the advantage of taking into account the uncertainty of the estimated parameter due to sampling errors in statistical test, which is often neglected in AR based methods, and accounting for daily autocorrelation that is not considered in traditional ANOVA. In the ANOCOVA model, the seasonal signals arising from external forcing are determined to be identical or not to assess any interannual variability that may exist is potentially predictable. The bootstrap is an attractive alternative method that requires no hypothesis model and is available no matter how mathematically complicated the parameter estimator. This method builds up the empirical distribution of the interannual variance from the resamplings drawn with replacement from the given sample, in which the only predictability in seasonal means arises from the weather noise. These two methods are applied to temperature and water cycle components including precipitation and evaporation, to measure the extent to which the interannual variance of seasonal means exceeds the unpredictable weather noise compared with the previous methods, including Leith-Shukla-Gutzler (LSG), Madden, and Katz. The potential predictability of temperature from ANOCOVA model, bootstrap, LSG and Madden exhibits a pronounced tropical-extratropical contrast with much larger predictability in the tropics dominated by El Nino/Southern Oscillation (ENSO) than in higher latitudes where strong internal variability lowers predictability. Bootstrap tends to display highest predictability of the four methods, ANOCOVA lies in the middle, while LSG and Madden appear to generate lower predictability. 
Seasonal precipitation predictability from ANOCOVA, the bootstrap, and Katz resembles that for temperature: more predictable over tropical regions and less predictable in the extratropics. The bootstrap and ANOCOVA are in good agreement with each other, both generating larger predictability than Katz. The seasonal predictability of evaporation over land bears considerable similarity to that of temperature using ANOCOVA, the bootstrap, LSG, and Madden. Remote SST forcing and soil moisture reveal substantial seasonality in their relations with the potentially predictable seasonal signals. For selected regions, SST, soil moisture, or both show significant relationships with the predictable signals, providing indirect insight into the slowly varying boundary processes that enable useful seasonal climate prediction. A multivariate analysis of covariance (MANOCOVA) model is established to identify distinctive predictable patterns that are uncorrelated with each other. Generally speaking, the seasonal predictability from the multivariate model is consistent with that from ANOCOVA. Besides unveiling the spatial variability of predictability, the MANOCOVA model also reveals the temporal variability of each predictable pattern, which could be linked to periodic oscillations.
Willem W.S. van Hees
2002-01-01
Comparisons of estimated standard error for a ratio-of-means (ROM) estimator are presented for forest resource inventories conducted in southeast Alaska between 1995 and 2000. Estimated standard errors for the ROM were generated by using a traditional variance estimator and also approximated by bootstrap methods. Estimates of standard error generated by both...
Kappa statistic for the clustered dichotomous responses from physicians and patients
Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L.; Cai, Jianwen
2013-01-01
The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that, with at least a moderately large number of clusters, the proposed bootstrap method produces a better estimate of the standard error and better coverage performance than the asymptotic standard error estimate that ignores dependence among patients within physicians. An example of an application to a coronary heart disease prevention study is presented. PMID:23533082
Jiang, Wenyu; Simon, Richard
2007-12-20
This paper first provides a critical review of some existing methods for estimating the prediction error in classifying microarray data, where the number of genes greatly exceeds the number of specimens. Special attention is given to the bootstrap-related methods. When the sample size n is small, we find that all the reviewed methods suffer from either substantial bias or variability. We introduce a repeated leave-one-out bootstrap (RLOOB) method that predicts for each specimen in the sample using bootstrap learning sets of size ln. We then propose an adjusted bootstrap (ABS) method that fits a learning curve to the RLOOB estimates calculated with different bootstrap learning set sizes. The ABS method is robust across the situations we investigate and provides a slightly conservative estimate of the prediction error. Even with small samples, it does not suffer from the large upward bias of the leave-one-out bootstrap and the 0.632+ bootstrap, nor from the large variability of leave-one-out cross-validation in microarray applications. Copyright (c) 2007 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati
2007-01-01
Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…
Theodoratou, Evropi; Farrington, Susan M; Tenesa, Albert; McNeill, Geraldine; Cetnarskyj, Roseanne; Korakakis, Emmanouil; Din, Farhat V N; Porteous, Mary E; Dunlop, Malcolm G; Campbell, Harry
2014-01-01
Colorectal cancer (CRC) accounts for 9.7% of all cancer cases and for 8% of all cancer-related deaths. Established risk factors include personal or family history of CRC as well as lifestyle and dietary factors. We investigated the relationship between CRC and demographic, lifestyle, food and nutrient risk factors through a case-control study that included 2062 patients and 2776 controls from Scotland. Forward and backward stepwise regression was applied and the stability of the models was assessed in 1000 bootstrap samples. The variables that were automatically selected to be included by the forward or backward stepwise regression and whose selection was verified by bootstrap sampling in the current study were family history, dietary energy, 'high-energy snack foods', eggs, juice, sugar-sweetened beverages and white fish (associated with an increased CRC risk) and NSAIDs, coffee and magnesium (associated with a decreased CRC risk). Application of forward and backward stepwise regression in this CRC study identified some already established as well as some novel potential risk factors. Bootstrap findings suggest that examination of the stability of regression models by bootstrap sampling is useful in the interpretation of study findings. 'High-energy snack foods' and high-energy drinks (including sugar-sweetened beverages and fruit juices) as risk factors for CRC have not been reported previously and merit further investigation as such snacks and beverages are important contributors in European and North American diets.
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
2003-06-01
however, and the current "second generation" of microkernel implementations has resulted in significantly better performance. Of note is the L4 microkernel.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
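The subspace trick described in this abstract can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the authors' implementation; the sizes and variable names here are ours:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5000, 40                      # many measurements, few subjects
X = rng.normal(size=(p, n))
X -= X.mean(axis=1, keepdims=True)   # center each measurement across subjects

# One-time SVD: X = U @ diag(d) @ Vt. Every bootstrap sample of subjects
# stays inside the n-dimensional column space of U, so we only ever
# resample the small n x n coordinate matrix R = diag(d) @ Vt.
U, d, Vt = np.linalg.svd(X, full_matrices=False)
R = d[:, None] * Vt                  # low-dimensional coordinates (n x n)

def bootstrap_leading_eigencoords(R, rng):
    """PCA of one bootstrap sample, computed entirely in n dimensions."""
    Rb = R[:, rng.integers(0, R.shape[1], size=R.shape[1])]  # resample subjects
    Rb = Rb - Rb.mean(axis=1, keepdims=True)
    A, s, _ = np.linalg.svd(Rb, full_matrices=False)
    return A, s          # p-dimensional components are U @ A, if ever needed

# Bootstrap distribution of the leading singular value, never touching p dims
lead = [bootstrap_leading_eigencoords(R, rng)[1][0] for _ in range(200)]
```

Uncertainty metrics for scores or loadings follow from the bootstrap distribution of the low-dimensional coordinates A, mapped back through the fixed basis U only when a p-dimensional quantity is actually needed.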
Bootstrapping the O(N) archipelago
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kos, Filip; Poland, David; Simmons-Duffin, David
2015-11-17
We study 3d CFTs with an O(N) global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension O(N) vector Φ_i and the lowest dimension O(N) singlet s, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions (Δ_Φ, Δ_s) to lie inside small islands. Here, we also make rigorous determinations of current two-point functions in the O(2) and O(3) models, with applications to transport in condensed matter systems.
A 'bootstrapped' Teaching/Learning Procedure
NASA Astrophysics Data System (ADS)
Odusina Odusote, Olusogo
1998-04-01
Erasing preconceived antiphysics ideas held by nonscience/nonmajor physics students has elicited diverse teaching methods. Introductory general physics courses at the college level have been taught by a 'bootstrap' approach: a concise treatment of the syllabus by the teacher in about half of the course duration, with brief exercises and examples. Students are then introduced to real-life situations (toys, home appliances, sports, disasters, etc.) and the embedded physics concepts are discussed. Usually this generates a feeling of deja vu, which elicits a desire for more. Each application usually encompasses topics spanning a broad range of the syllabus. The other half of the course is used by students to work individually or in groups on assigned and graded homework and essays, with guidance from the lecture notes and the teacher/supervisor. An end-of-course examination shows an increase in the success rate.
Kappa statistic for clustered dichotomous responses from physicians and patients.
Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L; Cai, Jianwen
2013-09-20
The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that, with at least a moderately large number of clusters, the proposed bootstrap method produces a better estimate of the standard error and better coverage performance compared with the asymptotic standard error estimate that ignores dependence among patients within physicians. We present an example of an application to a coronary heart disease prevention study. Copyright © 2013 John Wiley & Sons, Ltd.
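The cluster bootstrap evaluated here resamples whole physician clusters with replacement, so within-cluster dependence is preserved in every resample. A minimal sketch follows; the kappa computation and the data layout are our own simplification, not the authors' code:

```python
import numpy as np

def kappa(phys, pat):
    """Cohen's kappa for paired dichotomous (0/1) ratings."""
    phys, pat = np.asarray(phys), np.asarray(pat)
    po = np.mean(phys == pat)                    # observed agreement
    p1, p2 = phys.mean(), pat.mean()
    pe = p1 * p2 + (1 - p1) * (1 - p2)           # chance agreement
    return (po - pe) / (1 - pe)

def cluster_bootstrap_se(data, n_boot=1000, seed=0):
    """SE of kappa by resampling clusters (physicians) with replacement.

    `data` maps physician id -> (physician_ratings, patient_ratings);
    resampling whole clusters keeps within-physician dependence intact.
    """
    rng = np.random.default_rng(seed)
    ids = list(data)
    stats = []
    for _ in range(n_boot):
        pick = rng.choice(len(ids), size=len(ids), replace=True)
        phys = np.concatenate([data[ids[i]][0] for i in pick])
        pat = np.concatenate([data[ids[i]][1] for i in pick])
        stats.append(kappa(phys, pat))
    return float(np.std(stats, ddof=1))
```

The standard error is simply the standard deviation of the kappa statistic across cluster resamples, which is the quantity the abstract compares against the asymptotic estimate.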
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
Tests of Independence for Ordinal Data Using Bootstrap.
ERIC Educational Resources Information Center
Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai
1998-01-01
Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)
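The record above gives no implementation detail, but one common form of such a bootstrap test, resampling cell counts under the independence null built from the observed marginals, can be sketched as follows. This is our construction for illustration, not necessarily the authors' exact procedure:

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for a two-way contingency table."""
    table = np.asarray(table, float)
    exp = np.outer(table.sum(1), table.sum(0)) / table.sum()
    return ((table - exp) ** 2 / exp).sum()

def bootstrap_independence_test(table, n_boot=2000, seed=0):
    """Bootstrap p-value: resample tables under independence
    (product of observed marginals) and compare test statistics."""
    rng = np.random.default_rng(seed)
    table = np.asarray(table)
    n = table.sum()
    p0 = np.outer(table.sum(1) / n, table.sum(0) / n).ravel()  # null cell probs
    obs = chi2_stat(table)
    null = [chi2_stat(rng.multinomial(n, p0).reshape(table.shape))
            for _ in range(n_boot)]
    return float(np.mean(np.array(null) >= obs))
```

The p-value is the fraction of null resamples whose statistic is at least as extreme as the observed one; Type I error and power can then be studied by Monte Carlo, as the abstract describes.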
Bootstraps: Federal Trio Programs, if Funded, Could Help Close Income Gap
ERIC Educational Resources Information Center
Jean, Reggie
2011-01-01
Since 1964, the federal government has had two successful programs that have helped Americans from low-income and first-generation college backgrounds (whose parents never enrolled in higher education) prepare for and earn their college degrees, helping to stop the cycle of poverty. The federally funded TRIO programs (Upward Bound, Veterans Upward…
Matthew Parks; Richard Cronn; Aaron Liston
2009-01-01
We reconstruct the infrageneric phylogeny of Pinus from 37 nearly complete chloroplast genomes (averaging 109 kilobases each of an approximately 120 kilobase genome) generated using multiplexed massively parallel sequencing. We found that 30/33 ingroup nodes resolved with > 95-percent bootstrap support; this is a substantial improvement relative...
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
Because result replicability is essential to science and difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
DOE Office of Scientific and Technical Information (OSTI.GOV)
G.Y. Fu; L.P. Ku; M.H. Redi
A key issue for compact stellarators is the stability of beta-limiting MHD modes, such as external kink modes driven by bootstrap current and pressure gradient. We report here recent progress in MHD stability studies for low-aspect-ratio Quasi-Axisymmetric Stellarators (QAS) and Quasi-Omnigeneous Stellarators (QOS). We find that the N = 0 periodicity-preserving vertical mode is significantly more stable in stellarators than in tokamaks because of the externally generated rotational transform. It is shown that both low-n external kink modes and high-n ballooning modes can be stabilized at high beta by appropriate 3D shaping without a conducting wall. The stabilization mechanism for external kink modes in QAS appears to be an enhancement of local magnetic shear due to 3D shaping. The stabilization of the ballooning mode in QOS is related to a shortening of the normal curvature connection length.
Realizing Steady State Tokamak Operation for Fusion Energy
NASA Astrophysics Data System (ADS)
Luce, T. C.
2009-11-01
Continuous operation of a tokamak for fusion energy has obvious engineering advantages, but also presents physics challenges beyond the achievement of conditions needed for a burning plasma. The power from fusion reactions and external sources must support both the pressure and the current equilibrium without inductive current drive, leading to demands on stability, confinement, current drive, and plasma-wall interactions that exceed those for pulsed tokamaks. These conditions have been met individually in the present generation of tokamaks, and significant progress has been made in the last decade to realize scenarios where the required conditions are obtained simultaneously. Tokamaks are now operated routinely without disruptions close to the ideal MHD pressure limit, as needed for steady-state operation. Scenarios that project to high fusion gain have been demonstrated where more than half of the current is supplied by the "bootstrap" current generated by the pressure gradient in the plasma. Fully noninductive sustainment has been obtained for about a resistive time (the longest intrinsic time scale in the confined plasma) with normalized pressure and confinement approaching those needed for demonstration of steady-state conditions in ITER. One key challenge remaining to be addressed is how to handle the demanding heat and particle fluxes expected in a steady-state tokamak without compromising the high level of core plasma performance. Rather than attempt a comprehensive historical survey, this review will start from the plasma requirements of a steady-state tokamak power plant, illustrate with examples the progress made in both experimental and theoretical understanding, and point to the remaining physics challenges.
ERIC Educational Resources Information Center
Nevitt, Jonathan; Hancock, Gregory R.
2001-01-01
Evaluated the bootstrap method under varying conditions of nonnormality, sample size, model specification, and number of bootstrap samples drawn from the resampling space. Results for the bootstrap suggest the resampling-based method may be conservative in its control over model rejections, thus having an impact on the statistical power associated…
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
2017-02-01
scale blade servers (Dell PowerEdge) [20]. It must be recognized however, that the findings are distributed over this collection of architectures not...current operating system designs run into millions of lines of code. Moreover, they compound the opportunity for compromise by granting device drivers...properties (e.g. IP & MAC address) so as to invalidate an adversary’s surveillance data. The current running and bootstrapping instances of the micro
Quasi-Axially Symmetric Stellarators with 3 Field Periods
NASA Astrophysics Data System (ADS)
Garabedian, Paul; Ku, Long-Poe
1998-11-01
Compact hybrid configurations with 2 field periods have been studied recently as candidates for a proof of principle experiment at PPPL, cf. A. Reiman et al., Physics design of a high beta quasi-axially symmetric stellarator, J. Plas. Fus. Res. SERIES 1, 429(1998). This enterprise has led us to the discovery of a family of quasi-axially symmetric stellarators with 3 field periods that seem to have significant advantages, although their aspect ratios are a little larger. They have reversed shear and perform better in a local analysis of ballooning modes. Nonlinear equilibrium and stability calculations predict that the average beta limit may be as high as 6% if the bootstrap current turns out to be as big as that expected in comparable tokamaks. The concept relies on a combination of helical fields and bootstrap current to achieve adequate rotational transform at low aspect ratio. A detailed manuscript describing some of this work will be published soon, cf. P.R. Garabedian, Quasi-axially symmetric stellarators, Proc. Natl. Acad. Sci. USA 95 (1998).
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
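The saving described above is easy to demonstrate. In the following sketch (our own, with a synthetic Gumbel series standing in for a long model time series), a high quantile is bootstrapped from only the k largest entries, with the quantile level rescaled to account for the discarded bulk:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.gumbel(size=50_000)     # synthetic stand-in for a long model series
k = 500                         # keep only the k largest entries
top = np.sort(x)[-k:]

def boot_ci(sample, stat, n_boot=1000, rng=rng):
    """Percentile bootstrap CI for a statistic, resampling with replacement."""
    est = np.array([stat(rng.choice(sample, sample.size, replace=True))
                    for _ in range(n_boot)])
    return np.percentile(est, [2.5, 97.5])

# A high quantile of the full series lies inside the top-k subset, so
# bootstrapping the small subset approximates bootstrapping the whole
# sample at a fraction of the cost; only the quantile level changes.
q_full = 0.999
q_top = 1 - (1 - q_full) * x.size / k      # 0.999 of x maps to 0.9 of top
ci = boot_ci(top, lambda s: np.quantile(s, q_top))
```

Each resample now touches 500 values instead of 50,000, which is where the computational savings for decades-long gridded model output come from.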
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
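For readers new to the method, the "common combination" this abstract warns about, nonparametric resampling plus a percentile interval, fits in a few lines. This is an illustrative sketch only; recall the abstract's caveat that for a small sample like this a t-interval may actually be more accurate:

```python
import numpy as np

rng = np.random.default_rng(42)
sample = rng.exponential(scale=2.0, size=30)   # small, skewed sample

# Nonparametric bootstrap: resample with replacement, recompute the mean,
# then read the percentile confidence interval off the empirical quantiles.
boot_means = np.array([rng.choice(sample, sample.size, replace=True).mean()
                       for _ in range(5000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The pedagogical value is that the same three steps (resample, recompute, take quantiles) work for almost any statistic, which is exactly why the subtler accuracy issues are easy to overlook.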
Bannikova, A A; Bulatova, N Sh; Kramerov, D A
2006-06-01
Genetic exchange among chromosomal races of the common shrew Sorex araneus and the problem of reproductive barriers have been extensively studied by means of such molecular markers as mtDNA, microsatellites, and allozymes. In the present study, interpopulation and interracial polymorphism in the common shrew was examined using fingerprints generated by PCR amplification of DNA regions flanked by short interspersed repeats (SINEs), i.e., interSINE PCR (IS-PCR). We used primers complementary to consensus sequences of two short retroposons: the mammalian element MIR and the SOR element from the genome of Sorex araneus. Genetic differentiation among eleven populations of the common shrew from eight chromosome races was estimated. The NJ and MP analyses, as well as multidimensional scaling, showed that all samples examined grouped into two main clusters, corresponding to European Russia and Siberia. The bootstrap support of the European Russia cluster in the NJ and MP analyses was 76 and 61%, respectively. The bootstrap index for the Siberian cluster was 100% in both analyses; the Tomsk race, included in this cluster, was separated with a bootstrap support of 92/95% (NJ/MP).
NASA Astrophysics Data System (ADS)
Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth
2016-05-01
A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
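The conditional resampling step at the core of this model can be sketched as follows. This is our simplified illustration of a K-nearest-neighbor bootstrap with the common 1/rank weight kernel, not the authors' code:

```python
import numpy as np

def knn_sample(value, index_hist, flow_hist, k, rng):
    """Draw one annual flow conditioned on a simulated climate-index value:
    find the k historical years whose index is closest to `value`, then
    sample one of them with probability proportional to 1/rank, so the
    nearest years are favored."""
    order = np.argsort(np.abs(index_hist - value))[:k]   # k nearest years
    w = 1.0 / np.arange(1, k + 1)                        # 1/rank kernel
    return flow_hist[order[rng.choice(k, p=w / w.sum())]]
```

Repeating this draw year by year along a simulated PDO/AMO trajectory yields a stochastic flow sequence whose conditional statistics follow the historical index-flow relationship.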
Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika
2015-01-01
This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, the Temperature Humidity Index (THI) and the Equivalent Temperature Index (ETI), are computed from the NASA Modern-Era Retrospective Analysis for Research and Applications (NASA-MERRA) reanalysis (2002-2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique, applied to the original time series to generate statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting behavior of the relationships between milk components and the climate parameters is visible. During spring, only a weak dependency of milk yield on climate variations is apparent, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature or THI. We suggest this methodology for studies of the impacts of climate change on agriculture, as well as on environment and food, using short-term data. PMID:28231215
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered in extending residual resampling to regression settings where residuals are not identically distributed (and thus not amenable to resampling); common examples include logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of the data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that correlation in the data is preserved without the need for it to be modelled, a key point of difference compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and is demonstrated via simulation to have improved properties compared to competing resampling methods.
Bootstrap Percolation on Homogeneous Trees Has 2 Phase Transitions
NASA Astrophysics Data System (ADS)
Fontes, L. R. G.; Schonmann, R. H.
2008-09-01
We study the threshold-θ bootstrap percolation model on the homogeneous tree with degree b+1, 2 ≤ θ ≤ b, and initial density p. It is known that there exists a nontrivial critical value for p, which we call p_f, such that (a) for p > p_f, the final bootstrapped configuration is fully occupied for almost every initial configuration, and (b) if p < p_f, then for almost every initial configuration, the final bootstrapped configuration has density of occupied vertices less than 1. In this paper, we establish the existence of a distinct critical value for p, p_c, with 0 < p_c < p_f, with the following properties: (1) if p ≤ p_c, then for almost every initial configuration there is no infinite cluster of occupied vertices in the final bootstrapped configuration; (2) if p > p_c, then for almost every initial configuration there are infinite clusters of occupied vertices in the final bootstrapped configuration. Moreover, we show that (3) for p < p_c, the distribution of the occupied cluster size in the final bootstrapped configuration has an exponential tail; (4) at p = p_c, the expected occupied cluster size in the final bootstrapped configuration is infinite; (5) the probability of percolation of occupied vertices in the final bootstrapped configuration is continuous on [0, p_f] and analytic on (p_c, p_f), admitting an analytic continuation from the right at p_c and, only in the case θ = b, also from the left at p_f.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of cluster event times.
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
Augmenting Literacy: The Role of Expertise in Digital Writing
ERIC Educational Resources Information Center
Van Ittersum, Derek
2011-01-01
This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…
ERIC Educational Resources Information Center
Stapleton, Laura M.
2008-01-01
This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…
ERIC Educational Resources Information Center
Harrison, David
1979-01-01
The issue of observability and the relative roles of the senses and reason in understanding the world is reviewed. Eastern "mystical" philosophy serves as a focus in which interpretations of quantum mechanics, as well as the current bootstrap-quark controversy, are seen in some slightly different contexts. (Author/GA)
ERIC Educational Resources Information Center
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
Inverse bootstrapping conformal field theories
NASA Astrophysics Data System (ADS)
Li, Wenliang
2018-01-01
We propose a novel approach to study conformal field theories (CFTs) in general dimensions. In the conformal bootstrap program, one usually searches for consistent CFT data that satisfy crossing symmetry. In the new method, we reverse the logic and interpret manifestly crossing-symmetric functions as generating functions of conformal data. Physical CFTs can be obtained by scanning the space of crossing-symmetric functions. By truncating the fusion rules, we are able to concentrate on the low-lying operators and derive some approximate relations for their conformal data. It turns out that the free scalar theory, the 2d minimal model CFTs, the ϕ⁴ Wilson-Fisher CFT, the Lee-Yang CFTs and the Ising CFTs are consistent with the universal relations from the minimal fusion rule ϕ₁ × ϕ₁ = I + ϕ₂ + T, where ϕ₁ and ϕ₂ are scalar operators, I is the identity operator and T is the stress tensor.
LeDell, Erin; Petersen, Maya; van der Laan, Mark
2015-01-01
In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC. PMID:26279737
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
The bootstrapping technique is distribution-free, which provides an indirect way to estimate the sample size for a clinical trial from a relatively small sample. In this paper, bootstrap sample size estimates for comparing two parallel-design arms with continuous data are presented for various test types (inequality, non-inferiority, superiority, and equivalence). Sample sizes are also calculated by mathematical formulas (under the normal distribution assumption) for the same data. The power difference between the two calculation methods is acceptably small for all test types, showing that the bootstrap procedure is a credible technique for sample size estimation. We then compared the powers determined by the two methods on data that violate the normal distribution assumption. To accommodate the features of these data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal-distribution formula was far larger than that estimated by the bootstrap at each specific sample size per group. Hence, for this type of data, it is preferable to apply the bootstrap method for sample size calculation from the outset, and to employ the same statistical method as in the subsequent statistical analysis for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
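The bootstrap power-scan idea above can be sketched with hypothetical pilot data. Everything here is illustrative: the pilot samples are invented, and a Welch t statistic with a normal critical value stands in for the Wilcoxon test the paper uses. One resamples each arm at a candidate per-group n and counts rejections:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical skewed pilot data for two parallel arms, where the
# normal-theory sample-size formula would be questionable.
pilot_a = rng.lognormal(0.0, 0.8, 60)
pilot_b = rng.lognormal(0.5, 0.8, 60)

def welch_t(x, y):
    """Welch two-sample t statistic (simple stand-in for the Wilcoxon test)."""
    v = x.var(ddof=1) / x.size + y.var(ddof=1) / y.size
    return (x.mean() - y.mean()) / np.sqrt(v)

def bootstrap_power(a, b, n_per_group, n_boot=1000, crit=1.96, rng=rng):
    """Power at a candidate per-group n: resample each pilot arm to size n
    with replacement and count how often the test rejects."""
    hits = 0
    for _ in range(n_boot):
        xa = rng.choice(a, n_per_group, replace=True)
        xb = rng.choice(b, n_per_group, replace=True)
        hits += abs(welch_t(xa, xb)) > crit
    return hits / n_boot

for n in (20, 40, 80):  # scan candidate sizes until power is adequate
    print(n, bootstrap_power(pilot_a, pilot_b, n))
```

In practice one would increase n until the bootstrap power crosses the target (e.g. 0.80), which is the sample-size estimate.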
Application of the Bootstrap Methods in Factor Analysis.
ERIC Educational Resources Information Center
Ichikawa, Masanori; Konishi, Sadanori
1995-01-01
A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
External heating and current drive source requirements towards steady-state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, F. M.; Kessel, C. E.; Bonoli, P. T.; Batchelor, D. B.; Harvey, R. W.; Snyder, P. B.
2014-07-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with internal transport barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of heating and current drive (H/CD) sources that sustain reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that a combination of electron cyclotron (EC) and lower hybrid (LH) waves is a promising route towards steady state operation in ITER. The LH forms and sustains expanded barriers, and the EC deposition at mid-radius freezes the bootstrap current profile, stabilizing the barrier and leading to confinement levels 50% higher than typical H-mode energy confinement times. Using LH spectra centred on a parallel refractive index of 1.75-1.85, the performance of these plasma scenarios is close to the ITER target of 9 MA non-inductive current, global confinement gain H98 = 1.6 and fusion gain Q = 5.
Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.
Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng
2015-01-01
Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
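The indirect-effect bootstrap that this abstract critiques is easy to state concretely. Below is a minimal numpy sketch on simulated data (all values hypothetical): the indirect effect a*b comes from two OLS fits, and its percentile CI comes from resampling cases with replacement, with n in the small-sample range discussed above:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated mediation data X -> M -> Y (true indirect effect a*b = 0.5 * 0.6).
n = 50
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.6 * m + 0.2 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b from two OLS fits: a from M ~ X, b from Y ~ M + X."""
    a = np.linalg.lstsq(np.c_[x, np.ones(x.size)], m, rcond=None)[0][0]
    b = np.linalg.lstsq(np.c_[m, x, np.ones(x.size)], y, rcond=None)[0][0]
    return a * b

boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)  # resample cases with replacement
    boots.append(indirect_effect(x[idx], m[idx], y[idx]))

lo, hi = np.percentile(boots, [2.5, 97.5])  # percentile bootstrap 95% CI
print(f"indirect effect: {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The abstract's warning applies exactly here: at n around 20-80 such intervals can be under-powered and anomalously prone to Type I errors, which motivates the alternative resampling and Bayesian approaches the authors examine.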
Bootstrap confidence levels for phylogenetic trees.
Efron, B; Halloran, E; Holmes, S
1996-07-09
Evolutionary trees are often estimated from DNA or RNA sequence data. How much confidence should we have in the estimated trees? In 1985, Felsenstein [Felsenstein, J. (1985) Evolution 39, 783-791] suggested the use of the bootstrap to answer this question. Felsenstein's method, which in concept is a straightforward application of the bootstrap, is widely used, but has been criticized as biased in the genetics literature. This paper concerns the use of the bootstrap in the tree problem. We show that Felsenstein's method is not biased, but that it can be corrected to better agree with standard ideas of confidence levels and hypothesis testing. These corrections can be made by using the more elaborate bootstrap method presented here, at the expense of considerably more computation.
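Felsenstein's procedure resamples alignment *columns* (sites), re-estimates the tree on each pseudo-alignment, and reports how often a grouping of interest recurs. The toy sketch below uses an invented 4-taxon alignment and a crude closest-pair rule in place of real tree estimation, purely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy alignment (hypothetical): 4 taxa x 200 sites, built so that taxa A and B
# are close relatives, as are C and D.
base = rng.integers(0, 4, 200)
aln = np.stack([
    base,                                                             # A
    np.where(rng.random(200) < 0.05, rng.integers(0, 4, 200), base),  # B: near A
    (base + 2) % 4,                                                   # C
    np.where(rng.random(200) < 0.15, rng.integers(0, 4, 200), (base + 2) % 4),  # D: near C
])

def closest_pair(aln):
    """Pair of taxa with the smallest Hamming distance -- a crude stand-in
    for full tree estimation."""
    best, pair = np.inf, None
    for i in range(len(aln)):
        for j in range(i + 1, len(aln)):
            d = np.mean(aln[i] != aln[j])
            if d < best:
                best, pair = d, (i, j)
    return pair

# Resample sites (columns) with replacement and count how often the
# grouping of interest reappears: that frequency is its bootstrap support.
n_boot, hits = 500, 0
for _ in range(n_boot):
    cols = rng.integers(0, aln.shape[1], aln.shape[1])
    hits += closest_pair(aln[:, cols]) == (0, 1)
print(f"bootstrap support for the (A,B) grouping: {hits / n_boot:.2f}")
```

The paper's point is about how to calibrate this support value against conventional confidence levels, not about the resampling itself, which is exactly as simple as above.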
DOE Office of Scientific and Technical Information (OSTI.GOV)
Batchelor, D.B.; Carreras, B.A.; Hirshman, S.P.
Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are 3 and 4 field period low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low-aspect-ratio, and low bootstrap current.
Working Memory Deficits and Social Problems in Children with ADHD
ERIC Educational Resources Information Center
Kofler, Michael J.; Rapport, Mark D.; Bolden, Jennifer; Sarver, Dustin E.; Raiker, Joseph S.; Alderson, R. Matt
2011-01-01
Social problems are a prevalent feature of ADHD and reflect a major source of functional impairment for these children. The current study examined the impact of working memory deficits on parent- and teacher-reported social problems in a sample of children with ADHD and typically developing boys (N = 39). Bootstrapped, bias-corrected mediation…
ERIC Educational Resources Information Center
Pejovic, Jovana; Molnar, Monika
2017-01-01
Recently it has been proposed that sensitivity to nonarbitrary relationships between speech sounds and objects potentially bootstraps lexical acquisition. However, it is currently unclear whether preverbal infants (e.g., before 6 months of age) with different linguistic profiles are sensitive to such nonarbitrary relationships. Here, the authors…
An algebraic approach to the analytic bootstrap
Alday, Luis F.; Zhiboedov, Alexander
2017-04-27
We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. Here, we analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.
Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime
Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; ...
2016-06-20
Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of βp and βN despite strong ITBs. Good confinement has been achieved with reduced toroidal rotation. These high βp plasmas challenge the energy transport understanding, especially in the electron energy channel. A new turbulent transport model, TGLF-SAT1, has been developed which improves the transport prediction. Experiments extending results to long pulse on EAST, based on the physics basis developed at DIII-D, have been conducted. Finally, more investigations will be carried out on EAST as additional auxiliary power comes online in the near term.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructs the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
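The procedure described above is short enough to sketch end to end. This hypothetical example tests an exponential model: bin the data, compute the MLE from a bootstrap sample, and form the Pearson statistic on the original counts with expected counts from that bootstrap-sample MLE (the step that, per the abstract, restores the chi-squared reference distribution):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical goodness-of-fit setting: do the data follow an exponential law?
data = rng.exponential(scale=2.0, size=300)

# Partition into 10 bins (roughly equiprobable under the data).
edges = np.quantile(data, np.linspace(0, 1, 11))
edges[0], edges[-1] = 0.0, np.inf

def pearson_stat(data, rate, edges):
    """Pearson X^2: observed vs expected counts under Exp(rate)."""
    obs, _ = np.histogram(data, edges)
    expected = data.size * np.diff(1.0 - np.exp(-rate * edges))
    return np.sum((obs - expected) ** 2 / expected)

# Key step from the abstract: plug in the MLE computed from a *bootstrap
# sample*, not from the original data, when forming the expected counts.
boot = rng.choice(data, data.size, replace=True)
rate_boot = 1.0 / boot.mean()  # exponential-rate MLE from the bootstrap sample
x2 = pearson_stat(data, rate_boot, edges)
print(f"bootstrap-adjusted Pearson statistic: {x2:.2f} (refer to chi-squared, 9 df)")
```

Had the original-data MLE been used instead, the statistic would follow no explicit distribution, which is precisely the problem the bootstrap-sample MLE fixes.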
Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071
William Salas; Steve Hagen
2013-01-01
This presentation will provide an overview of an approach for quantifying uncertainty in spatial estimates of carbon emission from land use change. We generate uncertainty bounds around our final emissions estimate using a randomized, Monte Carlo (MC)-style sampling technique. This approach allows us to combine uncertainty from different sources without making...
Bootstrap Estimates of Standard Errors in Generalizability Theory
ERIC Educational Resources Information Center
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?
ERIC Educational Resources Information Center
Thompson, Bruce
Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…
Investigation of geomagnetic induced current at high latitude during the storm-time variation
NASA Astrophysics Data System (ADS)
Falayi, E. O.; Ogunmodimu, O.; Bolaji, O. S.; Ayanda, J. D.; Ojoniyi, O. S.
2017-06-01
During geomagnetic disturbances, geomagnetically induced currents (GICs) are driven by the geoelectric field in the conductive Earth. In this paper, we studied the variability of GICs, the time derivative of the geomagnetic field (dB/dt), the geomagnetic indices SYM-H (symmetric disturbance field in H), AU (eastward electrojet) and AL (westward electrojet), and interplanetary parameters such as solar wind speed (v) and the interplanetary magnetic field (Bz) during the geomagnetic storms of 31 March 2001, 21 October 2001, 6 November 2001, 29 October 2003, 31 October 2003 and 9 November 2004, each with high solar wind speed due to a coronal mass ejection. A wavelet-spectrum approach was employed to analyze the GIC time series over time scales of one to twenty-four hours. Power was concentrated between 14-24 h on 31 March 2001, 17-24 h on 21 October 2001, and 1-7 h on 6 November 2001; two peaks were observed, between 5-8 h and 21-24 h, on 29 October 2003, with concentrations at 1-3 h on 31 October 2003 and 18-22 h on 9 November 2004. The bootstrap method was used to obtain regression correlations between dB/dt and the observed GIC values for these six storms, yielding correlation coefficients of r = -0.567, -0.717, -0.477, -0.419, -0.210 and -0.488 respectively. We observed that high-energy wavelet coefficients correlated well under the bootstrap, while low-energy wavelet coefficients gave low bootstrap correlations. Geomagnetic storms thus influence GICs and the geomagnetic field derivative (dB/dt), which may be ascribed to coronal mass ejections and the associated solar wind arising from particle acceleration processes in the solar atmosphere.
From current-driven to neoclassically driven tearing modes.
Reimerdes, H; Sauter, O; Goodman, T; Pochelon, A
2002-03-11
In the TCV tokamak, the m/n = 2/1 island is observed in low-density discharges with central electron-cyclotron current drive. The evolution of its width has two distinct growth phases, one of which can be linked to a "conventional" tearing mode driven unstable by the current profile and the other to a neoclassical tearing mode driven by a perturbation of the bootstrap current. The TCV results provide the first clear observation of such a destabilization mechanism and reconcile the theory of conventional and neoclassical tearing modes, which differ only in the dominant driving term.
Three-dimensional magnetohydrodynamic equilibrium of quiescent H-modes in tokamak systems
NASA Astrophysics Data System (ADS)
Cooper, W. A.; Graves, J. P.; Duval, B. P.; Sauter, O.; Faustin, J. M.; Kleiner, A.; Lanthaler, S.; Patten, H.; Raghunathan, M.; Tran, T.-M.; Chapman, I. T.; Ham, C. J.
2016-06-01
Three dimensional free boundary magnetohydrodynamic equilibria that recover saturated ideal kink/peeling structures are obtained numerically. Simulations that model the JET tokamak at fixed ⟨β⟩ = 1.7% with a large edge bootstrap current that flattens the q-profile near the plasma boundary demonstrate that a radial parallel current density ribbon with a dominant m/n = 5/1 Fourier component at I_t = 2.2 MA develops into a broadband spectrum when the toroidal current I_t is increased to 2.5 MA.
Impact of Sampling Density on the Extent of HIV Clustering
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor
2014-01-01
Identifying and monitoring HIV clusters could be useful in tracking the leading edge of HIV transmission in epidemics. Currently, greater specificity in the definition of HIV clusters is needed to reduce confusion in the interpretation of HIV clustering results. We address sampling density as one of the key aspects of HIV cluster analysis. The proportion of viral sequences in clusters was estimated at sampling densities from 1.0% to 70%. A set of 1,248 HIV-1C env gp120 V1C5 sequences from a single community in Botswana was utilized in simulation studies. Matching numbers of HIV-1C V1C5 sequences from the LANL HIV Database were used as comparators. HIV clusters were identified by phylogenetic inference under bootstrapped maximum likelihood and pairwise distance cut-offs. Sampling density below 10% was associated with stochastic HIV clustering with broad confidence intervals. HIV clustering increased linearly at sampling density >10%, and was accompanied by narrowing confidence intervals. Patterns of HIV clustering were similar at bootstrap thresholds 0.7 to 1.0, but the extent of HIV clustering decreased with higher bootstrap thresholds. The origin of sampling (local concentrated vs. scattered global) had a substantial impact on HIV clustering at sampling densities ≥10%. Pairwise distances at 10% were estimated as a threshold for cluster analysis of HIV-1 V1C5 sequences. The node bootstrap support distribution provided additional evidence for 10% sampling density as the threshold for HIV cluster analysis. The detectability of HIV clusters is substantially affected by sampling density. A minimal genotyping density of 10% and sampling density of 50–70% are suggested for HIV-1 V1C5 cluster analysis. PMID:25275430
Visceral sensitivity, anxiety, and smoking among treatment-seeking smokers.
Zvolensky, Michael J; Bakhshaie, Jafar; Norton, Peter J; Smits, Jasper A J; Buckner, Julia D; Garey, Lorra; Manning, Kara
2017-12-01
It is widely recognized that smoking is related to abdominal pain and discomfort, as well as gastrointestinal disorders. Research has shown that visceral sensitivity, experiencing anxiety around gastrointestinal sensations, is associated with poorer gastrointestinal health and related health outcomes. Visceral sensitivity also increases anxiety symptoms and mediates the relation with other risk factors, including gastrointestinal distress. No work to date, however, has evaluated visceral sensitivity in the context of smoking despite the strong association between smoking and poor physical and mental health. The current study sought to examine visceral sensitivity as a unique predictor of cigarette dependence, threat-related smoking abstinence expectancies (somatic symptoms and harmful consequences), and perceived barriers for cessation via anxiety symptoms. Eighty-four treatment-seeking adult daily smokers (M age = 45.1 years [SD = 10.4]; 71.6% male) participated in this study. There was a statistically significant indirect effect of visceral sensitivity via general anxiety symptoms on cigarette dependence (b = 0.02, SE = 0.01, bootstrapped 95% CI [0.006, 0.05]), smoking abstinence somatic expectancies (b = 0.10, SE = 0.03, bootstrapped 95% CI [0.03, 0.19]), smoking abstinence harmful experiences (b = 0.13, SE = 0.05, bootstrapped 95% CI [0.03, 0.25]), and barriers to cessation (b = 0.05, SE = 0.06, bootstrapped 95% CI [0.01, 0.13]). Overall, the present study serves as an initial investigation into the nature of the associations between visceral sensitivity, anxiety symptoms, and clinically significant smoking processes among treatment-seeking smokers. Future work is needed to explore the extent to which anxiety accounts for relations between visceral sensitivity and other smoking processes (e.g., withdrawal, cessation outcome). Copyright © 2017 Elsevier Ltd. All rights reserved.
Unbiased Estimates of Variance Components with Bootstrap Procedures
ERIC Educational Resources Information Center
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
Explorations in Statistics: the Bootstrap
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
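The empirical approach this installment explores fits in a few lines. With a single (hypothetical) observed sample, treat it as the population and resample it with replacement; the spread of the recomputed statistic estimates its theoretical variability:

```python
import numpy as np

rng = np.random.default_rng(5)
sample = rng.exponential(scale=3.0, size=40)  # one observed sample

# The bootstrap idea in one loop: treat the sample as the population,
# resample it with replacement, and watch how the statistic varies.
medians = [np.median(rng.choice(sample, sample.size, replace=True))
           for _ in range(5000)]
print(f"observed median: {np.median(sample):.2f}")
print(f"bootstrap SE of the median: {np.std(medians, ddof=1):.2f}")
```

The same loop works for any statistic, which is what makes the bootstrap a useful exploratory tool: no formula for the standard error of the median is needed.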
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
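The bootstrap prediction-interval idea can be sketched on simulated data. This is not the paper's procedure, only a common residual-bootstrap variant with a Nadaraya-Watson smoother standing in for the nonparametric regressor (all data and bandwidths here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical data from a nonlinear signal with noise.
x = np.sort(rng.uniform(0, 6, 150))
y = np.sin(x) + rng.normal(0, 0.3, x.size)

def kernel_fit(x_train, y_train, x_eval, h=0.4):
    """Nadaraya-Watson kernel smoother (one simple nonparametric regressor)."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

fit = kernel_fit(x, y, x)
resid = y - fit

# Residual bootstrap: refit on resampled residuals, then add a resampled
# noise term to mimic a new observation at each evaluation point.
grid = np.linspace(0.5, 5.5, 50)
preds = np.empty((1000, grid.size))
for b in range(1000):
    y_star = fit + rng.choice(resid, resid.size, replace=True)
    preds[b] = kernel_fit(x, y_star, grid) + rng.choice(resid, grid.size, replace=True)

lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)  # pointwise 95% PIs
coverage = np.mean((np.sin(grid) >= lo) & (np.sin(grid) <= hi))
print(f"fraction of true values inside the 95% PI: {coverage:.2f}")
```

An observed output falling outside its interval would then be flagged as a candidate anomaly, conditioned on the input, which is the use case the abstract describes.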
High performance advanced tokamak regimes in DIII-D for next-step experiments
NASA Astrophysics Data System (ADS)
Greenfield, C. M.; Murakami, M.; Ferron, J. R.; Wade, M. R.; Luce, T. C.; Petty, C. C.; Menard, J. E.; Petrie, T. W.; Allen, S. L.; Burrell, K. H.; Casper, T. A.; DeBoo, J. C.; Doyle, E. J.; Garofalo, A. M.; Gorelov, I. A.; Groebner, R. J.; Hobirk, J.; Hyatt, A. W.; Jayakumar, R. J.; Kessel, C. E.; La Haye, R. J.; Jackson, G. L.; Lohr, J.; Makowski, M. A.; Pinsker, R. I.; Politzer, P. A.; Prater, R.; Strait, E. J.; Taylor, T. S.; West, W. P.; DIII-D Team
2004-05-01
Advanced Tokamak (AT) research in DIII-D [K. H. Burrell for the DIII-D Team, in Proceedings of the 19th Fusion Energy Conference, Lyon, France, 2002 (International Atomic Energy Agency, Vienna, 2002) published on CD-ROM] seeks to provide a scientific basis for steady-state high performance operation in future devices. These regimes require high toroidal beta to maximize fusion output and poloidal beta to maximize the self-driven bootstrap current. Achieving these conditions requires integrated, simultaneous control of the current and pressure profiles, and active magnetohydrodynamic stability control. The building blocks for AT operation are in hand. Resistive wall mode stabilization via plasma rotation and active feedback with nonaxisymmetric coils allows routine operation above the no-wall beta limit. Neoclassical tearing modes are stabilized by active feedback control of localized electron cyclotron current drive (ECCD). Plasma shaping and profile control provide further improvements. Under these conditions, bootstrap supplies most of the current. Steady-state operation requires replacing the remaining Ohmic current, mostly located near the half radius, with noninductive external sources. In DIII-D this current is provided by ECCD, and nearly stationary AT discharges have been sustained with little remaining Ohmic current. Fast wave current drive is being developed to control the central magnetic shear. Density control, with divertor cryopumps, of AT discharges with edge localized moding H-mode edges facilitates high current drive efficiency at reactor relevant collisionalities. A sophisticated plasma control system allows integrated control of these elements. Close coupling between modeling and experiment is key to understanding the separate elements, their complex nonlinear interactions, and their integration into self-consistent high performance scenarios. 
Progress on this development, and its implications for next-step devices, will be illustrated by results of recent experiment and simulation efforts.
Core transport properties in JT-60U and JET identity plasmas
NASA Astrophysics Data System (ADS)
Litaudon, X.; Sakamoto, Y.; de Vries, P. C.; Salmi, A.; Tala, T.; Angioni, C.; Benkadda, S.; Beurskens, M. N. A.; Bourdelle, C.; Brix, M.; Crombé, K.; Fujita, T.; Futatani, S.; Garbet, X.; Giroud, C.; Hawkes, N. C.; Hayashi, N.; Hoang, G. T.; Hogeweij, G. M. D.; Matsunaga, G.; Nakano, T.; Oyama, N.; Parail, V.; Shinohara, K.; Suzuki, T.; Takechi, M.; Takenaga, H.; Takizuka, T.; Urano, H.; Voitsekhovitch, I.; Yoshida, M.; ITPA Transport Group; JT-60 Team; EFDA contributors, JET
2011-07-01
The paper compares the transport properties of a set of dimensionless identity experiments performed between JET and JT-60U in the advanced tokamak regime with an internal transport barrier (ITB). These International Tokamak Physics Activity (ITPA) joint experiments were carried out with the same plasma shape and toroidal magnetic field ripple, and with dimensionless profiles matched as closely as possible during the ITB triggering phase in terms of safety factor, normalized Larmor radius, normalized collision frequency, thermal beta, and ratio of ion to electron temperatures. Similarities in the ITB triggering mechanisms and sustainment were observed when a good match of the most relevant normalized profiles, except the toroidal Mach number, was achieved. Similar thermal ion transport levels have been measured in the two devices with either monotonic or non-monotonic q-profiles. In contrast, differences between JET and JT-60U were observed in electron thermal and particle confinement in reversed magnetic shear configurations. It was found that the larger shear reversal in the very centre (inside a normalized radius of 0.2) of JT-60U plasmas allowed the sustainment of stronger electron density ITBs than in JET. As a consequence of the peaked density profile, the core bootstrap current density is more than five times higher in JT-60U than in JET. Thanks to the bootstrap effect and the slightly broader neutral beam deposition, reversed magnetic shear configurations are self-sustained in JT-60U scenarios. Analyses of the similarities and differences between the two devices address key questions on the validity of the usual assumptions made in ITER steady-state scenario modelling, e.g. a flat density profile in the core within the thermal transport barrier. Such assumptions have consequences for the prediction of fusion performance, the bootstrap current and the sustainment of the scenario.
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose: as a feasibility study of our long multiplet bootstrap and as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data
Bakun, W.H.; Gomez, Capera A.; Stucchi, M.
2011-01-01
Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. 
The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
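The percentile-bootstrap machinery described above (resample the data, then read confidence bounds directly off the distribution of resampled statistics) can be sketched in a few lines of stdlib Python. The magnitude values below are invented for illustration and are not taken from the study:

```python
import random
import statistics

def bootstrap_ci(data, stat, n_boot=2000, level=0.68, seed=1):
    """Percentile bootstrap: resample with replacement, compute the
    statistic on each resample, and take the central `level` fraction
    of the sorted resampled statistics as the confidence interval."""
    rng = random.Random(seed)
    stats = sorted(stat([rng.choice(data) for _ in range(len(data))])
                   for _ in range(n_boot))
    lo = stats[int((1 - level) / 2 * n_boot)]
    hi = stats[int((1 + level) / 2 * n_boot) - 1]
    return statistics.median(stats), lo, hi

# Hypothetical intensity-derived magnitude estimates for one event.
mags = [5.2, 5.4, 5.1, 5.6, 5.3, 5.5, 5.2, 5.4, 5.0, 5.3]
med, lo, hi = bootstrap_ci(mags, statistics.median)
```

As in the abstract, 68% of the bootstrap values fall inside the 68% interval by construction; other confidence levels only change the `level` argument.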
Bootstrapped Learning Analysis and Curriculum Development Environment (BLADE)
2012-02-01
framework Development of the automated teacher The software development aspect of the BL program was conducted primarily in the Java programming...parameters are analogous to Java class data members or to fields in a C structure. Here is an example composite IL object from Blocks World, an...2 and 3, alternative methods of implementing generators were developed, first in Java, later in Ruby. Both of these alternatives lowered the
Terribile, L C; Diniz-Filho, J A F; De Marco, P
2010-05-01
The use of ecological niche models (ENM) to generate potential geographic distributions of species has rapidly increased in ecology, conservation and evolutionary biology. Many methods are available; the most used are the Maximum Entropy Method (MAXENT) and the Genetic Algorithm for Rule Set Production (GARP). Recent studies have shown that MAXENT performs better than GARP. Here we used the statistical methods of ROC-AUC (area under the receiver operating characteristic curve) and the bootstrap to evaluate the performance of GARP and MAXENT in generating potential distribution models for 39 species of New World coral snakes. We found that values of AUC for GARP ranged from 0.923 to 0.999, whereas those for MAXENT ranged from 0.877 to 0.999. On the whole, the differences in AUC were very small, but for 10 species GARP outperformed MAXENT. Means and standard deviations for 100 bootstrapped samples with sample sizes ranging from 3 to 30 species did not show any trend towards deviations from a zero difference in AUC values of GARP minus AUC values of MAXENT. Our results suggest that further studies are still necessary to establish under which circumstances the statistical performance of the methods varies. However, it is also important to consider the possibility that this empirical inductive reasoning may fail in the end, because we almost certainly could not establish all potential scenarios generating variation in the relative performance of models.
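The bootstrap comparison of two models' AUC values can be sketched with a rank-based AUC and paired resampling of observations. The scores and labels below are invented stand-ins for the MAXENT/GARP model outputs, not data from the study:

```python
import random

def auc(scores, labels):
    """Rank-based AUC (Mann-Whitney): probability that a randomly
    chosen positive scores higher than a random negative (ties 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def boot_auc_diff(s1, s2, labels, n_boot=1000, seed=7):
    """Bootstrap the paired AUC difference (model 1 minus model 2),
    resampling observations so both models see the same resample."""
    rng = random.Random(seed)
    idx = range(len(labels))
    diffs = []
    for _ in range(n_boot):
        sample = [rng.choice(idx) for _ in idx]
        y = [labels[i] for i in sample]
        if 0 < sum(y) < len(y):          # need both classes present
            diffs.append(auc([s1[i] for i in sample], y)
                         - auc([s2[i] for i in sample], y))
    diffs.sort()
    return diffs[len(diffs) // 2]        # bootstrap median difference

labels = [1, 1, 1, 0, 0, 0]
s1 = [0.9, 0.8, 0.7, 0.2, 0.3, 0.1]   # hypothetical good model
s2 = [0.1, 0.2, 0.3, 0.8, 0.7, 0.9]   # hypothetical inverted model
diff = boot_auc_diff(s1, s2, labels)
```

The full distribution of `diffs` would give the means and standard deviations the abstract reports.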
Chaikh, Abdulhamid; Balosso, Jacques
2016-12-01
To apply statistical bootstrap analysis and dosimetric criteria to assess the change in prescribed dose (PD) for lung cancer needed to maintain the same clinical results when using new generations of dose calculation algorithms. Nine lung cancer cases were studied. For each patient, three treatment plans were generated using exactly the same beam arrangements. In plan 1, the dose was calculated using the pencil beam convolution (PBC) algorithm with heterogeneity correction turned on, using modified Batho (PBC-MB). In plan 2, the dose was calculated using the anisotropic analytical algorithm (AAA) and the same PD as plan 1. In plan 3, the dose was calculated using AAA with the monitor units (MUs) obtained from PBC-MB as input. The dosimetric criteria include MUs, delivered dose at the isocentre (Diso) and calculated dose to 95% of the target volume (D95). The bootstrap method was used to assess the significance of the dose differences and to accurately estimate the 95% confidence interval (95% CI). Wilcoxon and Spearman's rank tests were used to calculate P values and the correlation coefficient (ρ). A statistically significant dose difference was found with the point-kernel model. A good correlation was observed between the two algorithm types, with ρ>0.9. Using AAA instead of PBC-MB, an adjustment of the PD at the isocentre is suggested. For a given set of patients, we assessed the need to readjust the PD for lung cancer using dosimetric indices and the bootstrap statistical method. Thus, if the goal is to maintain the same clinical results, the PD for lung tumors has to be adjusted with AAA. According to our simulation, we suggest readjusting the PD by 5% together with an optimization of beam arrangements to better protect the organs at risk (OARs).
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
ERIC Educational Resources Information Center
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric, the Student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
The Beginner's Guide to the Bootstrap Method of Resampling.
ERIC Educational Resources Information Center
Lane, Ginny G.
The bootstrap method of resampling can be useful in estimating the replicability of study results. The bootstrap procedure creates a mock population from a given sample of data from which multiple samples are then drawn. The method extends the usefulness of the jackknife procedure as it allows for computation of a given statistic across a maximal…
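The procedure this primer describes (treat the observed sample as a mock population, draw many resamples from it, and compute the statistic of interest across them) can be sketched as follows. The scores are invented for illustration:

```python
import random
import statistics

# Hypothetical sample of scores standing in for "a given sample of
# data"; the bootstrap treats it as a mock population.
sample = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14]
rng = random.Random(0)

# Draw B bootstrap samples (same size, with replacement) and record
# the statistic of interest for each -- here the sample mean.
B = 1000
boot_means = [statistics.mean(rng.choices(sample, k=len(sample)))
              for _ in range(B)]

# The spread of the bootstrap means estimates the standard error of
# the mean, with no distributional assumption.
se = statistics.stdev(boot_means)
```

The jackknife mentioned in the abstract instead recomputes the statistic on leave-one-out subsamples; the bootstrap generalizes this by allowing arbitrarily many resamples.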
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
ERIC Educational Resources Information Center
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
ERIC Educational Resources Information Center
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has a small-sample-size limitation. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small-sample-size studies. Copyright © 2017 John Wiley & Sons, Ltd.
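A minimal sketch of a pooled-resampling bootstrap test for two means, in the spirit of the method the abstract describes (this is a generic illustration, not the authors' exact algorithm): pool both groups, which enforces the null of equal means, resample each group from the pool, and compare the observed t statistic with the null draws. The data are invented:

```python
import random
import statistics

def t_stat(a, b):
    """Welch-type t statistic for a two-sample mean comparison."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / (
        (va / len(a) + vb / len(b)) ** 0.5)

def pooled_bootstrap_test(a, b, n_boot=2000, seed=3):
    """Resample both groups from the pooled data (null of equal means)
    and count how often the null |t*| reaches the observed |t|."""
    rng = random.Random(seed)
    pool = list(a) + list(b)
    t_obs = t_stat(a, b)
    hits = 0
    for _ in range(n_boot):
        ra = rng.choices(pool, k=len(a))
        rb = rng.choices(pool, k=len(b))
        if abs(t_stat(ra, rb)) >= abs(t_obs):
            hits += 1
    return t_obs, (hits + 1) / (n_boot + 1)   # add-one p-value

control = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2]   # hypothetical group means data
treated = [7.9, 8.3, 8.1, 7.8, 8.4, 8.0]
t_obs, p = pooled_bootstrap_test(control, treated)
```

Pooling is what distinguishes this from a plain per-group bootstrap: the resamples are generated under the null hypothesis, so the p-value is the tail probability of the null distribution.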
Solenoid-free plasma start-up in spherical tokamaks
NASA Astrophysics Data System (ADS)
Raman, R.; Shevchenko, V. F.
2014-10-01
The central solenoid is an intrinsic part of all present-day tokamaks and most spherical tokamaks. The spherical torus (ST) confinement concept is projected to operate at high toroidal beta and at a high fraction of the non-inductive bootstrap current, as required for an efficient reactor system. The use of a conventional solenoid in an ST-based fusion nuclear facility is generally believed not to be possible. Solenoid-free plasma start-up is therefore an area of extensive worldwide research activity. It is also relevant to steady-state tokamak operation, as the central transformer coil of a conventional-aspect-ratio tokamak reactor would be located in a high-radiation environment but would be needed only during the initial discharge initiation and current ramp-up phases. Solenoid-free operation also provides greater flexibility in the selection of the aspect ratio and simplifies the reactor design. Plasma start-up methods based on induction from external poloidal field coils, helicity injection and radio frequency current drive have all made substantial progress towards meeting this important need for the ST. Some of these systems will now undergo the final stages of testing in a new generation of large STs, which are scheduled to begin operations during the next two years. This paper reviews research to date on methods for inducing the initial start-up current in STs without reliance on the conventional central solenoid.
Discharge start-up and ramp-up development for NSTX-U and MAST-U
NASA Astrophysics Data System (ADS)
Battaglia, D. J.; Boyer, M. D.; Gerhardt, S. P.; Menard, J. E.; Mueller, D.; Cunningham, G.; Kirk, A.; Kogan, L.; McArdle, G.; Pangione, L.; Thornton, A. J.; Ren, E.
2017-10-01
A collaborative modeling effort is underway to develop robust inductive start-up and ramp-up scenarios for NSTX-U and MAST-U. These complementary spherical tokamak devices aim to generate the physics basis for achieving steady-state, high-beta and high-confinement plasma discharges with a self-consistent solution for managing the divertor heat flux. High-performance discharges in these devices require sufficient plasma elongation (κ = 2.4 - 2.8) to maximize the bootstrap and beam-driven current drive, increase MHD stability at high Ip and high βN, and realize advanced divertor geometries such as the snowflake and super-X. Achieving the target elongation on NSTX-U is enabled by an L-H transition in the current ramp-up that slows the current diffusion and maintains a low internal inductance (li <= 0.8). Modeling focuses on developing scenarios that achieve a suitable field null for breakdown and discharge conditions conducive to an early L-H transition while maintaining vertical and MHD stability, with appropriate margin for variation in experimental conditions. The toroidal currents induced in conducting structures and the specifications of the real-time control and power supply systems are unique constraints for the two devices. Work Supported by U.S. DOE Contract No. DE-AC02-09CH11466 and the RCUK Energy Programme [Grant Number EP/P012450/1].
Solid oxide fuel cell power plant having a bootstrap start-up system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Michael T
The bootstrap start-up system (42) achieves an efficient start-up of the power plant (10) that minimizes formation of soot within a reformed hydrogen rich fuel. A burner (48) receives un-reformed fuel directly from the fuel supply (30) and combusts the fuel to heat cathode air which then heats an electrolyte (24) within the fuel cell (12). A dilute hydrogen forming gas (68) cycles through a sealed heat-cycling loop (66) to transfer heat and generated steam from an anode side (32) of the electrolyte (24) through fuel processing system (36) components (38, 40) and back to an anode flow field (26) until fuel processing system components (38, 40) achieve predetermined optimal temperatures and steam content. Then, the heat-cycling loop (66) is unsealed and the un-reformed fuel is admitted into the fuel processing system (36) and anode flow field (26) to commence ordinary operation of the power plant (10).
Smart, Joan E Hunter; Cumming, Sean P; Sherar, Lauren B; Standage, Martyn; Neville, Helen; Malina, Robert M
2012-01-01
This study tested a mediated effects model of psychological and behavioral adaptation to puberty within the context of physical activity (PA). Biological maturity status, physical self-concept, PA, and health-related quality of life (HRQoL) were assessed in 222 female British year 7 to 9 pupils (mean age = 12.7 years, SD = .8). Structural equation modeling using maximum likelihood estimation and bootstrapping procedures supported the hypothesized model. Maturation status was inversely related to perceptions of sport competence, body attractiveness, and physical condition; and indirectly and inversely related to physical self-worth, PA, and HRQoL. Examination of the bootstrap-generated bias-corrected confidence intervals representing the direct and indirect paths suggested that physical self-concept partially mediated the relations between maturity status and PA, and between maturity status and HRQoL. Evidence supports the contention that perceptions of the physical self partially mediate the relations among maturity, PA, and HRQoL in adolescent females.
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
A note on the kappa statistic for clustered dichotomous data.
Zhou, Ming; Yang, Zhao
2014-06-30
The kappa statistic is widely used to assess the agreement between two raters. Motivated by a simulation-based cluster bootstrap method to calculate the variance of the kappa statistic for clustered physician-patient dichotomous data, we investigate its special correlation structure and develop a new simple and efficient data generation algorithm. For clustered physician-patient dichotomous data, based on the delta method and its special covariance structure, we propose a semi-parametric variance estimator for the kappa statistic. An extensive Monte Carlo simulation study is performed to evaluate the performance of the new proposal and five existing methods with respect to the empirical coverage probability, root-mean-square error, and average width of the 95% confidence interval for the kappa statistic. The variance estimator ignoring the dependence within a cluster is generally inappropriate, and the variance estimators from the new proposal, bootstrap-based methods, and the sampling-based delta method perform reasonably well for at least a moderately large number of clusters (e.g., K ≥ 50). The new proposal and the sampling-based delta method provide convenient tools for efficient computation and non-simulation-based alternatives to the existing bootstrap-based methods. Moreover, the new proposal has acceptable performance even when the number of clusters is as small as K = 25. To illustrate the practical application of all the methods, one psychiatric research data set and two simulated clustered physician-patient dichotomous data sets are analyzed. Copyright © 2014 John Wiley & Sons, Ltd.
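The cluster bootstrap the abstract refers to resamples whole clusters (physicians), keeping each cluster's patients together so the within-cluster correlation is preserved. A minimal sketch with invented ratings (0/1 pairs per patient, grouped by physician):

```python
import random

def kappa(pairs):
    """Cohen's kappa for paired dichotomous ratings [(r1, r2), ...]."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n          # observed agreement
    p1 = sum(a for a, _ in pairs) / n
    p2 = sum(b for _, b in pairs) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)              # chance agreement
    return (po - pe) / (1 - pe)

def cluster_bootstrap_kappa(clusters, n_boot=500, seed=5):
    """Resample whole clusters with replacement and recompute kappa
    each time; return a 95% percentile interval."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        sample = rng.choices(clusters, k=len(clusters))
        reps.append(kappa([p for c in sample for p in c]))
    reps.sort()
    return reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot) - 1]

# Hypothetical clusters: one physician per list, (rater1, rater2) per patient.
clusters = [
    [(1, 1), (0, 0), (1, 0)],
    [(1, 1), (0, 0), (0, 0)],
    [(0, 1), (1, 1), (0, 0)],
    [(1, 1), (1, 0), (0, 0)],
    [(0, 0), (1, 1), (1, 1)],
    [(1, 0), (0, 0), (1, 1)],
]
lo, hi = cluster_bootstrap_kappa(clusters)
```

Resampling patients individually instead of clusters would ignore the dependence within a cluster, which is exactly the failure mode the abstract flags.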
Non-inductive current drive and transport in high βN plasmas in JET
NASA Astrophysics Data System (ADS)
Voitsekhovitch, I.; Alper, B.; Brix, M.; Budny, R. V.; Buratti, P.; Challis, C. D.; Ferron, J.; Giroud, C.; Joffrin, E.; Laborde, L.; Luce, T. C.; McCune, D.; Menard, J.; Murakami, M.; Park, J. M.; JET-EFDA contributors
2009-05-01
A route to stationary MHD-stable operation at high βN has been explored at the Joint European Torus (JET) by optimizing the current ramp-up, heating start time and the waveform of neutral beam injection (NBI) power. In these scenarios the current ramp-up has been accompanied by plasma pre-heat (or the NBI has been started before the current flat-top) and NBI power up to 22 MW has been applied during the current flat-top. In the discharges considered, transient total βN ≈ 3.3 and stationary (during the high power phase) βN ≈ 3 have been achieved by applying feedback control of βN with the NBI power in configurations with a monotonic or flat core safety factor profile and without an internal transport barrier (ITB). The transport and current drive in this scenario are analysed here using the TRANSP and ASTRA codes. The interpretative analysis performed with TRANSP shows that 50-70% of the current is driven non-inductively; half of this current is due to the bootstrap current, which has a broad profile since an ITB was deliberately avoided. The GLF23 transport model predicts the temperature profiles within a ±22% discrepancy with the measurements over the explored parameter space. Predictive simulations with this model show that the E × B rotational shear plays an important role in thermal ion transport in this scenario, producing up to a 40% increase in the ion temperature. By applying transport and current drive models validated in self-consistent simulations of given reference scenarios in a wider parameter space, the requirements for fully non-inductive stationary operation at JET are estimated. It is shown that the strong stiffness of the temperature profiles predicted by the GLF23 model restricts the bootstrap current at larger heating power. In this situation, fully non-inductive operation without an ITB can be rather expensive, relying strongly on external non-inductive current drive sources.
NASA Astrophysics Data System (ADS)
Coupon, Jean; Leauthaud, Alexie; Kilbinger, Martin; Medezinski, Elinor
2017-07-01
SWOT (Super W Of Theta) computes two-point statistics for very large data sets, based on “divide and conquer” algorithms: mainly, but not limited to, data storage in binary trees, approximation at large scale, parallelization (Open MPI), and bootstrap and jackknife resampling methods “on the fly”. It currently supports projected and 3D galaxy auto- and cross-correlations, galaxy-galaxy lensing, and weighted histograms.
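The jackknife error estimate that such correlation codes report can be illustrated with a delete-one jackknife on a toy statistic (SWOT itself omits spatial subregions of a catalogue rather than single points; the data here are invented):

```python
import statistics

def jackknife_se(data, stat):
    """Delete-one jackknife: recompute the statistic leaving out each
    element in turn; the scaled spread of the leave-one-out values
    estimates the standard error of the statistic."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = statistics.mean(loo)
    return ((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo)) ** 0.5

# Hypothetical per-region galaxy counts (arbitrary units).
counts = [42.0, 39.5, 44.1, 40.8, 43.2, 41.7, 38.9, 42.6]
se = jackknife_se(counts, statistics.mean)
```

For the mean, the jackknife reproduces the usual s/√n standard error exactly; its value is that the same recipe works for statistics with no closed-form error, such as correlation function bins.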
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Brooks; A.H. Reiman; G.H. Neilson
High-beta, low-aspect-ratio (compact) stellarators are promising solutions to the problem of developing a magnetic plasma configuration for magnetic fusion power plants that can be sustained in steady-state without disrupting. These concepts combine features of stellarators and advanced tokamaks and have aspect ratios similar to those of tokamaks (2-4). They are based on computed plasma configurations that are shaped in three dimensions to provide desired stability and transport properties. Experiments are planned as part of a program to develop this concept. A beta = 4% quasi-axisymmetric plasma configuration has been evaluated for the National Compact Stellarator Experiment (NCSX). It has a substantial bootstrap current and is shaped to stabilize ballooning, external kink, vertical, and neoclassical tearing modes without feedback or close-fitting conductors. Quasi-omnigeneous plasma configurations stable to ballooning modes at beta = 4% have been evaluated for the Quasi-Omnigeneous Stellarator (QOS) experiment. These equilibria have relatively low bootstrap currents and are insensitive to changes in beta. Coil configurations have been calculated that reconstruct these plasma configurations, preserving their important physics properties. Theory- and experiment-based confinement analyses are used to evaluate the technical capabilities needed to reach target plasma conditions. The physics basis for these complementary experiments is described.
Helicopter In-Flight Monitoring System Second Generation (HIMS II).
1983-08-01
acquisition cycle. B. Computer Chassis CPU (DEC LSI-II/2) -- Executes instructions contained in the memory. 32K memory (DEC MSVII-DD) --Contains program...when the operator executes command #2, 3, or 5 (display data). New cartridges can be inserted as required for truly unlimited, continuous data...is called bootstrapping. The software, which is stored on a tape cartridge, is loaded into memory by execution of a small program stored in read-only
2009-01-01
selection and uncertainty sampling significantly. Index Terms: Transcription, labeling, submodularity, submodular selection, active learning, sequence...name of batch active learning, where a subset of data that is most informative and representative of the whole is selected for labeling. Often...representative subset. Note that our Fisher kernel is over an unsupervised generative model, which enables us to bootstrap our active learning approach
Detection of counterfeit electronic components through ambient mass spectrometry and chemometrics.
Pfeuffer, Kevin P; Caldwell, Jack; Shelley, Jake T; Ray, Steven J; Hieftje, Gary M
2014-09-21
In the last several years, illicit electronic components have been discovered in the inventories of several distributors and even installed in commercial and military products. Illicit or counterfeit electronic components include a broad category of devices that can range from the correct unit with a more recent date code to lower-specification or non-working systems with altered names, manufacturers and date codes. Current methodologies for identification of counterfeit electronics rely on visual microscopy by expert users and, while effective, are very time-consuming. Here, a plasma-based ambient desorption/ionization source, the flowing atmospheric pressure afterglow (FAPA), is used to generate a mass-spectral fingerprint from the surface of a variety of discrete electronic integrated circuits (ICs). Chemometric methods, specifically principal component analysis (PCA) and the bootstrapped error-adjusted single-sample technique (BEAST), are used successfully to differentiate between genuine and counterfeit ICs. In addition, chemical and physical surface-removal techniques are explored and suggest which surface-altering techniques were utilized by counterfeiters.
NASA Astrophysics Data System (ADS)
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
Confidence limit calculation for antidotal potency ratio derived from lethal dose 50
Manage, Ananda; Petrikovics, Ilona
2013-01-01
AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method, invented by Efron, can easily be adapted to construct confidence intervals in situations like this. The bootstrap method is a resampling method in which the bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require the sampling distribution of the antidotal potency ratio. This can be a substantial help for toxicologists, who are directed to employ the Dixon up-and-down method, with its lower number of animals, to determine lethal dose 50 values for characterizing the investigated toxic molecules and eventually the antidotal protection provided by the test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes it easy to do the calculation using most programming software packages. PMID:25237618
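A percentile-bootstrap interval for a ratio of two group statistics, the shape of the problem the abstract describes, can be sketched as follows. For simplicity the ratio of group means stands in for the LD50-based antidotal potency ratio, and all dose values are invented:

```python
import random
import statistics

def bootstrap_ratio_ci(treated, control, n_boot=2000, level=0.95, seed=11):
    """Percentile bootstrap CI for a ratio of group means, standing in
    for the antidotal potency ratio (LD50 with / LD50 without antidote).
    Each group is resampled independently with replacement."""
    rng = random.Random(seed)
    ratios = sorted(
        statistics.mean(rng.choices(treated, k=len(treated)))
        / statistics.mean(rng.choices(control, k=len(control)))
        for _ in range(n_boot))
    lo = ratios[int((1 - level) / 2 * n_boot)]
    hi = ratios[int((1 + level) / 2 * n_boot) - 1]
    return lo, hi

# Hypothetical lethal-dose estimates (mg/kg) from repeated assays.
with_antidote = [18.2, 20.1, 19.4, 21.0, 18.8, 19.9]
without = [6.1, 5.8, 6.4, 6.0, 5.7, 6.3]
lo, hi = bootstrap_ratio_ci(with_antidote, without)
```

The point of the method is visible here: no sampling distribution for the ratio is ever written down; the interval comes entirely from the resampled ratios.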
Topics in Statistical Calibration
2014-03-27
on a parametric bootstrap where, instead of sampling directly from the residuals, samples are drawn from a normal distribution. This procedure will...addition to centering them (Davison and Hinkley, 1997). When there are outliers in the residuals, the bootstrap distribution of x̂0 can become skewed or...based and inversion methods using the linear mixed-effects model. Then, a simple parametric bootstrap algorithm is proposed that can be used to either
Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, Martijn W; van Buuren, Stef; Knol, Dirk L; van Mechelen, Willem; de Vet, Henrica CW
2007-01-01
Background Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection. Method In our prospective cohort study we merged data from three different randomized controlled trials (RCTs) to assess prognostic variables for chronicity of low back pain. Among the outcome and prognostic variables, data were missing in the range of 0 to 48.1%. We used four methods to investigate the influence of sampling and imputation variation, respectively: MI only, bootstrap only, and two methods that combine MI and bootstrapping. Variables were selected based on the inclusion frequency of each prognostic variable, i.e. the proportion of times that the variable appeared in the model. The discriminative and calibrative abilities of prognostic models developed by the four methods were assessed at different inclusion levels. Results We found that the effect of imputation variation on the inclusion frequency was larger than the effect of sampling variation. When MI and bootstrapping were combined over the range of 0% (full model) to 90% variable selection, bootstrap-corrected c-index values of 0.70 to 0.71 and slope values of 0.64 to 0.86 were found. Conclusion We recommend accounting for both imputation and sampling variation in sets of missing data. The new procedure of combining MI with bootstrapping for variable selection results in multivariable prognostic models with good performance and is therefore attractive to apply to data sets with missing values. PMID:17629912
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' (EFR) estimation, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled a varying number of times or with varying sample sizes (N = 500; ...; 1000), then applied to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five EFR methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those EFR methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies and the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods.
The uncertainties arising during the assessment of the EFR methods will be propagated through water security indicators referring to water scarcity and vulnerability, seeking to provide meaningful support to end-users and water managers facing the incorporation of uncertainties in the decision-making process.
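A minimal sketch of the moving-blocks bootstrap applied to a flow-duration statistic such as Q90. The synthetic lognormal "streamflow" record and the 30-day block length are illustrative assumptions, not the basin data used in the study:

```python
import numpy as np

rng = np.random.default_rng(2)

def q_exceedance(flows, p):
    """Flow exceeded p% of the time (e.g. Q90 = the 10th percentile)."""
    return np.percentile(flows, 100 - p)

def moving_blocks_resample(series, block_len, rng):
    """Concatenate randomly chosen overlapping blocks to preserve
    short-range temporal dependence, then trim to the original length."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([series[s:s + block_len] for s in starts])[:n]

# Synthetic 5-year daily "streamflow" record (lognormal, hypothetical).
flows = np.exp(rng.normal(3.0, 0.6, size=365 * 5))

q90_boot = [q_exceedance(moving_blocks_resample(flows, 30, rng), 90)
            for _ in range(300)]
ci = np.percentile(q90_boot, [2.5, 97.5])
print(ci)
```

Replacing `moving_blocks_resample` with a plain `rng.choice(flows, size=len(flows))` gives the conventional bootstrap the abstract compares against.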
Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E
2016-06-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.
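The building block of this method, the (unweighted) Bayesian bootstrap, can be sketched as follows: instead of resampling cases, draw Dirichlet(1, ..., 1) weights over the observed cases and recompute the weighted statistic. The paper's extension to unequal selection probabilities and two-stage designs is not reproduced here; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

y = rng.normal(10.0, 2.0, size=100)      # observed sample (synthetic)

B = 1000
means = np.empty(B)
for b in range(B):
    # One posterior draw of weights over the observed cases.
    w = rng.dirichlet(np.ones(len(y)))
    means[b] = np.dot(w, y)              # weighted mean under this draw

ci = np.percentile(means, [2.5, 97.5])
print(ci)
```

The weighted finite-population version replaces the uniform Dirichlet parameters with quantities derived from the case weights, which is what lets the resulting synthetic populations be treated as simple random samples at the imputation stage.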
Cell genealogies in a plant meristem deduced with the aid of a 'bootstrap' L-system.
Lück, J; Barlow, P W; Lück, H B
1994-01-01
The primary root meristem of maize (Zea mays L.) contains longitudinal files of cells arranged in groups of familial descent (sisters, cousins, etc.). These groups, or packets, show ordered sequences of cell division which are transverse with respect to the apico-basal axis of the root. The sequences have been analysed in three zones of the meristem during the course of the first four cell generations following germination. In this period, the number of cells in the packets increases from one to 16. Theoretically, there are 48 possible division pathways that lead to the eight-cell stage, and nearly 2 x 10(6) that lead to the 16-cell stage. However, analysis shows that only a few of all the possible pathways are used in any particular zone of the root. This restriction of pathways results from inherited sequences of asymmetric cell divisions which lead to sister cells of unequal length. All possible division pathways can be generated by deterministic 'bootstrap' L-systems which assign different lifespans to sister cells of successive generations and hence specify their subsequent sequence of divisions. These systems simulate propagating patterns of cell divisions which agree with those actually found within the growing packets that comprise the root meristem. The patterns of division are specific to cells originating in various regions of the meristem of the germinating root. The importance of such systems is that they simulate patterns of cellular proliferation where there is ancestral dependency. They can therefore be applied in other growing and proliferating systems where this is suspected.
Counting conformal correlators
NASA Astrophysics Data System (ADS)
Kravchuk, Petr; Simmons-Duffin, David
2018-02-01
We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.
Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen
2010-08-01
An important problem in the Intensive Care is how to predict on a given day of stay the eventual hospital mortality for a specific patient. A recent approach to solve this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods for prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition, we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
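A minimal sketch of the .632 bootstrap error estimate, err_632 = 0.368 * apparent error + 0.632 * out-of-bag error. The nearest-mean classifier and synthetic two-class data are hypothetical stand-ins for the FTS-based models evaluated in the study:

```python
import numpy as np

rng = np.random.default_rng(4)

def nearest_mean_fit(X, y):
    """Fit a two-class nearest-mean classifier (class centroids)."""
    return X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)

def nearest_mean_predict(model, X):
    m0, m1 = model
    d0 = ((X - m0) ** 2).sum(axis=1)
    d1 = ((X - m1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

# Synthetic two-class data in 2D.
n = 120
X = np.vstack([rng.normal(0, 1, (n // 2, 2)), rng.normal(2, 1, (n // 2, 2))])
y = np.repeat([0, 1], n // 2)

model = nearest_mean_fit(X, y)
apparent = np.mean(nearest_mean_predict(model, X) != y)   # optimistic

oob_errs = []
for _ in range(200):
    idx = rng.integers(0, n, size=n)                      # bootstrap sample
    oob = np.setdiff1d(np.arange(n), idx)                 # left-out cases
    if len(oob) == 0 or len(set(y[idx])) < 2:
        continue
    m = nearest_mean_fit(X[idx], y[idx])
    oob_errs.append(np.mean(nearest_mean_predict(m, X[oob]) != y[oob]))

err_632 = 0.368 * apparent + 0.632 * np.mean(oob_errs)    # the .632 estimate
print(err_632)
```

The weighting compensates for the pessimism of the out-of-bag error (each bootstrap model is trained on roughly 63.2% of the distinct cases) and the optimism of the apparent error.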
Elkomy, Mohammed H; Elmenshawe, Shahira F; Eid, Hussein M; Ali, Ahmed M A
2016-11-01
This work aimed at investigating the potential of solid lipid nanoparticles (SLN) as carriers for topical delivery of Ketoprofen (KP); evaluating a novel technique incorporating Artificial Neural Network (ANN) and clustered bootstrap for optimization of KP-loaded SLN (KP-SLN); and demonstrating a longitudinal dose response (LDR) modeling-based approach to compare the activity of topical non-steroidal anti-inflammatory drug formulations. KP-SLN was fabricated by a modified emulsion/solvent evaporation method. Box-Behnken design was implemented to study the influence of glycerylpalmitostearate-to-KP ratio, Tween 80, and lecithin concentrations on particle size, entrapment efficiency, and amount of drug permeated through rat skin in 24 hours. Following clustered bootstrap ANN optimization, the optimized KP-SLN was incorporated into an aqueous gel and evaluated for rheology, in vitro release, permeability, skin irritation and in vivo activity using carrageenan-induced rat paw edema model and LDR mathematical model to analyze the time course of anti-inflammatory effect at various application durations. Lipid-to-drug ratio of 7.85 [bootstrap 95%CI: 7.63-8.51], Tween 80 of 1.27% [bootstrap 95%CI: 0.601-2.40%], and Lecithin of 0.263% [bootstrap 95%CI: 0.263-0.328%] were predicted to produce optimal characteristics. Compared with profenid® gel, the optimized KP-SLN gel exhibited slower release, faster permeability, better texture properties, greater efficacy, and similar potency. SLNs are safe and effective permeation enhancers. ANN coupled with clustered bootstrap is a useful method for finding optimal solutions and estimating uncertainty associated with them. LDR models allow mechanistic understanding of comparative in vivo performances of different topical formulations, and help design efficient dermatological bioequivalence assessment methods.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
Using Cluster Bootstrapping to Analyze Nested Data With a Few Clusters.
Huang, Francis L
2018-04-01
Cluster randomized trials involving participants nested within intact treatment and control groups are commonly performed in various educational, psychological, and biomedical studies. However, recruiting and retaining intact groups present various practical, financial, and logistical challenges to evaluators and often, cluster randomized trials are performed with a low number of clusters (~20 groups). Although multilevel models are often used to analyze nested data, researchers may be concerned about potentially biased results due to having only a few groups under study. Cluster bootstrapping has been suggested as an alternative procedure when analyzing clustered data, though it has seen very little use in educational and psychological studies. Using a Monte Carlo simulation that varied the number of clusters, average cluster size, and intraclass correlations, we compared standard errors using cluster bootstrapping with those derived using ordinary least squares regression and multilevel models. Results indicate that cluster bootstrapping, though more computationally demanding, can be used as an alternative procedure for the analysis of clustered data when treatment effects at the group level are of primary interest. Supplementary material showing how to perform cluster bootstrapped regressions using R is also provided.
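The cluster bootstrap described here resamples whole clusters with replacement and takes the spread of the refitted estimates as the standard error. A sketch with hypothetical data-generating values (the abstract's own examples are in R):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic clustered data: 15 clusters of 20, cluster-level treatment,
# true treatment effect 0.8, cluster random effect SD 0.5.
n_clusters, m = 15, 20
cluster_effect = rng.normal(0, 0.5, size=n_clusters)
treat = np.repeat(rng.integers(0, 2, size=n_clusters), m)
cluster = np.repeat(np.arange(n_clusters), m)
y = 1.0 + 0.8 * treat + cluster_effect[cluster] + rng.normal(0, 1, size=n_clusters * m)

def treatment_effect(y, treat):
    return y[treat == 1].mean() - y[treat == 0].mean()

boot = []
for _ in range(500):
    picked = rng.integers(0, n_clusters, size=n_clusters)   # resample clusters
    keep = np.concatenate([np.where(cluster == c)[0] for c in picked])
    if len(set(treat[keep])) < 2:          # need both arms to estimate effect
        continue
    boot.append(treatment_effect(y[keep], treat[keep]))

se_cluster = np.std(boot, ddof=1)          # cluster-bootstrap standard error
print(se_cluster)
```

Resampling individual rows instead of clusters would understate this standard error, which is the failure mode the article is addressing.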
Prospects for steady-state scenarios on JET
NASA Astrophysics Data System (ADS)
Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; DeLa Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the
2007-09-01
In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (nl ~ 4 × 10^19 m^-3), with ITER-relevant safety factor (q95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, a new pellet injector for edge localized mode control together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (nl > 5 × 10^19 m^-3), with high βN (βN > 3.0) and a fraction of the bootstrap current within 60-70% at high toroidal field (~3.5 T).
Bootstrapping Least Squares Estimates in Biochemical Reaction Networks
Linder, Daniel F.
2015-01-01
The paper proposes new computational methods for computing confidence bounds for the least squares estimates (LSEs) of rate constants in mass-action biochemical reaction network and stochastic epidemic models. Such LSEs are obtained by fitting the set of deterministic ordinary differential equations (ODEs), corresponding to the large volume limit of a reaction network, to the network's partially observed trajectory treated as a continuous-time, pure jump Markov process. In the large volume limit the LSEs are asymptotically Gaussian, but their limiting covariance structure is complicated since it is described by a set of nonlinear ODEs which are often ill-conditioned and numerically unstable. The current paper considers two bootstrap Monte-Carlo procedures, based on the diffusion and linear noise approximations for pure jump processes, which allow one to avoid solving the limiting covariance ODEs. The results are illustrated with both in-silico and real data examples from the LINE 1 gene retrotranscription model and compared with those obtained using other methods. PMID:25898769
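A sketch of residual-bootstrap confidence bounds for a least-squares rate constant. A single first-order decay with a closed-form solution (dy/dt = -k*y, so y = y0*exp(-k*t)) stands in for the general mass-action network, so no ODE solver is needed; this is not the paper's diffusion or linear-noise procedure:

```python
import numpy as np

rng = np.random.default_rng(9)

# Noisy observations of a first-order decay (synthetic trajectory).
t = np.linspace(0.0, 3.0, 30)
y0, k_true = 10.0, 0.7
y = y0 * np.exp(-k_true * t) + rng.normal(0.0, 0.1, size=t.size)

def fit_k(t, y, y0):
    # Least squares on the linearized model log(y/y0) = -k*t.
    z = np.log(np.clip(y, 1e-6, None) / y0)   # clip keeps logs defined
    return -np.polyfit(t, z, 1)[0]

k_hat = fit_k(t, y, y0)
fitted = y0 * np.exp(-k_hat * t)
resid = y - fitted

# Residual bootstrap: resample residuals, rebuild the trajectory, refit k.
boot_k = [fit_k(t, fitted + rng.choice(resid, size=resid.size, replace=True), y0)
          for _ in range(500)]
ci = np.percentile(boot_k, [2.5, 97.5])
print(k_hat, ci)
```

For a real jump-process trajectory the resampling step would be replaced by simulation from the fitted diffusion or linear-noise approximation, which is the paper's contribution.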
Percolation in education and application in the 21st century
NASA Astrophysics Data System (ADS)
Adler, Joan; Elfenbaum, Shaked; Sharir, Liran
2017-03-01
Percolation, "so simple you could teach it to your wife" (Chuck Newman, last century) is an ideal system to introduce young students to phase transitions. Two recent projects in the Computational Physics group at the Technion make this easy. One is a set of analog models to be mounted on our walls and enable visitors to switch between samples to see which mixtures of glass and metal objects have a percolating current. The second is a website enabling the creation of stereo samples of two and three dimensional clusters (suited for viewing with Oculus Rift) on desktops, tablets and smartphones. Although there have been many physical applications for regular percolation in the past, for Bootstrap Percolation, where only sites with sufficient occupied neighbours remain active, there has not been a surfeit of condensed matter applications. We have found that the creation of diamond membranes for quantum computers can be modeled with a bootstrap process of graphitization in diamond, enabling prediction of optimal processing procedures.
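Bootstrap percolation in the culling form the abstract describes (sites stay active only if at least m neighbours are active) is easy to simulate. The grid size, occupation probability and periodic boundaries below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)

def bootstrap_percolate(grid, m=2):
    """Iteratively cull active sites with fewer than m active neighbours
    (periodic boundaries via np.roll) until a fixed point is reached."""
    grid = grid.copy()
    while True:
        nbrs = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
                np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
        culled = grid * (nbrs >= m)       # keep only well-supported sites
        if np.array_equal(culled, grid):
            return culled
        grid = culled

# Random initial occupation with probability 0.7 on a 50 x 50 lattice.
grid = (rng.random((50, 50)) < 0.7).astype(int)
final = bootstrap_percolate(grid, m=2)
print(grid.sum(), final.sum())
```

Sweeping the occupation probability and recording the size of the surviving cluster is the standard way to locate the transition this abstract alludes to.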
NASA Astrophysics Data System (ADS)
Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.
2010-02-01
Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.
NASA Astrophysics Data System (ADS)
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In the case of Cloud Computing, users pay for the usage of the computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses various research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.
Phylogenetic relationships among arecoid palms (Arecaceae: Arecoideae)
Baker, William J.; Norup, Maria V.; Clarkson, James J.; Couvreur, Thomas L. P.; Dowe, John L.; Lewis, Carl E.; Pintaud, Jean-Christophe; Savolainen, Vincent; Wilmot, Tomas; Chase, Mark W.
2011-01-01
Background and Aims The Arecoideae is the largest and most diverse of the five subfamilies of palms (Arecaceae/Palmae), containing >50 % of the species in the family. Despite its importance, phylogenetic relationships among Arecoideae are poorly understood. Here the most densely sampled phylogenetic analysis of Arecoideae available to date is presented. The results are used to test the current classification of the subfamily and to identify priority areas for future research. Methods DNA sequence data for the low-copy nuclear genes PRK and RPB2 were collected from 190 palm species, covering 103 (96 %) genera of Arecoideae. The data were analysed using the parsimony ratchet, maximum likelihood, and both likelihood and parsimony bootstrapping. Key Results and Conclusions Despite the recovery of paralogues and pseudogenes in a small number of taxa, PRK and RPB2 were both highly informative, producing well-resolved phylogenetic trees with many nodes well supported by bootstrap analyses. Simultaneous analyses of the combined data sets provided additional resolution and support. Two areas of incongruence between PRK and RPB2, relating to the placement of tribes Chamaedoreeae, Iriarteeae and Reinhardtieae, were strongly supported by the bootstrap; the causes of this incongruence remain uncertain. The current classification within Arecoideae was strongly supported by the present data. Of the 14 tribes and 14 sub-tribes in the classification, only five sub-tribes from tribe Areceae (Basseliniinae, Linospadicinae, Oncospermatinae, Rhopalostylidinae and Verschaffeltiinae) failed to receive support. Three major higher-level clades were strongly supported: (1) the RRC clade (Roystoneeae, Reinhardtieae and Cocoseae), (2) the POS clade (Podococceae, Oranieae and Sclerospermeae) and (3) the core arecoid clade (Areceae, Euterpeae, Geonomateae, Leopoldinieae, Manicarieae and Pelagodoxeae).
However, new data sources are required to elucidate ambiguities that remain in phylogenetic relationships among and within the major groups of Arecoideae, as well as within the Areceae, the largest tribe in the palm family. PMID:21325340
Comparison of parametric and bootstrap method in bioequivalence test.
Ahn, Byung-Jin; Yim, Dong-Seok
2009-10-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods we performed repeated estimation of bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver.3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log (AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log (AUC) datasets. Currently, the 80~125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
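The nonparametric percentile construction of a 90% CI for the test/reference ratio can be sketched as follows. The paired log-AUC data are synthetic, not the archived datasets, and the BCa correction discussed in the abstract is omitted:

```python
import numpy as np

rng = np.random.default_rng(7)

# Paired log-AUC values for reference and test products (synthetic,
# generated to be near-equivalent on purpose).
n = 24
log_ref = rng.normal(np.log(100.0), 0.2, size=n)
log_test = log_ref + rng.normal(np.log(0.98), 0.1, size=n)
diff = log_test - log_ref                       # within-subject log ratio

# Percentile bootstrap of the mean log ratio, back-transformed to a ratio.
boot = [np.mean(rng.choice(diff, size=n, replace=True)) for _ in range(2000)]
lo, hi = np.exp(np.percentile(boot, [5, 95]))   # 90% CI, as in BE testing

print(round(lo, 3), round(hi, 3))
print(0.80 <= lo and hi <= 1.25)                # the 80-125% acceptance rule
```

A BCa interval would adjust the two percentile levels for bias and skewness before back-transforming, which is why it can shift outside the parametric CI when the log data are non-normal.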
Prenatal Drug Exposure and Adolescent Cortisol Reactivity: Association with Behavioral Concerns.
Buckingham-Howes, Stacy; Mazza, Dayna; Wang, Yan; Granger, Douglas A; Black, Maureen M
2016-09-01
To examine stress reactivity in a sample of adolescents with prenatal drug exposure (PDE) by examining the consequences of PDE on stress-related adrenocortical reactivity, behavioral problems, and drug experimentation during adolescence. Participants (76 PDE, 61 non-drug exposed [NE]; 99% African-American; 50% male; mean age = 14.17 yr, SD = 1.17) provided a urine sample, completed a drug use questionnaire, and provided saliva samples (later assayed for cortisol) before and after a mild laboratory stress task. Caregivers completed the Behavior Assessment System for Children, Second Edition (BASC II) and reported their relationship to the adolescent. The NE group was more likely to exhibit task-related cortisol reactivity compared to the PDE group. Overall behavior problems and drug experimentation were comparable across groups with no differences between PDE and NE groups. In unadjusted mediation analyses, cortisol reactivity mediated the association between PDE and BASC II aggression scores (95% bootstrap confidence interval [CI], 0.04-4.28), externalizing problems scores (95% bootstrap CI, 0.03-4.50), and drug experimentation (95% bootstrap CI, 0.001-0.54). The associations remain with the inclusion of gender as a covariate but not when age is included. Findings support and expand current research in cortisol reactivity and PDE by demonstrating that cortisol reactivity attenuates the association between PDE and behavioral problems (aggression) and drug experimentation. If replicated, PDE may have long-lasting effects on stress-sensitive physiological mechanisms associated with behavioral problems (aggression) and drug experimentation in adolescence.
Deep learning ensemble with asymptotic techniques for oscillometric blood pressure estimation.
Lee, Soojeong; Chang, Joon-Hyuk
2017-11-01
This paper proposes a deep learning based ensemble regression estimator with asymptotic techniques and offers a method that can decrease uncertainty in oscillometric blood pressure (BP) measurements using bootstrap and Monte Carlo approaches. While the former is used to estimate SBP and DBP, the latter attempts to determine confidence intervals (CIs) for SBP and DBP based on oscillometric BP measurements. This work employs deep belief networks (DBN)-deep neural networks (DNN) to effectively estimate BPs based on oscillometric measurements. However, there are some inherent problems with these methods. First, it is not easy to determine the best DBN-DNN estimator, and worthy information might be omitted when selecting one DBN-DNN estimator and discarding the others. Additionally, our input feature vectors, obtained from only five measurements per subject, represent a very small sample size; this is a critical weakness when using the DBN-DNN technique and can cause overfitting or underfitting, depending on the structure of the algorithm. To address these problems, an ensemble with an asymptotic approach (based on combining the bootstrap with the DBN-DNN technique) is utilized to generate the pseudo features needed to estimate the SBP and DBP. In the first stage, the bootstrap-aggregation technique is used to create ensemble parameters. Afterward, the AdaBoost approach is employed for the second-stage SBP and DBP estimation. We then use the bootstrap and Monte Carlo techniques to determine the CIs based on the target BP estimated using the DBN-DNN ensemble regression estimator with the asymptotic technique in the third stage. The proposed method mitigates estimation uncertainty, such as a large standard deviation of error (SDE). Comparing the proposed DBN-DNN ensemble regression estimator with the DBN-DNN single regression estimator, we find that the SDEs of the SBP and DBP are reduced by 0.58 and 0.57 mmHg, respectively.
These results indicate that the proposed method enhances performance by 9.18% and 10.88% compared with the DBN-DNN single estimator. The proposed methodology improves the accuracy of BP estimation and reduces its uncertainty. Copyright © 2017 Elsevier B.V. All rights reserved.
Bootstrap investigation of the stability of a Cox regression model.
Altman, D G; Andersen, P K
1989-07-01
We describe a bootstrap investigation of the stability of a Cox proportional hazards regression model resulting from the analysis of a clinical trial of azathioprine versus placebo in patients with primary biliary cirrhosis. We have considered stability to refer both to the choice of variables included in the model and, more importantly, to the predictive ability of the model. In stepwise Cox regression analyses of 100 bootstrap samples using 17 candidate variables, the most frequently selected variables were those selected in the original analysis, and no other important variable was identified. Thus there was no reason to doubt the model obtained in the original analysis. For each patient in the trial, bootstrap confidence intervals were constructed for the estimated probability of surviving two years. It is shown graphically that these intervals are markedly wider than those obtained from the original model.
NASA Astrophysics Data System (ADS)
Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.
2018-04-01
This article addresses the problem of productively forming a set of informative measured features of an object under observation and/or control, using the authors' algorithms based on bootstrap and counter-bootstrap technologies for processing the results of measurements of various states of the object, with training samples of different sizes. The paper considers aggregation of specific indicators of informative capacity by linear, majority, logical, and "greedy" methods, applied both individually and in combination. The results of a computational experiment are discussed, and the conclusion is drawn that the proposed methods increase the efficiency of classifying the states of the object from measurement results.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing a good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters methods and related exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated for some well-known data sets available in software.
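As a rough reconstruction of the Boot.EXPOS idea (our sketch, not the authors' code): fit an exponential smoothing model, then bootstrap the one-step residuals to build an empirical forecast distribution. The series, smoothing constant, and sample sizes below are illustrative assumptions, and plain simple exponential smoothing stands in for the seasonal variants discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(9)

# Fabricated non-seasonal series: slow drift plus noise.
y = 10 + rng.normal(0, 1, size=120).cumsum() * 0.05 + rng.normal(0, 1, 120)

def ses(y, alpha=0.3):
    """Simple exponential smoothing; returns fitted values and final level."""
    level, fitted = y[0], np.empty_like(y)
    for t, obs in enumerate(y):
        fitted[t] = level
        level = alpha * obs + (1 - alpha) * level
    return fitted, level

fitted, level = ses(y)
resid = y[1:] - fitted[1:]            # one-step in-sample forecast errors

# Resample residuals with replacement to obtain 1,000 bootstrap forecasts
# for the next observation, then read off an empirical 90% interval.
forecasts = level + rng.choice(resid, size=1000, replace=True)
interval = np.percentile(forecasts, [5, 95])
print(interval)
```

For a double seasonal series the same residual-resampling step would be applied around a double seasonal Holt-Winters fit instead of SES.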
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance on the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine the mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different 'x' and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
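The set-size experiment can be mimicked with a toy cohort. The sketch below assumes a Gaussian cohort of 500 "sensitivities" (illustrative values, not the HFA data) and shows the general effect the study reports: larger resampled set sizes approximate the ground-truth descriptive statistics more closely.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth cohort of 500 mean sensitivities in dB.
cohort = rng.normal(loc=30.0, scale=2.0, size=500)

def descriptives(sample):
    """Mean, 5th/95th percentile distribution limits, and SD."""
    return np.array([sample.mean(),
                     np.percentile(sample, 5),
                     np.percentile(sample, 95),
                     sample.std(ddof=1)])

truth = descriptives(cohort)

def bootstrapped_error(x, n_resamples=1000):
    """Worst-case error of resampled descriptive statistics at set size x."""
    est = np.mean([descriptives(rng.choice(cohort, size=x, replace=True))
                   for _ in range(n_resamples)], axis=0)
    return np.abs(est - truth).max()

# Larger set sizes approximate the ground-truth statistics more closely,
# mainly because the percentile (distribution limit) estimates are less
# biased when each resampled set is bigger.
err_small, err_large = bootstrapped_error(30), bootstrapped_error(150)
print(err_small, err_large)
```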
Bootstrap data methodology for sequential hybrid model building
NASA Technical Reports Server (NTRS)
Volponi, Allan J. (Inventor); Brotherton, Thomas (Inventor)
2007-01-01
A method for modeling engine operation comprising the steps of: 1. collecting a first plurality of sensory data, 2. partitioning a flight envelope into a plurality of sub-regions, 3. assigning the first plurality of sensory data into the plurality of sub-regions, 4. generating an empirical model of at least one of the plurality of sub-regions, 5. generating a statistical summary model for at least one of the plurality of sub-regions, 6. collecting an additional plurality of sensory data, 7. partitioning the second plurality of sensory data into the plurality of sub-regions, 8. generating a plurality of pseudo-data using the empirical model, and 9. concatenating the plurality of pseudo-data and the additional plurality of sensory data to generate an updated empirical model and an updated statistical summary model for at least one of the plurality of sub-regions.
Rapid processing of PET list-mode data for efficient uncertainty estimation and data analysis
NASA Astrophysics Data System (ADS)
Markiewicz, P. J.; Thielemans, K.; Schott, J. M.; Atkinson, D.; Arridge, S. R.; Hutton, B. F.; Ourselin, S.
2016-07-01
In this technical note we propose a rapid and scalable software solution for the processing of PET list-mode data, which allows the efficient integration of list mode data processing into the workflow of image reconstruction and analysis. All processing is performed on the graphics processing unit (GPU), making use of streamed and concurrent kernel execution together with data transfers between disk and CPU memory as well as CPU and GPU memory. This approach leads to fast generation of multiple bootstrap realisations, and when combined with fast image reconstruction and analysis, it enables assessment of uncertainties of any image statistic and of any component of the image generation process (e.g. random correction, image processing) within reasonable time frames (e.g. within five minutes per realisation). This is of particular value when handling complex chains of image generation and processing. The software outputs the following: (1) estimate of expected random event data for noise reduction; (2) dynamic prompt and random sinograms of span-1 and span-11 and (3) variance estimates based on multiple bootstrap realisations of (1) and (2) assuming reasonable count levels for acceptable accuracy. In addition, the software produces statistics and visualisations for immediate quality control and crude motion detection, such as: (1) count rate curves; (2) centre of mass plots of the radiodistribution for motion detection; (3) video of dynamic projection views for fast visual list-mode skimming and inspection; (4) full normalisation factor sinograms. To demonstrate the software, we present an example of the above processing for fast uncertainty estimation of regional SUVR (standard uptake value ratio) calculation for a single PET scan of 18F-florbetapir using the Siemens Biograph mMR scanner.
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
1993-09-10
Large values of the likelihood ratio indicate that the event does not belong to the first class; the bootstrap technique is used here as well to set the critical value of the test. Baek, J., H. L. Gray, W. A. Woodward and M. D. Fisk (1993). A bootstrap generalized likelihood ratio test in discriminant analysis, Proc. 15th Annual Seismic Research Symposium, in press.
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on the mathematical statistics method using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U, k = 2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
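A minimal sketch of the combined approach, assuming simplified Nordtest-style formulas (within-lab reproducibility u(Rw) as an IQC CV%, u(bias) from the RMS of EQA biases) and fabricated QC data; the analyte values are illustrative, not the laboratory's results, and the exact formulas should be checked against the Nordtest guide.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical QC data: 6 months of IQC measurements (10^9/L) and the
# relative biases (%) observed in 12 EQA exercises.
iqc = rng.normal(loc=7.0, scale=0.14, size=120)
eqa_bias_pct = rng.normal(loc=0.5, scale=1.0, size=12)

def nordtest_expanded_uncertainty(iqc, eqa_bias_pct, k=2):
    """Nordtest-style MU: combine within-lab reproducibility and bias."""
    u_rw = 100 * iqc.std(ddof=1) / iqc.mean()        # u(Rw) as a CV%
    rms_bias = np.sqrt(np.mean(eqa_bias_pct ** 2))   # RMS of EQA biases
    u_cref = eqa_bias_pct.std(ddof=1) / np.sqrt(eqa_bias_pct.size)
    u_bias = np.sqrt(rms_bias ** 2 + u_cref ** 2)
    return k * np.sqrt(u_rw ** 2 + u_bias ** 2)      # expanded U, %

u_direct = nordtest_expanded_uncertainty(iqc, eqa_bias_pct)

# Bootstrap the IQC and EQA samples to turn the small samples into many
# resampled "large" samples and obtain a distribution for U itself.
boot_u = np.array([
    nordtest_expanded_uncertainty(
        rng.choice(iqc, size=iqc.size, replace=True),
        rng.choice(eqa_bias_pct, size=eqa_bias_pct.size, replace=True))
    for _ in range(1000)
])
print(u_direct, boot_u.mean())
```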
NASA Astrophysics Data System (ADS)
Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin
2018-04-01
Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
1995-01-01
Use of the bootstrap method in a canonical correlation analysis to evaluate the replicability of a study's results is illustrated. More confidence may be vested in research results that replicate. (SLD)
Towards a bootstrap approach to higher orders of epsilon expansion
NASA Astrophysics Data System (ADS)
Dey, Parijat; Kaviraj, Apratim
2018-02-01
We employ a hybrid approach in determining the anomalous dimension and OPE coefficient of higher spin operators in the Wilson-Fisher theory. First we do a large spin analysis for CFT data, where we use results obtained from the usual and the Mellin bootstrap and also from the Feynman diagram literature. This gives new predictions at O(ε⁴) and O(ε⁵) for anomalous dimensions and OPE coefficients, and also provides a cross-check for the results from the Mellin bootstrap. These higher orders get contributions from all higher spin operators in the crossed channel. We also use the bootstrap-in-Mellin-space method for φ³ theory in the d = 6 − ε CFT, where we calculate general higher spin OPE data. We demonstrate a higher loop order calculation in this approach by summing over contributions from higher spin operators of the crossed channel in the same spirit as before.
Carnegie, Nicole Bohme
2011-04-15
The incidence of new infections is a key measure of the status of the HIV epidemic, but accurate measurement of incidence is often constrained by limited data. Karon et al. (Statist. Med. 2008; 27:4617–4633) developed a model to estimate the incidence of HIV infection from surveillance data with biologic testing for recent infection for newly diagnosed cases. This method has been implemented by public health departments across the United States and is behind the new national incidence estimates, which are about 40 per cent higher than previous estimates. We show that the delta method approximation given for the variance of the estimator is incomplete, leading to an inflated variance estimate. This contributes to the generation of overly conservative confidence intervals, potentially obscuring important differences between populations. We demonstrate via simulation that an innovative model-based bootstrap method using the specified model for the infection and surveillance process improves confidence interval coverage and adjusts for the bias in the point estimate. Confidence interval coverage is about 94–97 per cent after correction, compared with 96–99 per cent before. The simulated bias in the estimate of incidence ranges from −6.3 to +14.6 per cent under the original model but is consistently under 1 per cent after correction by the model-based bootstrap. In an application to data from King County, Washington in 2007 we observe correction of 7.2 per cent relative bias in the incidence estimate and a 66 per cent reduction in the width of the 95 per cent confidence interval using this method. We provide open-source software to implement the method that can also be extended for alternate models.
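The model-based bootstrap idea used here (simulate from the fitted model, re-estimate, and use the replicates for bias correction and interval construction) can be sketched on a toy rate-estimation problem. The estimator and data below are illustrative stand-ins, not the Karon et al. surveillance model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy problem: estimate an event rate from binary outcomes with a crude
# estimator that is floored at one case per cohort (a deliberate, mild bias).
true_rate, n = 0.02, 400
cases = rng.binomial(1, true_rate, size=n)
rate_hat = max(cases.mean(), 1.0 / n)

# Model-based (parametric) bootstrap: simulate new cohorts from the fitted
# model, re-apply the full estimation procedure to each replicate.
boot = np.array([
    max(rng.binomial(1, rate_hat, size=n).mean(), 1.0 / n)
    for _ in range(2000)
])

bias = boot.mean() - rate_hat         # estimated bias of the estimator
rate_corrected = rate_hat - bias      # equivalently 2*rate_hat - boot.mean()
ci = np.percentile(boot - bias, [2.5, 97.5])
print(rate_hat, rate_corrected, ci)
```

Because each replicate passes through the same estimation procedure as the real data, the bootstrap distribution reflects the full infection-and-surveillance process rather than a delta-method approximation.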
Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.
Falk, Carl F; Biesanz, Jeremy C
2011-11-30
Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
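The percentile bootstrap for an indirect effect can be sketched with observed (rather than latent) variables. The paths, sample size, and OLS-based estimator below are illustrative assumptions, not the study's latent-variable models.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated mediation data: X -> M -> Y with paths a = b = 0.39,
# matching one of the path values used in the simulation design.
n = 200
x = rng.normal(size=n)
m = 0.39 * x + rng.normal(size=n)
y = 0.39 * m + rng.normal(size=n)

def indirect_effect(x, m, y):
    """ab from two OLS fits: M ~ X gives a; Y ~ M + X gives b."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

# Percentile (PC) bootstrap: resample cases (the same index vector for all
# three variables), recompute ab, and take the 2.5/97.5 percentiles.
boot_ab = np.empty(2000)
for i in range(2000):
    s = rng.choice(n, size=n, replace=True)
    boot_ab[i] = indirect_effect(x[s], m[s], y[s])
ci = np.percentile(boot_ab, [2.5, 97.5])
print(ci)  # if the interval excludes zero, a nonzero indirect effect is inferred
```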
Combining test statistics and models in bootstrapped model rejection: it is a balancing act
2014-01-01
Background: Model rejections lie at the heart of systems biology, since they provide conclusive statements: that the corresponding mechanistic assumptions do not serve as valid explanations for the experimental data. Rejections are usually done using e.g. the chi-square test (χ²) or the Durbin-Watson test (DW). Analytical formulas for the corresponding distributions rely on assumptions that typically are not fulfilled. This problem is partly alleviated by the usage of bootstrapping, a computationally heavy approach to calculate an empirical distribution. Bootstrapping also allows for a natural extension to estimation of joint distributions, but this feature has so far been little exploited. Results: We herein show that simplistic combinations of bootstrapped tests, like the max or min of the individual p-values, give inconsistent, i.e. overly conservative or liberal, results. A new two-dimensional (2D) approach based on parametric bootstrapping, on the other hand, is found both consistent and with a higher power than the individual tests, when tested on static and dynamic examples where the truth is known. In the same examples, the most superior test is a 2D χ² vs. χ², where the second χ²-value comes from an additional help model, and its ability to describe bootstraps from the tested model. This superiority is lost if the help model is too simple, or too flexible. If a useful help model is found, the most powerful approach is the bootstrapped log-likelihood ratio (LHR). We show that this is because the LHR is one-dimensional, because the second dimension comes at a cost, and because the LHR has retained most of the crucial information in the 2D distribution. These approaches statistically resolve a previously published rejection example for the first time. Conclusions: We have shown how to, and how not to, combine tests in a bootstrap setting, when the combination is advantageous, and when it is advantageous to include a second model.
These results also provide a deeper insight into the original motivation for formulating the LHR, for the more general setting of nonlinear and non-nested models. These insights are valuable in cases when accuracy and power, rather than computational speed, are prioritized. PMID:24742065
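The parametric-bootstrap ingredient of these tests can be illustrated on a toy model. The linear model, noise level, and χ²-type statistic below are illustrative stand-ins for the paper's dynamic systems-biology models.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy model y = k*x with known noise; "observed" data simulated once.
x = np.linspace(0, 1, 20)
sigma = 0.1
y_obs = 2.0 * x + rng.normal(0, sigma, size=x.size)

def fit_and_chi2(y):
    """Fit the slope by least squares and return the chi-square statistic."""
    k = (x @ y) / (x @ x)
    return ((y - k * x) ** 2).sum() / sigma ** 2

t_obs = fit_and_chi2(y_obs)

# Parametric bootstrap: simulate datasets from the *fitted* model, refit
# each one, and use the refitted statistics as the empirical null
# distribution (no reliance on the analytical chi-square distribution).
k_hat = (x @ y_obs) / (x @ x)
t_boot = np.array([
    fit_and_chi2(k_hat * x + rng.normal(0, sigma, size=x.size))
    for _ in range(2000)
])
p_value = (t_boot >= t_obs).mean()
print(t_obs, p_value)
```

The 2D approach in the paper pairs such a statistic with a second one (e.g. from a help model) and works with their joint bootstrap distribution instead of a single empirical tail.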
A Critical Meta-Analysis of Lens Model Studies in Human Judgment and Decision-Making
Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W.
2013-01-01
Achieving accurate judgment ('judgmental achievement') is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges with linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regard to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regard to the success of bootstrapping with psychometrically corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias.
Comparison of the results of the psychometric meta-analysis with those of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to a) an increase in the values of the lens model components, b) reduced heterogeneity between studies, and c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for demonstrating the success of bootstrapping. PMID:24391781
Migration of the ATLAS Metadata Interface (AMI) to Web 2.0 and cloud
NASA Astrophysics Data System (ADS)
Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.
2015-12-01
The ATLAS Metadata Interface (AMI), a mature application with more than 10 years of existence, is currently being adapted to recently available technologies. The web interfaces, which previously manipulated XML documents using XSL transformations, are being migrated to Asynchronous JavaScript (AJAX). Web development is considerably simplified by the introduction of a framework based on JQuery and Twitter Bootstrap. Finally, the AMI services are being migrated to an OpenStack cloud infrastructure.
1999-01-01
Genetic distances and identities and Rogers' genetic distances were clustered by the unweighted pair group method using arithmetic averages (UPGMA). Trees were produced in PHYLIP 3.X (Seattle, WA) using the NEIGHBOR program with the UPGMA option, and a phenogram was produced with DRAWGRAM, also in PHYLIP. RAPDBOOT was used to generate 100 pseudoreplicate distance matrices, which were collapsed to form 100 trees with UPGMA; the bootstrap consensus tree was derived from the 100 trees.
Bootstrapping Student Understanding of What Is Going on in Econometrics.
ERIC Educational Resources Information Center
Kennedy, Peter E.
2001-01-01
Explains that econometrics is an intellectual game played by rules based on the sampling distribution concept. Contains explanations for why many students are uncomfortable with econometrics. Encourages instructors to use explain-how-to-bootstrap exercises to promote student understanding. (RLH)
2013-01-01
Background: Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods: Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results: The statistical significance of the RV increased as the magnitude of the denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions: The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
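A sketch of the bootstrap procedure for the RV, assuming simulated patient groups and a hand-rolled one-way ANOVA F statistic; the group structure, sample size, and effect sizes are illustrative, not the CKD data.

```python
import numpy as np

rng = np.random.default_rng(5)

def f_stat(scores, groups):
    """One-way ANOVA F statistic across clinically defined groups."""
    grand = scores.mean()
    levels = np.unique(groups)
    ssb = sum(len(scores[groups == g]) * (scores[groups == g].mean() - grand) ** 2
              for g in levels)
    ssw = sum(((scores[groups == g] - scores[groups == g].mean()) ** 2).sum()
              for g in levels)
    dfb, dfw = len(levels) - 1, len(scores) - len(levels)
    return (ssb / dfb) / (ssw / dfw)

# Simulated stand-in for 453 patients in 3 severity groups: a reference
# measure and a correlated, somewhat less discriminating comparator.
n = 453
groups = rng.integers(0, 3, size=n)
ref = groups * 1.0 + rng.normal(size=n)
comp = 0.7 * ref + 0.3 * groups + rng.normal(size=n)

# RV = F(comparator) / F(reference), then a bootstrap 95% CI for it:
# resample patients with replacement and recompute the ratio each time.
rv = f_stat(comp, groups) / f_stat(ref, groups)
boot_rv = np.empty(500)
for i in range(500):
    s = rng.choice(n, size=n, replace=True)
    boot_rv[i] = f_stat(comp[s], groups[s]) / f_stat(ref[s], groups[s])
ci = np.percentile(boot_rv, [2.5, 97.5])
print(rv, ci)
```

An RV whose bootstrap CI excludes 1 indicates the comparator discriminates significantly better (or worse) than the reference measure.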
Forensic surface metrology: tool mark evidence.
Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K
2011-01-01
Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary, bootstrap-based computations for estimated error rates were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
Reassessing the NTCTCS Staging Systems for Differentiated Thyroid Cancer, Including Age at Diagnosis
McLeod, Donald S.A.; Jonklaas, Jacqueline; Brierley, James D.; Ain, Kenneth B.; Cooper, David S.; Fein, Henry G.; Haugen, Bryan R.; Ladenson, Paul W.; Magner, James; Ross, Douglas S.; Skarulis, Monica C.; Steward, David L.; Xing, Mingzhao; Litofsky, Danielle R.; Maxon, Harry R.
2015-01-01
Background: Thyroid cancer is unique for having age as a staging variable. Recently, the commonly used age cut-point of 45 years has been questioned. Objective: This study assessed alternate staging systems on the outcome of overall survival, and compared these with current National Thyroid Cancer Treatment Cooperative Study (NTCTCS) staging systems for papillary and follicular thyroid cancer. Methods: A total of 4721 patients with differentiated thyroid cancer were assessed. Five potential alternate staging systems were generated at age cut-points in five-year increments from 35 to 70 years, and tested for model discrimination (Harrell's C-statistic) and calibration (R2). The best five models for papillary and follicular cancer were further tested with bootstrap resampling and significance testing for discrimination. Results: The best five alternate papillary cancer systems had age cut-points of 45–50 years, with the highest scoring model using 50 years. No significant difference in C-statistic was found between the best alternate and current NTCTCS systems (p = 0.200). The best five alternate follicular cancer systems had age cut-points of 50–55 years, with the highest scoring model using 50 years. All five best alternate staging systems performed better compared with the current system (p = 0.003–0.035). There was no significant difference in discrimination between the best alternate system (cut-point age 50 years) and the best system of cut-point age 45 years (p = 0.197). Conclusions: No alternate papillary cancer systems assessed were significantly better than the current system. New alternate staging systems for follicular cancer appear to be better than the current NTCTCS system, although they require external validation. PMID:26203804
Four Bootstrap Confidence Intervals for the Binomial-Error Model.
ERIC Educational Resources Information Center
Lin, Miao-Hsiang; Hsiung, Chao A.
1992-01-01
Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained, the theoretical foundation of each method, and each method's relevance and range for modeling true-score uncertainty are discussed. (SLD)
Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.
ERIC Educational Resources Information Center
Habing, Brian
2001-01-01
Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)
Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications
NASA Technical Reports Server (NTRS)
Hughes, William O.; Paez, Thomas L.
2006-01-01
This paper discusses the Bootstrap Method for specification of vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available, and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
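The core of the approach can be sketched in a few lines (a distribution-free illustration, not the authors' code; the flight-level data and the P95/50 convention shown are assumptions for the sketch): each bootstrap replicate resamples the measured levels and records the desired probability level, and the test level at the desired confidence is then read from the replicate distribution.

```python
import random

def empirical_quantile(xs, q):
    """Simple empirical quantile (order-statistic style)."""
    xs = sorted(xs)
    return xs[min(int(q * len(xs)), len(xs) - 1)]

def bootstrap_test_level(levels_db, prob=0.95, conf=0.50, n_boot=2000, seed=0):
    """Distribution-free estimate of the `prob` quantile of the
    environment with `conf` confidence, via percentile bootstrap."""
    rng = random.Random(seed)
    n = len(levels_db)
    reps = sorted(
        empirical_quantile(rng.choices(levels_db, k=n), prob)
        for _ in range(n_boot)
    )
    return empirical_quantile(reps, conf)

# hypothetical acoustic levels (dB) from repeated flight measurements
levels = [128.1, 130.4, 127.5, 131.2, 129.8, 130.9, 128.7, 132.0, 129.1, 130.2]
level = bootstrap_test_level(levels)  # a P95/50-style level in dB
```

Unlike the Normal Tolerance Limit method, nothing here assumes the levels are normally distributed; the data themselves supply the variability.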
The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.
Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi
2017-03-01
The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of the interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we cannot be sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity is the probability (Ps) of obtaining a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with its closest outgroup sequence, and the computer program RESTA for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Closure of the operator product expansion in the non-unitary bootstrap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
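One common way to build such a bootstrap test is sketched below (a hedged illustration: the resampling scheme shown, parametric resampling from the estimated marginals under the independence null, is one standard choice and may differ from the article's; the table is invented).

```python
import random

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return sum(
        (table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
        for i in range(len(rows)) for j in range(len(cols))
    )

def bootstrap_chi2_pvalue(table, n_boot=2000, seed=0):
    """Bootstrap p-value: resample n observations from the product of
    the estimated marginals (the independence null) and compare."""
    rng = random.Random(seed)
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    probs = [ri * cj / (n * n) for ri in rows for cj in cols]
    cells = list(range(len(probs)))
    obs = chi2_stat(table)
    count = 0
    for _ in range(n_boot):
        draw = rng.choices(cells, weights=probs, k=n)
        sim = [[0] * len(cols) for _ in rows]
        for c in draw:
            sim[c // len(cols)][c % len(cols)] += 1
        # skip the rare replicate with an empty margin (chi2 undefined)
        if 0 in [sum(r) for r in sim] or 0 in [sum(c) for c in zip(*sim)]:
            continue
        if chi2_stat(sim) >= obs:
            count += 1
    return count / n_boot

observed = [[30, 5], [4, 28]]  # hypothetical 2x2 table with strong dependence
p_value = bootstrap_chi2_pvalue(observed)
```

Because the null distribution is simulated rather than taken from the asymptotic chi-square, the test keeps its nominal level even for small cell frequencies, which is the point of the article.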
Closure of the operator product expansion in the non-unitary bootstrap
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-07
NASA Astrophysics Data System (ADS)
Poli, Francesca
2012-10-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities in a wide range of βN, reducing the no-wall limit. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC [1]. Fully non-inductive configurations with current in the range of 7-10 MA and various heating mixes (NB, EC, IC and LH) have been studied against variations of the pressure profile peaking and of the Greenwald fraction. It is found that stable equilibria have qmin> 2 and moderate ITBs at 2/3 of the minor radius [2]. The ExB flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H&CD sources that maintain reverse or weak magnetic shear profiles throughout the discharge and ρ(qmin)>=0.5 are the focus of this work. The ITER EC upper launcher, designed for NTM control, can provide enough current drive off-axis to sustain moderate ITBs at mid-radius and maintain a non-inductive current of 8-9MA and H98>=1.5 with the day one heating mix. LH heating and current drive is effective in modifying the current profile off-axis, facilitating the formation of stronger ITBs in the rampup phase, their sustainment at larger radii and larger bootstrap fraction. The implications for steady state operation and fusion performance are discussed.[4pt] [1] Jardin S.C. et al, J. Comput. Phys. 66 (1986) 481[0pt] [2] Poli F.M. et al, Nucl. Fusion 52 (2012) 063027.
Confidence Interval Coverage for Cohen's Effect Size Statistic
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2006-01-01
Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and biased-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…
A Bootstrap Procedure of Propensity Score Estimation
ERIC Educational Resources Information Center
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
1984-09-28
variables before simulation of model - Search for reality checks - Express uncertainty as a probability density distribution. ... probability that the software contains errors. This prior is updated as test failure data are accumulated. Only a p of 1 (software known to contain...discussed; both parametric and nonparametric versions are presented. It is shown by the author that the bootstrap underlies the jackknife method and
Gueto, Carlos; Ruiz, José L; Torres, Juan E; Méndez, Jefferson; Vivas-Reyes, Ricardo
2008-03-01
Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on a series of benzotriazine derivatives, as Src inhibitors. Ligand molecular superimposition on the template structure was performed by the database alignment method. A statistically significant model was established from 72 molecules and validated with a test set of six compounds. The CoMFA model yielded a q(2)=0.526, non cross-validated R(2) of 0.781, F value of 88.132, bootstrapped R(2) of 0.831, standard error of prediction=0.587, and standard error of estimate=0.351, while the CoMSIA model yielded the best predictive model with a q(2)=0.647, non cross-validated R(2) of 0.895, F value of 115.906, bootstrapped R(2) of 0.953, standard error of prediction=0.519, and standard error of estimate=0.178. The contour maps obtained from 3D-QSAR studies were appraised for activity trends in the molecules analyzed. Results indicate that small steric volumes in the hydrophobic region, electron-withdrawing groups next to the aryl linker region, and atoms close to the solvent-accessible region increase the Src inhibitory activity of the compounds. In fact, by adding substituents at positions 5, 6, and 8 of the benzotriazine nucleus, new compounds with higher predicted activity were generated. The data generated from the present study will further help to design novel, potent, and selective Src inhibitors as anticancer therapeutic agents.
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
Bootstrapping the Syntactic Bootstrapper: Probabilistic Labeling of Prosodic Phrases
ERIC Educational Resources Information Center
Gutman, Ariel; Dautriche, Isabelle; Crabbé, Benoît; Christophe, Anne
2015-01-01
The "syntactic bootstrapping" hypothesis proposes that syntactic structure provides children with cues for learning the meaning of novel words. In this article, we address the question of how children might start acquiring some aspects of syntax before they possess a sizeable lexicon. The study presents two models of early syntax…
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Bootstrapping N=2 chiral correlators
NASA Astrophysics Data System (ADS)
Lemos, Madalena; Liendo, Pedro
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
The effect of anisotropic heat transport on magnetic islands in 3-D configurations
NASA Astrophysics Data System (ADS)
Schlutt, M. G.; Hegna, C. C.
2012-08-01
An analytic theory of nonlinear pressure-induced magnetic island formation using a boundary layer analysis is presented. This theory extends previous work by including the effects of finite parallel heat transport and is applicable to general three dimensional magnetic configurations. In this work, particular attention is paid to the role of finite parallel heat conduction in the context of pressure-induced island physics. It is found that localized currents that require self-consistent deformation of the pressure profile, such as resistive interchange and bootstrap currents, are attenuated by finite parallel heat conduction when the magnetic islands are sufficiently small. However, these anisotropic effects do not change saturated island widths caused by Pfirsch-Schlüter current effects. Implications for finite pressure-induced island healing are discussed.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
ERIC Educational Resources Information Center
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning
ERIC Educational Resources Information Center
Luntley, Michael
2017-01-01
This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…
Higher curvature gravities, unlike GR, cannot be bootstrapped from their (usual) linearizations
NASA Astrophysics Data System (ADS)
Deser, S.
2017-12-01
We show that higher curvature order gravities, in particular the propagating quadratic curvature models, cannot be derived by self-coupling from their linear, flat space, forms, except through an unphysical version of linearization; only GR can. Separately, we comment on an early version of the self-coupling bootstrap.
The new version of EPA’s positive matrix factorization (EPA PMF) software, 5.0, includes three error estimation (EE) methods for analyzing factor analytic solutions: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP)...
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
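The underlying computation can be sketched generically (a plain bootstrap of the coefficient of variation in Python; Bootsie itself operates on AFLP marker data and its internals are not shown here, so the data below are illustrative):

```python
import random
import statistics

def bootstrap_cv(values, n_boot=2000, seed=0):
    """Point estimate and 95% percentile bootstrap interval for the
    coefficient of variation (sd / mean) of a sample."""
    rng = random.Random(seed)
    n = len(values)
    def cv(xs):
        return statistics.stdev(xs) / statistics.fmean(xs)
    reps = sorted(cv(rng.choices(values, k=n)) for _ in range(n_boot))
    return cv(values), (reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot) - 1])

# hypothetical marker band intensities, stand-ins for AFLP data
markers = [0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58, 0.40, 0.50]
est, (lo, hi) = bootstrap_cv(markers)
```

Batch processing, as Bootsie offers, amounts to running this loop over many populations and exporting the resulting estimates and intervals.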
How to Bootstrap a Human Communication System
ERIC Educational Resources Information Center
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied into afterward research to measure relative efficiency and productivity of Chinese hospitals so as to better serve for efficiency improvement and related decision making. © The Author(s) 2015.
Weak percolation on multiplex networks
NASA Astrophysics Data System (ADS)
Baxter, Gareth J.; Dorogovtsev, Sergey N.; Mendes, José F. F.; Cellai, Davide
2014-04-01
Bootstrap percolation is a simple but nontrivial model. It has applications in many areas of science and has been explored on random networks for several decades. In single-layer (simplex) networks, it has been recently observed that bootstrap percolation, which is defined as an incremental process, can be seen as the opposite of pruning percolation, where nodes are removed according to a connectivity rule. Here we propose models of both bootstrap and pruning percolation for multiplex networks. We collectively refer to these two models with the concept of "weak" percolation, to distinguish them from the somewhat classical concept of ordinary ("strong") percolation. While the two models coincide in simplex networks, we show that they decouple when considering multiplexes, giving rise to a wealth of critical phenomena. Our bootstrap model constitutes the simplest example of a contagion process on a multiplex network and has potential applications in critical infrastructure recovery and information security. Moreover, we show that our pruning percolation model may provide a way to diagnose missing layers in a multiplex network. Finally, our analytical approach allows us to calculate critical behavior and characterize critical clusters.
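The bootstrap (activation) side of the model is easy to state in code. Below is a single-layer sketch of the incremental process (the multiplex version of the paper couples such rules across layers; the toy graph and threshold here are illustrative):

```python
def bootstrap_percolation(adj, seeds, k):
    """Bootstrap percolation: an inactive node activates once it has at
    least k active neighbors; iterate to a fixed point."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in adj.items():
            if v not in active and sum(u in active for u in nbrs) >= k:
                active.add(v)
                changed = True
    return active

# toy graph: a 4-node path with chords, given as an adjacency dict
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
activated = bootstrap_percolation(adj, seeds={0, 1}, k=2)  # all four nodes
```

Pruning percolation runs the same threshold rule in the opposite direction, removing nodes that fall below the connectivity requirement; in single-layer networks the two fixed points coincide, which is the symmetry the paper shows breaking on multiplexes.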
Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J
2018-05-01
Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.
NASA Astrophysics Data System (ADS)
Chatthong, B.; Onjun, T.
2016-01-01
A set of heat and particle transport equations with the inclusion of E × B flow and magnetic shear is used to understand the formation and behaviors of edge transport barriers (ETBs) and internal transport barriers (ITBs) in tokamak plasmas based on two-field bifurcation concept. A simple model that can describe the E × B flow shear and magnetic shear effect in tokamak plasma is used for anomalous transport suppression with the effect of bootstrap current included. Consequently, conditions and formations of ETB and ITB can be visualized and studied. It can be seen that the ETB formation depends sensitively on the E × B flow shear suppression with small dependence on the magnetic shear suppression. However, the ITB formation depends sensitively on the magnetic shear suppression with a small dependence on the E × B flow shear suppression. Once the H-mode is achieved, the s-curve bifurcation diagram is modified due to an increase of bootstrap current at the plasma edge, resulting in reductions of both L-H and H-L transition thresholds with stronger hysteresis effects. It is also found that both ITB and ETB widths appear to be governed by heat or particle sources and the location of the current peaking. In addition, at a marginal flux just below the L-H threshold, a small perturbation in terms of heat or density fluctuation can result in a transition, which can remain after the perturbation is removed due to the hysteresis effect.
Efficiency determinants and capacity issues in Brazilian for-profit hospitals.
Araújo, Cláudia; Barros, Carlos P; Wanke, Peter
2014-06-01
This paper reports on the use of different approaches for assessing efficiency of a sample of major Brazilian for-profit hospitals. Starting out with the bootstrapping technique, several DEA estimates were generated, allowing the use of confidence intervals and bias correction in central estimates to test for significant differences in efficiency levels and input-decreasing/output-increasing potentials. The findings indicate that efficiency is mixed in Brazilian for-profit hospitals. Opportunities for accommodating future demand appear to be scarce and strongly dependent on particular conditions related to the accreditation and specialization of a given hospital.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2014-06-15
The diamagnetic drift effects on the low-n magnetohydrodynamic instabilities at the high-mode (H-mode) pedestal are investigated in this paper with the inclusion of bootstrap current for equilibrium and rotation effects for stability, where n is the toroidal mode number. The AEGIS (Adaptive EiGenfunction Independent Solutions) code [L. J. Zheng and M. T. Kotschenreuther, J. Comp. Phys. 211 (2006)] is extended to include the diamagnetic drift effects. This can be viewed as the lowest order approximation of the finite Larmor radius effects in consideration of the pressure gradient steepness at the pedestal. The H-mode discharges at the Joint European Torus are reconstructed numerically using the VMEC code [P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)], with bootstrap current taken into account. Generally speaking, the diamagnetic drift effects are stabilizing. Our results show that the effectiveness of diamagnetic stabilization depends sensitively on the safety factor value (q{sub s}) at the safety-factor reversal or plateau region. The diamagnetic stabilization is weaker when q{sub s} is well above an integer, and stronger when q{sub s} is below an integer or only slightly above it. We also find that the diamagnetic drift effects depend sensitively on the rotation direction. The diamagnetic stabilization in the co-rotation case is stronger than in the counter-rotation case with respect to the ion diamagnetic drift direction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wingen, Andreas; Ferraro, Nathaniel M.; Shafer, Morgan W.
Calculations of the plasma response to applied non-axisymmetric fields in several DIII-D discharges show that predicted displacements depend strongly on the edge current density. This result is found using both a linear two-fluid-MHD model (M3D-C1) and a nonlinear ideal-MHD model (VMEC). Furthermore, it is observed that the probability of a discharge being edge localized mode (ELM)-suppressed is most closely related to the edge current density, as opposed to the pressure gradient. It is found that discharges with a stronger kink response are closer to the peeling–ballooning stability limit in ELITE simulations and eventually cross into the unstable region, causing ELMs to reappear. Thus for effective ELM suppression, the RMP has to prevent the plasma from generating a large kink response, associated with ELM instability. Experimental observations are in agreement with the finding; discharges which have a strong kink response in the MHD simulations show ELMs or ELM mitigation during the RMP phase of the experiment, while discharges with a small kink response in the MHD simulations are fully ELM suppressed in the experiment by the applied resonant magnetic perturbation. The results are cross-checked against modeled 3D ideal MHD equilibria using the VMEC code. The procedure of constructing optimal 3D equilibria for diverted H-mode discharges using VMEC is presented. As a result, kink displacements in VMEC are found to scale with the edge current density, similar to M3D-C1, but the displacements are smaller. A direct correlation in the flux surface displacements to the bootstrap current is shown.
2015-09-03
Calculations of the plasma response to applied non-axisymmetric fields in several DIII-D discharges show that predicted displacements depend strongly on the edge current density. This result is found using both a linear two-fluid-MHD model (M3D-C1) and a nonlinear ideal-MHD model (VMEC). Furthermore, it is observed that the probability of a discharge being edge localized mode (ELM)-suppressed is most closely related to the edge current density, as opposed to the pressure gradient. It is found that discharges with a stronger kink response are closer to the peeling–ballooning stability limit in ELITE simulations and eventually cross into the unstable region, causing ELMs to reappear. Thus for effective ELM suppression, the RMP has to prevent the plasma from generating a large kink response, associated with ELM instability. Experimental observations are in agreement with the finding; discharges which have a strong kink response in the MHD simulations show ELMs or ELM mitigation during the RMP phase of the experiment, while discharges with a small kink response in the MHD simulations are fully ELM suppressed in the experiment by the applied resonant magnetic perturbation. The results are cross-checked against modeled 3D ideal MHD equilibria using the VMEC code. The procedure of constructing optimal 3D equilibria for diverted H-mode discharges using VMEC is presented. As a result, kink displacements in VMEC are found to scale with the edge current density, similar to M3D-C1, but the displacements are smaller. A direct correlation in the flux surface displacements to the bootstrap current is shown.
NASA Astrophysics Data System (ADS)
LeBlanc, B.; Batha, S.; Bell, R.; Bernabei, S.; Blush, L.; de la Luna, E.; Doerner, R.; Dunlap, J.; England, A.; Garcia, I.; Ignat, D.; Isler, R.; Jones, S.; Kaita, R.; Kaye, S.; Kugel, H.; Levinton, F.; Luckhardt, S.; Mutoh, T.; Okabayashi, M.; Ono, M.; Paoletti, F.; Paul, S.; Petravich, G.; Post-Zwicker, A.; Sauthoff, N.; Schmitz, L.; Sesnic, S.; Takahashi, H.; Talvard, M.; Tighe, W.; Tynan, G.; von Goeler, S.; Woskov, P.; Zolfaghari, A.
1995-03-01
Application of Ion Bernstein Wave Heating (IBWH) into the Princeton Beta Experiment-Modification (PBX-M) [Phys. Fluids B 2, 1271 (1990)] tokamak stabilizes sawtooth oscillations and generates peaked density profiles. A transport barrier, spatially correlated with the IBWH power deposition profile, is observed in the core of IBWH-assisted neutral beam injection (NBI) discharges. A precursor to the fully developed barrier is seen in the soft x-ray data during edge localized mode (ELM) activity. Sustained IBWH operation is conducive to a regime where the barrier supports large ∇ne, ∇Te, ∇vφ, and ∇Ti, delimiting the confinement zone. This regime is reminiscent of the H(high) mode, but with a confinement zone moved inward. The core region has better than H-mode confinement while the peripheral region is L(low)-mode-like. The peaked profile enhances NBI core deposition and increases nuclear reactivity. An increase in central Ti results from χi reduction (compared to the H mode) and better beam penetration. Bootstrap current fractions of up to 0.32-0.35 locally and 0.28 overall were obtained when an additional NBI burst is applied to this plasma.
Tourscape: A systematic approach towards a sustainable rural tourism management
NASA Astrophysics Data System (ADS)
Lo, M. C.; Wang, Y. C.; Songan, P.; Yeo, A. W.
2014-02-01
Tourism plays an important role in the Malaysian economy and is considered one of the cornerstones of the country's economy. The purpose of this research is to analyze the existing tourism industry in rural tourism destinations in Malaysia by examining the impact of economic, environmental, social, and cultural factors of the tourism industry on local communities. 516 respondents, comprising tourism stakeholders from 34 rural tourism sites in Malaysia, took part voluntarily in this study. To assess the developed model, SmartPLS 2.0 (M3) was applied for path modeling, and bootstrapping with 200 resamples was then used to generate the standard error of the estimate and t-values. Subsequently, a system named Tourscape was designed to manage the information. This system can serve as a benchmark for tourism industry stakeholders, as it displays the current situational analysis and the tourism health of selected destination sites by capturing data and information not only from local communities but from industry players and tourists as well. The findings from this study revealed that cooperation among various stakeholders has had a significant impact on the development of rural tourism.
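The bootstrapping step described above is a generic recipe: resample the data with replacement, re-estimate the coefficient, and use the spread of the re-estimates as a standard error from which a t-value follows. A minimal sketch in Python, using synthetic data and a plain OLS slope as a stand-in for a PLS path coefficient (SmartPLS itself is not involved; all names and data are illustrative):

```python
import random
import statistics

def ols_slope(xs, ys):
    # Ordinary least-squares slope of y on x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

def bootstrap_se_t(xs, ys, n_boot=200, seed=0):
    # Resample observation pairs with replacement; the standard deviation
    # of the re-estimated slopes is the bootstrap standard error, and
    # estimate / SE is the bootstrap t-value.
    rng = random.Random(seed)
    est = ols_slope(xs, ys)
    boots = []
    for _ in range(n_boot):
        ids = rng.choices(range(len(xs)), k=len(xs))
        boots.append(ols_slope([xs[i] for i in ids], [ys[i] for i in ids]))
    se = statistics.stdev(boots)
    return est, se, est / se

# Synthetic "indicator" data with a strong positive relationship.
xs = list(range(30))
ys = [2.0 * x + (-1.0) ** x for x in xs]  # slope 2 plus small alternating noise
est, se, t_value = bootstrap_se_t(xs, ys)
print(f"estimate={est:.3f}  SE={se:.3f}  t={t_value:.1f}")
```

With 200 resamples, as in the study, the t-value for a strong path is far above conventional cutoffs; PLS software bootstraps every path coefficient of the structural model in the same way.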
Hurd, Noelle M; Albright, Jamie; Wittrup, Audrey; Negrete, Andrea; Billingsley, Janelle
2018-05-01
The current study explored whether cumulative appraisal support from as many as five natural mentors (i.e., nonparental adults from youth's pre-existing social networks who serve a mentoring role in youth's lives) led to reduced symptoms of depression and anxiety via improved global self-worth among underrepresented college students. Participants in the current study included 340 college students (69% female) attending a 4-year, predominantly White institution of higher education. Participants were first-generation college students, students from economically disadvantaged backgrounds, and/or students from underrepresented racial/ethnic minority groups. Participants completed surveys during the Fall and Spring of their first year of college and in the Spring of their second and third years of college. Results of the structural equation model (including gender, race/ethnicity, and extraversion as covariates) indicated that greater total appraisal support from natural mentoring relationships predicted decreases in students' psychological distress via increases in self-worth (indirect effects assessed via bootstrapped confidence intervals; 95% CI). The strength of association between appraisal support and self-worth was not moderated by the proportion of academic natural mentors. Findings from the current study extend previous research by measuring multiple natural mentoring relationships and pinpointing supportive exchanges that may be of particular consequence for the promotion of healthy youth development. Institutional efforts to reinforce pre-existing natural mentoring relationships and encourage the onset of new natural mentoring relationships may serve to bolster the well-being and success of underrepresented students attending predominantly White universities.
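The indirect-effect test above is commonly done by bootstrapping the product of the two path coefficients (predictor → mediator, mediator → outcome controlling for the predictor) and reading a percentile confidence interval off the resampled products. A hedged sketch with synthetic data and plain OLS in place of the study's structural equation model; the variable names and signs are illustrative, not the study's:

```python
import random

def slope_simple(ys, xs):
    # OLS slope of y on a single predictor x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

def slope_partial(ys, ms, xs):
    # Coefficient of m in the regression y ~ m + x (centered normal equations).
    n = len(ys)
    my_, mm, mx = sum(ys) / n, sum(ms) / n, sum(xs) / n
    yc = [v - my_ for v in ys]
    mc = [v - mm for v in ms]
    xc = [v - mx for v in xs]
    smm = sum(v * v for v in mc)
    sxx = sum(v * v for v in xc)
    smx = sum(a * b for a, b in zip(mc, xc))
    smy = sum(a * b for a, b in zip(mc, yc))
    sxy = sum(a * b for a, b in zip(xc, yc))
    return (smy * sxx - sxy * smx) / (smm * sxx - smx * smx)

def indirect(xs, ms, ys):
    # a path (x -> m) times b path (m -> y, controlling for x).
    return slope_simple(ms, xs) * slope_partial(ys, ms, xs)

rng = random.Random(1)
n = 200
support = [rng.gauss(0, 1) for _ in range(n)]
worth = [0.5 * s + rng.gauss(0, 1) for s in support]        # a path ~ 0.5
outcome = [0.7 * w + 0.2 * s + rng.gauss(0, 1)
           for w, s in zip(worth, support)]                 # b path ~ 0.7

point = indirect(support, worth, outcome)
boots = []
for _ in range(1000):
    ids = rng.choices(range(n), k=n)
    boots.append(indirect([support[i] for i in ids],
                          [worth[i] for i in ids],
                          [outcome[i] for i in ids]))
boots.sort()
lo, hi = boots[24], boots[974]  # 95% percentile interval
print(f"indirect={point:.3f}  95% CI=({lo:.3f}, {hi:.3f})")
```

An interval excluding zero, as here, is the usual evidence for mediation; in the study the mediated path to distress is negative, which only flips the sign of the product.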
Bleiweiss, R; Kirsch, J A; Lapointe, F J
1994-09-01
A matrix of delta T mode values for 10 birds, including 9 nonpasserines and a suboscine passerine flycatcher, was generated by DNA-DNA hybridization. Within the most derived lineages, all bootstrapped and jackknifed FITCH trees lend strong support to sister-groupings of the two swift families, of hummingbirds to swifts, and of these to a clade containing both owls and nighthawks. The outgroup duck roots the tree between the woodpecker (Piciformes) and the remaining taxa, indicating that Piciformes are among the earliest branches within nonpasserines. However, the succeeding branches to kingfisher, mousebird, and suboscine passerine flycatcher are based on short internodes that are poorly supported by bootstrapping and that give inconsistent results in jackknifing. Although these 3 orders may have arisen through rapid or near-simultaneous divergence, placement of the "advanced" Passeriformes deep within a more "primitive" radiation indicates that nonpasserines are paraphyletic, echoing the same distinction for reptiles with respect to their advanced descendants. Despite significant rate variation among different taxa, these results largely concur with those obtained with the same technique by Sibley and Ahlquist, who used the delta T50H measure and UPGMA analysis. This agreement lends credence to some of their more controversial claims.
Phylogenomics provides strong evidence for relationships of butterflies and moths
Kawahara, Akito Y.; Breinholt, Jesse W.
2014-01-01
Butterflies and moths constitute some of the most popular and charismatic insects. Lepidoptera include approximately 160 000 described species, many of which are important model organisms. Previous studies on the evolution of Lepidoptera did not confidently place butterflies, and many relationships among superfamilies in the megadiverse clade Ditrysia remain largely uncertain. We generated a molecular dataset with 46 taxa, combining 33 new transcriptomes with 13 available genomes, transcriptomes and expressed sequence tags (ESTs). Using HaMStR with a Lepidoptera-specific core-orthologue set of single copy loci, we identified 2696 genes for inclusion into the phylogenomic analysis. Nucleotides and amino acids of the all-gene, all-taxon dataset yielded nearly identical, well-supported trees. Monophyly of butterflies (Papilionoidea) was strongly supported, and the group included skippers (Hesperiidae) and the enigmatic butterfly–moths (Hedylidae). Butterflies were placed sister to the remaining obtectomeran Lepidoptera, and the latter was grouped with greater than or equal to 87% bootstrap support. Establishing confident relationships among the four most diverse macroheteroceran superfamilies was previously challenging, but we recovered 100% bootstrap support for the following relationships: ((Geometroidea, Noctuoidea), (Bombycoidea, Lasiocampoidea)). We present the first robust, transcriptome-based tree of Lepidoptera that strongly contradicts historical placement of butterflies, and provide an evolutionary framework for genomic, developmental and ecological studies on this diverse insect order. PMID:24966318
Sieberg, Christine B; Manganella, Juliana; Manalo, Gem; Simons, Laura E; Hresko, M Timothy
2017-12-01
There is a need to better assess patient satisfaction and surgical outcomes. The purpose of the current study is to identify how preoperative expectations can impact postsurgical satisfaction among youth with adolescent idiopathic scoliosis undergoing spinal fusion surgery. The present study includes patients with adolescent idiopathic scoliosis undergoing spinal fusion surgery enrolled in a prospective, multicentered registry examining postsurgical outcomes. The Scoliosis Research Society Questionnaire-Version 30, which assesses pain, self-image, mental health, and satisfaction with management, along with the Spinal Appearance Questionnaire, which measures surgical expectations was administered to 190 patients before surgery and 1 and 2 years postoperatively. Regression analyses with bootstrapping (with n=5000 bootstrap samples) were conducted with 99% bias-corrected confidence intervals to examine the extent to which preoperative expectations for spinal appearance mediated the relationship between presurgical mental health and pain and 2-year postsurgical satisfaction. Results indicate that preoperative mental health, pain, and expectations are predictive of postsurgical satisfaction. With the shifting health care system, physicians may want to consider patient mental health, pain, and expectations before surgery to optimize satisfaction and ultimately improve clinical care and patient outcomes. Level I-prognostic study.
Ishwaran, Hemant; Lu, Min
2018-06-04
Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
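The subsampling idea above can be illustrated on a simpler statistic than variable importance: compute the statistic on many small subsamples drawn without replacement, then rescale the spread by b/n to approximate its variance at the full sample size. A sketch using the sample mean as a stand-in (the random-forest VIMP case works the same way, just with a costlier statistic; nothing here uses the paper's estimators directly):

```python
import random
import statistics

def subsampling_variance(data, b, statistic, n_sub=500, seed=1):
    # Draw size-b subsamples without replacement, apply the statistic,
    # and rescale its variance by b/n. The approximation is best when
    # b is small relative to n, which is also the regime where the
    # paper's delete-d jackknife (a bias-corrected cousin) shines.
    rng = random.Random(seed)
    n = len(data)
    values = [statistic(rng.sample(data, b)) for _ in range(n_sub)]
    return (b / n) * statistics.variance(values)

rng = random.Random(2)
data = [rng.gauss(0, 1) for _ in range(400)]

est = subsampling_variance(data, b=20, statistic=statistics.mean)
analytic = statistics.variance(data) / len(data)  # known answer for the mean
print(f"subsampling={est:.5f}  analytic={analytic:.5f}")
```

The subsampling estimate tracks the analytic variance of the mean; for VIMP no analytic formula exists, which is exactly why the paper proposes this route.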
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-01-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476
Peace of Mind, Academic Motivation, and Academic Achievement in Filipino High School Students.
Datu, Jesus Alfonso D
2017-04-09
Recent literature has recognized the advantageous role of low-arousal positive affect such as feelings of peacefulness and internal harmony in collectivist cultures. However, limited research has explored the benefits of low-arousal affective states in the educational setting. The current study examined the link of peace of mind (PoM) to academic motivation (i.e., amotivation, controlled motivation, and autonomous motivation) and academic achievement among 525 Filipino high school students. Findings revealed that PoM was positively associated with academic achievement (β = .16, p < .05), autonomous motivation (β = .48, p < .001), and controlled motivation (β = .25, p < .01). As expected, PoM was negatively related to amotivation (β = -.19, p < .05), and autonomous motivation was positively associated with academic achievement (β = .52, p < .01). Furthermore, the results of bias-corrected bootstrap analyses at the 95% confidence level based on 5,000 bootstrap resamples demonstrated that peace of mind had an indirect influence on academic achievement through the mediating effects of autonomous motivation. In terms of the effect sizes, the findings showed that PoM explained about 1% to 18% of the variance in academic achievement and motivation. The theoretical and practical implications of the results are elucidated.
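The bias-corrected bootstrap used above refines the plain percentile interval: a correction factor z0, derived from how often the bootstrap replicates fall below the point estimate, shifts which percentiles are read off. A sketch for the mean of a skewed sample (the standard BC interval, without the acceleration term of BCa; data and resample count are illustrative, though 5,000 matches the study):

```python
import random
import statistics
from statistics import NormalDist

def bc_interval(boots, theta_hat, alpha=0.05):
    # Bias-corrected (BC) percentile interval. z0 captures the median
    # bias of the bootstrap distribution; the interval endpoints are
    # percentile positions shifted by 2 * z0 on the normal scale.
    nd = NormalDist()
    z0 = nd.inv_cdf(sum(b < theta_hat for b in boots) / len(boots))
    z_alpha = nd.inv_cdf(alpha / 2)
    lo_p = nd.cdf(2 * z0 + z_alpha)
    hi_p = nd.cdf(2 * z0 - z_alpha)
    s = sorted(boots)
    return s[int(lo_p * (len(s) - 1))], s[int(hi_p * (len(s) - 1))]

rng = random.Random(4)
data = [rng.expovariate(1.0) for _ in range(100)]  # right-skewed sample
theta_hat = statistics.mean(data)

boots = []
for _ in range(5000):
    boots.append(statistics.mean(rng.choices(data, k=len(data))))

lo, hi = bc_interval(boots, theta_hat)
print(f"mean={theta_hat:.3f}  BC 95% CI=({lo:.3f}, {hi:.3f})")
```

For symmetric bootstrap distributions z0 is near zero and BC reduces to the ordinary percentile interval; the correction matters most for skewed statistics such as indirect effects.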
NASA Technical Reports Server (NTRS)
Patterson, Richard; Hammoud, Ahmad
2009-01-01
Electronic systems designed for use in deep space and planetary exploration missions are expected to encounter extreme temperatures and wide thermal swings. Silicon-based devices are limited in their wide-temperature capability and usually require extra measures, such as cooling or heating mechanisms, to provide adequate ambient temperature for proper operation. Silicon-On-Insulator (SOI) technology, on the other hand, lately has been gaining wide spread use in applications where high temperatures are encountered. Due to their inherent design, SOI-based integrated circuit chips are able to operate at temperatures higher than those of the silicon devices by virtue of reducing leakage currents, eliminating parasitic junctions, and limiting internal heating. In addition, SOI devices provide faster switching, consume less power, and offer improved radiation-tolerance. Very little data, however, exist on the performance of such devices and circuits under cryogenic temperatures. In this work, the performance of an SOI bootstrapped, full-bridge driver integrated circuit was evaluated under extreme temperatures and thermal cycling. The investigations were carried out to establish a baseline on the functionality and to determine suitability of this device for use in space exploration missions under extreme temperature conditions.
Pulling Econometrics Students up by Their Bootstraps
ERIC Educational Resources Information Center
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Accuracy assessment of percent canopy cover, cover type, and size class
H. T. Schreuder; S. Bain; R. C. Czaplewski
2003-01-01
Truth for vegetation cover percent and type is obtained from very large-scale photography (VLSP), stand structure as measured by size classes, and vegetation types from a combination of VLSP and ground sampling. We recommend using the Kappa statistic with bootstrap confidence intervals for overall accuracy, and similarly bootstrap confidence intervals for percent...
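The recommendation above (the Kappa statistic with bootstrap confidence intervals for overall accuracy) can be sketched directly: compute Cohen's kappa between reference and mapped labels, then bootstrap over the sampled sites. The labels below are synthetic stand-ins, not VLSP measurements:

```python
import random

def cohens_kappa(ref, obs):
    # Observed agreement corrected for the agreement expected by chance.
    n = len(ref)
    po = sum(r == o for r, o in zip(ref, obs)) / n
    cats = set(ref) | set(obs)
    pe = sum((ref.count(c) / n) * (obs.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Two label sequences for 100 sites with 90% raw agreement.
ref = [i % 2 for i in range(100)]
obs = [1 - r if i % 10 == 0 else r for i, r in enumerate(ref)]

kappa = cohens_kappa(ref, obs)

rng = random.Random(5)
boots = []
for _ in range(1000):
    ids = rng.choices(range(100), k=100)
    boots.append(cohens_kappa([ref[i] for i in ids], [obs[i] for i in ids]))
boots.sort()
lo, hi = boots[24], boots[974]  # 95% percentile interval
print(f"kappa={kappa:.2f}  95% CI=({lo:.2f}, {hi:.2f})")
```

Resampling sites rather than assuming a parametric error model is what makes the interval robust to the clustered, map-based sampling the paper deals with.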
ERIC Educational Resources Information Center
Barner, David; Chow, Katherine; Yang, Shu-Ju
2009-01-01
We explored children's early interpretation of numerals and linguistic number marking, in order to test the hypothesis (e.g., Carey (2004). Bootstrapping and the origin of concepts. "Daedalus", 59-68) that children's initial distinction between "one" and other numerals (i.e., "two," "three," etc.) is bootstrapped from a prior distinction between…
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
ERIC Educational Resources Information Center
Hand, Michael L.
1990-01-01
Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distribution (Log-Pearson III and Weibull) had lower mean square errors than did the G. E. P. Box-D. R. Cox transformation method or the Log-W. C. Boughton method which is based on a fit of plotting positions.
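The comparison design above can be sketched generically: treat the observed record as the population, draw bootstrap samples from it, apply each competing estimator, and score the squared error against the full-record value. Here two estimators of a low-flow-like 10th percentile are compared on synthetic, roughly lognormal "flows"; these are stand-ins for the paper's four 7Q10/7Q20 methods, not reimplementations of them:

```python
import math
import random
from statistics import NormalDist, mean, stdev

def q10_empirical(flows):
    # Nonparametric: 10th percentile by sorted position.
    s = sorted(flows)
    return s[int(0.1 * (len(s) - 1))]

def q10_lognormal(flows):
    # Parametric: fit a lognormal by moments of the log-flows, read off q10.
    logs = [math.log(f) for f in flows]
    return math.exp(mean(logs) + NormalDist().inv_cdf(0.1) * stdev(logs))

rng = random.Random(6)
record = [math.exp(rng.gauss(0.0, 0.5)) for _ in range(80)]  # "observed" flows
truth = q10_empirical(record)  # the full record plays the role of the population

mse = {"empirical": 0.0, "lognormal": 0.0}
n_boot = 500
for _ in range(n_boot):
    sample = rng.choices(record, k=len(record))
    mse["empirical"] += (q10_empirical(sample) - truth) ** 2 / n_boot
    mse["lognormal"] += (q10_lognormal(sample) - truth) ** 2 / n_boot
print(mse)
```

Because the resampling distribution is built from the data rather than an assumed family, the comparison is fair to both parametric and nonparametric methods, which is the nonparametric advantage the abstract highlights.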
NASA Astrophysics Data System (ADS)
Chatziantonaki, Ioanna; Tsironis, Christos; Isliker, Heinz; Vlahos, Loukas
2013-11-01
The most promising technique for the control of neoclassical tearing modes in tokamak experiments is the compensation of the missing bootstrap current with an electron-cyclotron current drive (ECCD). In this frame, the dynamics of magnetic islands has been studied extensively in terms of the modified Rutherford equation (MRE), including the presence of a current drive, either analytically described or computed by numerical methods. In this article, a self-consistent model for the dynamic evolution of the magnetic island and the driven current is derived, which takes into account the island's magnetic topology and its effect on the current drive. The model combines the MRE with a ray-tracing approach to electron-cyclotron wave propagation and absorption. Numerical results exhibit a decrease in the time required for complete stabilization relative to the conventional computation (which does not take the island geometry into account); this decrease grows with increasing initial island size and radial misalignment of the deposition.
Sturgeon, Jared D; Cox, John A; Mayo, Lauren L; Gunn, G Brandon; Zhang, Lifei; Balter, Peter A; Dong, Lei; Awan, Musaddiq; Kocak-Uzel, Esengul; Mohamed, Abdallah Sherif Radwan; Rosenthal, David I; Fuller, Clifton David
2015-10-01
Digitally reconstructed radiographs (DRRs) are routinely used as an a priori reference for setup correction in radiotherapy. The spatial resolution of DRRs may be improved to reduce setup error in fractionated radiotherapy treatment protocols. The influence of finer CT slice thickness reconstruction (STR) and resultant increased resolution DRRs on physician setup accuracy was prospectively evaluated. Four head and neck patient CT-simulation images were acquired and used to create DRR cohorts by varying STRs at 0.5, 1, 2, 2.5, and 3 mm. DRRs were displaced relative to a fixed isocenter using 0-5 mm random shifts in the three cardinal axes. Physician observers reviewed DRRs of varying STRs and displacements and then aligned reference and test DRRs replicating daily KV imaging workflow. A total of 1,064 images were reviewed by four blinded physicians. Observer errors were analyzed using nonparametric statistics (Friedman's test) to determine whether STR cohorts had detectably different displacement profiles. Post hoc bootstrap resampling was applied to evaluate potential generalizability. The observer-based trial revealed a statistically significant difference between cohort means for observer displacement vector error ([Formula: see text]) and for [Formula: see text]-axis [Formula: see text]. Bootstrap analysis suggests a 15% gain in isocenter translational setup error with reduction of STR from 3 mm to ≤2 mm, though interobserver variance was a larger feature than STR-associated measurement variance. Higher resolution DRRs generated using finer CT scan STR resulted in improved observer performance at shift detection and could decrease operator-dependent geometric error. Ideally, CT STRs ≤2 mm should be utilized for DRR generation in the head and neck.
A Bootstrap Approach to Martian Manufacturing
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.
2004-01-01
In-Situ Resource Utilization (ISRU) is an essential element of any affordable strategy for a sustained human presence on Mars. Ideally, Martian habitats would be extremely massive to allow plenty of room to comfortably live and work, as well as to protect the occupants from the environment. Moreover, transportation and power generation systems would also require significant mass if affordable. For our approach to ISRU, we use the industrialization of the U.S. as a metaphor. The 19th century started with small blacksmith shops and ended with massive steel mills primarily accomplished by blacksmiths increasing their production capacity and product size to create larger shops, which produced small mills, which produced the large steel mills that industrialized the country. Most of the mass of a steel mill is comprised of steel in simple shapes, which are produced and repaired with few pieces of equipment also mostly made of steel in basic shapes. Due to this simplicity, we expect that the 19th century manufacturing growth can be repeated on Mars in the 21st century using robots as the primary labor force. We suggest a "bootstrap" approach to manufacturing on Mars that uses a "seed" manufacturing system that uses regolith to create major structural components and spare parts. The regolith would be melted, foamed, and sintered as needed to fabricate parts using casting and solid freeform fabrication techniques. Complex components, such as electronics, would be brought from Earth and integrated as needed. These parts would be assembled to create additional manufacturing systems, which can be both more capable and higher capacity. These subsequent manufacturing systems could refine vast amounts of raw materials to create large components, as well as assemble equipment, habitats, pressure vessels, cranes, pipelines, railways, trains, power generation stations, and other facilities needed to economically maintain a sustained human presence on Mars.
Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.
Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt
2016-05-06
We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location, and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Empirical single sample quantification of bias and variance in Q-ball imaging.
Hainline, Allison E; Nath, Vishwesh; Parvathaneni, Prasanna; Blaber, Justin A; Schilling, Kurt G; Anderson, Adam W; Kang, Hakmook; Landman, Bennett A
2018-02-06
The bias and variance of high angular resolution diffusion imaging methods have not been thoroughly explored in the literature; the simulation extrapolation (SIMEX) and bootstrap techniques can be used to estimate the bias and variance of high angular resolution diffusion imaging metrics. The SIMEX approach is well established in the statistics literature and uses simulation of increasingly noisy data to extrapolate back to a hypothetical case with no noise. The bias of calculated metrics can then be computed by subtracting the SIMEX estimate from the original pointwise measurement. The SIMEX technique has been studied in the context of diffusion imaging to accurately capture the bias in fractional anisotropy measurements in DTI. Herein, we extend the application of SIMEX and bootstrap approaches to characterize bias and variance in metrics obtained from a Q-ball imaging reconstruction of high angular resolution diffusion imaging data. The results demonstrate that SIMEX and bootstrap approaches provide consistent estimates of the bias and variance of generalized fractional anisotropy, respectively. The RMSE for the generalized fractional anisotropy estimates shows a 7% decrease in white matter and an 8% decrease in gray matter when compared with the observed generalized fractional anisotropy estimates. On average, the bootstrap technique results in SD estimates that are approximately 97% of the true variation in white matter, and 86% in gray matter. Both SIMEX and bootstrap methods are flexible, estimate population characteristics based on single scans, and may be extended for bias and variance estimation on a variety of high angular resolution diffusion imaging metrics. © 2018 International Society for Magnetic Resonance in Medicine.
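The SIMEX idea described above is easy to demonstrate on the classic attenuation problem: a slope estimated from a noisily measured covariate is biased toward zero; adding further synthetic noise at levels λ and extrapolating the trend back to λ = -1 recovers much of the bias. A sketch with a linear extrapolant (quadratic extrapolants are more common in practice; all data here are synthetic and unrelated to diffusion imaging):

```python
import random
import statistics

def ols_slope(xs, ys):
    # Ordinary least-squares slope of y on x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

rng = random.Random(7)
n, beta, sigma_u = 2000, 1.0, 0.7
x = [rng.gauss(0, 1) for _ in range(n)]                  # true covariate
y = [beta * xi + rng.gauss(0, 0.3) for xi in x]
w = [xi + rng.gauss(0, sigma_u) for xi in x]             # error-prone version

naive = ols_slope(w, y)  # attenuated toward zero by measurement error

# SIMEX: inflate the measurement-error variance by factors (1 + lam),
# record the slope at each level, then extrapolate back to lam = -1,
# the hypothetical error-free case.
lams = [0.0, 0.5, 1.0, 1.5, 2.0]
thetas = []
for lam in lams:
    reps = []
    for _ in range(20):  # average replicates to tame simulation noise
        w_lam = [wi + rng.gauss(0, sigma_u * lam ** 0.5) for wi in w]
        reps.append(ols_slope(w_lam, y))
    thetas.append(statistics.mean(reps))

ml = statistics.mean(lams)
mt = statistics.mean(thetas)
trend = (sum((l - ml) * (t - mt) for l, t in zip(lams, thetas))
         / sum((l - ml) ** 2 for l in lams))
simex = mt + trend * (-1.0 - ml)
print(f"naive={naive:.3f}  SIMEX={simex:.3f}  true={beta}")
```

Subtracting the SIMEX estimate from the naive one, as the abstract describes for Q-ball metrics, then gives a pointwise bias estimate from a single dataset.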
Bootstrapping rapidity anomalous dimensions for transverse-momentum resummation
Li, Ye; Zhu, Hua Xing
2017-01-11
The soft function relevant for transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. Furthermore, an intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
H. T. Schreuder; M. S. Williams
2000-01-01
In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
ERIC Educational Resources Information Center
Ural, A. Engin; Yuret, Deniz; Ketrez, F. Nihan; Kocbas, Dilara; Kuntay, Aylin C.
2009-01-01
The syntactic bootstrapping mechanism of verb learning was evaluated against child-directed speech in Turkish, a language with rich morphology, nominal ellipsis and free word order. Machine-learning algorithms were run on transcribed caregiver speech directed to two Turkish learners (one hour every two weeks between 0;9 to 1;10) of different…
ERIC Educational Resources Information Center
Seco, Guillermo Vallejo; Izquierdo, Marcelino Cuesta; Garcia, M. Paula Fernandez; Diez, F. Javier Herrero
2006-01-01
The authors compare the operating characteristics of the bootstrap-F approach, a direct extension of the work of Berkovits, Hancock, and Nevitt, with Huynh's improved general approximation (IGA) and the Brown-Forsythe (BF) multivariate approach in a mixed repeated measures design when normality and multisample sphericity assumptions do not hold.…
Sample-based estimation of tree species richness in a wet tropical forest compartment
Steen Magnussen; Raphael Pelissier
2007-01-01
Petersen's capture-recapture ratio estimator and the well-known bootstrap estimator are compared across a range of simulated low-intensity simple random sampling with fixed-area plots of 100 m² in a rich wet tropical forest compartment with 93 tree species in the Western Ghats of India. Petersen's ratio estimator was uniformly superior to the bootstrap...
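Both estimators being compared have closed forms that are easy to sketch. With incidence data (which plots contain which species), the bootstrap richness estimator adds to the observed count an estimate of the probability of missing each observed species in a resample, while Petersen's estimator treats two halves of the sample as independent "captures". The Python sketch below uses synthetic incidence data; the occurrence model and all parameter values are illustrative, not the study's forest data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical incidence matrix: n plots x S species, True = present in plot.
n_plots, S_true = 20, 93
# Rare-species-heavy occurrence probabilities (many species seen in few plots).
occ_prob = rng.beta(0.3, 2.0, size=S_true)
incidence = rng.random((n_plots, S_true)) < occ_prob

observed = incidence.any(axis=0)
S_obs = int(observed.sum())

# Bootstrap richness estimator: add, for each observed species, the
# probability that a bootstrap resample of plots would miss it entirely.
p_k = incidence[:, observed].mean(axis=0)   # fraction of plots holding species k
S_boot = S_obs + np.sum((1.0 - p_k) ** n_plots)

# Petersen capture-recapture: two halves of the plots act as two "captures".
half1 = incidence[:n_plots // 2].any(axis=0)
half2 = incidence[n_plots // 2:].any(axis=0)
n1, n2, m = half1.sum(), half2.sum(), (half1 & half2).sum()
S_petersen = n1 * n2 / m                    # m = species seen in both halves
```

Note that `S_petersen` is always at least the observed union count (since (n1−m)(n2−m) ≥ 0), and `S_boot` is always at least `S_obs`; both correct upward for unseen species, but by different mechanisms.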
Common Ground between Form and Content: The Pragmatic Solution to the Bootstrapping Problem
ERIC Educational Resources Information Center
Oller, John W.
2005-01-01
The frame of reference for this article is second or foreign language (L2 or FL) acquisition, but the pragmatic bootstrapping hypothesis applies to language processing and acquisition in any context or modality. It is relevant to teaching children to read. It shows how connections between target language surface forms and their content can be made…
2006-06-13
with arithmetic mean (UPGMA) using random tie breaking and uncorrected pairwise distances in MacVector 7.0 (Oxford Molecular). Numbers on branches...denote the UPGMA bootstrap percentage using a highly stringent number (1000) of replications (Felsenstein, 1985). All bootstrap values are ≥50%, as shown
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA)...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion of the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentiles method, we found wide variability in the prevalence of the different phenotypes (3–100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35–3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results in which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
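The contrast drawn above, categorizing model probabilities versus bootstrap-style imputation from them, can be illustrated on synthetic data. In this sketch the "true" status is generated from the same probabilities the imputation uses, which is the ideal case; the distribution, threshold, and prevalence are illustrative inventions, not the paper's validated model.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 50_000
# Hypothetical model-derived probabilities of severe renal failure per patient,
# and the (normally unobserved) true status drawn from them.
p = rng.beta(0.5, 9.5, size=n)             # ~5% mean prevalence
true_status = rng.random(n) < p

# Surrogate 1: categorize the probability (threshold at 0.5). This is badly
# biased low because most true cases carry only moderate probabilities.
prev_threshold = float(np.mean(p >= 0.5))

# Surrogate 2: bootstrap-style imputation -- repeatedly draw each patient's
# status as Bernoulli(p_i) and average the resulting prevalence estimates.
B = 200
draws = [np.mean(rng.random(n) < p) for _ in range(B)]
prev_imputed = float(np.mean(draws))

prev_true = float(np.mean(true_status))
```

The imputed prevalence tracks the truth because the expectation of each Bernoulli draw is exactly p_i, whereas thresholding discards every case whose probability falls below the cutoff.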
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
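The resampling scheme at the heart of the tree bootstrap, resample seeds with replacement and then resample each node's recruits with replacement while recursing down, can be sketched directly on a toy recruitment forest. The structure below is a hypothetical stand-in; real RDS trees come from a study's recruitment records.

```python
import random

random.seed(3)

# Toy recruitment forest: each seed maps to its recruits, recursively.
forest = {
    "s1": {"a": {"b": {}, "c": {}}, "d": {}},
    "s2": {"e": {"f": {}}},
    "s3": {},
}

def resample_tree(children):
    """Resample a node's recruits with replacement, recursing down the tree."""
    kids = list(children)
    if not kids:
        return {}
    picked = random.choices(kids, k=len(kids))
    # Index suffixes keep keys unique when the same recruit is drawn twice.
    return {f"{k}#{i}": resample_tree(children[k]) for i, k in enumerate(picked)}

def tree_bootstrap(forest):
    """One replicate: resample seeds, then recruits at every level below."""
    seeds = random.choices(list(forest), k=len(forest))
    return {f"{s}#{i}": resample_tree(forest[s]) for i, s in enumerate(seeds)}

def size(tree):
    """Count all nodes in a (sub)forest."""
    return sum(1 + size(sub) for sub in tree.values())

# The spread of replicate sizes reflects variability in the tree structure
# itself, which is what drives the uncertainty estimate in RDS.
replicate_sizes = [size(tree_bootstrap(forest)) for _ in range(1000)]
```

Because only the recruitment structure is resampled, any respondent attribute can be re-aggregated over each replicate, which is why the method also yields correlations between attributes.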
A bootstrap lunar base: Preliminary design review 2
NASA Technical Reports Server (NTRS)
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Spheres, charges, instantons, and bootstrap: A five-dimensional odyssey
NASA Astrophysics Data System (ADS)
Chang, Chi-Ming; Fluder, Martin; Lin, Ying-Hsuan; Wang, Yifan
2018-03-01
We combine supersymmetric localization and the conformal bootstrap to study five-dimensional superconformal field theories. To begin, we classify the admissible counter-terms and derive a general relation between the five-sphere partition function and the conformal and flavor central charges. Along the way, we discover a new superconformal anomaly in five dimensions. We then propose a precise triple factorization formula for the five-sphere partition function, that incorporates instantons and is consistent with flavor symmetry enhancement. We numerically evaluate the central charges for the rank-one Seiberg and Morrison-Seiberg theories, and find strong evidence for their saturation of bootstrap bounds, thereby determining the spectra of long multiplets in these theories. Lastly, our results provide new evidence for the F-theorem and possibly a C-theorem in five-dimensional superconformal theories.
Improved Rainfall Estimates and Predictions for 21st Century Drought Early Warning
NASA Technical Reports Server (NTRS)
Funk, Chris; Peterson, Pete; Shukla, Shraddhanand; Husak, Gregory; Landsfeld, Marty; Hoell, Andrew; Pedreros, Diego; Roberts, J. B.; Robertson, F. R.; Tadesse, Tsegae;
2015-01-01
As temperatures increase, the onset and severity of droughts are likely to become more intense. Improved tools for understanding, monitoring and predicting droughts will be a key component of 21st century climate adaptation. The best drought monitoring systems will bring together accurate precipitation estimates with skillful climate and weather forecasts. Such systems combine the predictive power inherent in the current land surface state with the predictive power inherent in low frequency ocean-atmosphere dynamics. To this end, researchers at the Climate Hazards Group (CHG), in collaboration with partners at the USGS and NASA, have developed i) a long (1981-present) quasi-global (50°S-50°N, 180°W-180°E) high resolution (0.05°) homogeneous precipitation data set designed specifically for drought monitoring, ii) tools for understanding and predicting East African boreal spring droughts, and iii) an integrated land surface modeling (LSM) system that combines rainfall observations and predictions to provide effective drought early warning. This talk briefly describes these three components. Component 1: CHIRPS. The Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) data set blends station data with geostationary satellite observations to provide global near-real-time daily, pentadal and monthly precipitation estimates. We describe the CHIRPS algorithm and compare CHIRPS and other estimates to validation data. The CHIRPS is shown to have high correlation, low systematic errors (bias) and low mean absolute errors. Component 2: Hybrid statistical-dynamic forecast strategies. East African droughts have increased in frequency, but become more predictable as Indo-Pacific SST gradients and Walker circulation disruptions intensify. We describe hybrid statistical-dynamic forecast strategies that are far superior to the raw output of coupled forecast models.
These forecasts can be translated into probabilities that can be used to generate bootstrapped ensembles describing future climate conditions. Component 3: Assimilation using LSMs. CHIRPS rainfall observations (component 1) and bootstrapped forecast ensembles (component 2) can be combined using LSMs to predict soil moisture deficits. We evaluate the skill of such a system in East Africa and demonstrate results for 2013.
ERIC Educational Resources Information Center
Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio
2009-01-01
A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
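A nonparametric bootstrap interval for Pearson's r resamples whole respondents (keeping each x, y pair intact) and recomputes the correlation on each replicate; the null hypothesis of no association is rejected when the interval excludes zero. A Python sketch with synthetic scores standing in for expectancies and intentions (the effect size, n, and variable names are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 300
# Hypothetical expectancy/intention scores with a modest negative association.
expectancies = rng.normal(0, 1, n)
intentions = -0.3 * expectancies + rng.normal(0, 1, n)

r_obs = np.corrcoef(expectancies, intentions)[0, 1]

B = 2000
idx = rng.integers(0, n, size=(B, n))      # resample respondents, keeping pairs
boot_r = np.array([np.corrcoef(expectancies[i], intentions[i])[0, 1]
                   for i in idx])

# Percentile 95% interval; H0 (rho = 0) is rejected if 0 lies outside it.
lo, hi = np.percentile(boot_r, [2.5, 97.5])
reject_null = not (lo <= 0.0 <= hi)
```

Resampling rows rather than the two variables independently is what preserves the dependence being tested; shuffling one variable instead would give a permutation null, a different procedure.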
Bootstrapping a five-loop amplitude using Steinmann relations
Caron-Huot, Simon; Dixon, Lance J.; McLeod, Andrew; ...
2016-12-05
Here, the analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons
2001-07-01
parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. ...experimental methods used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric ...triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
ERIC Educational Resources Information Center
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
ERIC Educational Resources Information Center
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
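The parametric bootstrap used above for standard errors resamples from a fitted model rather than from the data themselves. A minimal Python sketch, using the 75th percentile of normal scores as a stand-in for a statistic at one score point (the normal model and all numbers are illustrative, not the study's equating functions):

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical observed scores.
scores = rng.normal(50, 10, size=200)

def statistic(x):
    return np.percentile(x, 75)

# Parametric bootstrap: fit a parametric model (here a normal distribution),
# then repeatedly resample from the *fitted* model and recompute the statistic.
mu_hat, sd_hat = scores.mean(), scores.std(ddof=1)
B = 2000
boot_stats = np.array([statistic(rng.normal(mu_hat, sd_hat, size=scores.size))
                       for _ in range(B)])
se_parametric = boot_stats.std(ddof=1)

# For reference, the large-sample SE of the sample 75th percentile of
# N(mu, sigma) is sqrt(p(1-p)/n) / pdf(q), which is about 0.96 here.
```

The nonparametric version would replace `rng.normal(mu_hat, sd_hat, ...)` with `rng.choice(scores, size=scores.size, replace=True)`; the parametric form smooths the resampling through the fitted model, which is the trade-off studied in equating contexts.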
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise-induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph-based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, in the present work the bootstrap method is incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph-based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. 
By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
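The wild bootstrap used above to build the cone of uncertainty perturbs regression residuals in place, which preserves heteroscedastic noise structure, instead of resampling cases. A generic Python sketch on a toy linear model with non-constant noise (not the diffusion signal equation itself; the design and coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical linear model y = X b + e with heteroscedastic noise.
n = 120
X = np.column_stack([np.ones(n), rng.normal(0, 1, n)])
beta = np.array([1.0, 0.5])
y = X @ beta + rng.normal(0, 0.2 + 0.3 * np.abs(X[:, 1]), n)

b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ b_hat
resid = y - fitted

# Wild bootstrap: keep the design fixed, flip each residual's sign at random
# (Rademacher weights), refit, and read the spread of the refitted slopes.
B = 2000
slopes = np.empty(B)
for b in range(B):
    signs = rng.choice([-1.0, 1.0], size=n)
    y_star = fitted + signs * resid
    bb, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    slopes[b] = bb[1]
se_wild = slopes.std(ddof=1)
```

Because each residual stays attached to its own design point, per-observation noise levels survive the resampling, which is exactly the property needed for diffusion data with signal-dependent noise.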
Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.
An integrated-modeling workflow has been developed in this paper for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. 
On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. In conclusion, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.
Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi
2015-07-01
Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE), which is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.
Comulada, W. Scott
2015-01-01
Stata’s mi commands provide powerful tools for conducting multiple imputation in the presence of ignorable missing data. In this article, I present Stata code to extend the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, the mi commands are restricted to covariate selection; I show how to assess model fit in order to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors; I show how standard errors can be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts for the number of times that alcohol was consumed in data with missing observations from a behavioral intervention. PMID:26973439
Heptagons from the Steinmann cluster bootstrap
Dixon, Lance J.; Drummond, James; Harrington, Thomas; ...
2017-02-28
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar $$\mathcal{N} = 4$$ supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal $$\bar{Q}$$ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
Imaging with New Classic and Vision at the NPOI
NASA Astrophysics Data System (ADS)
Jorgensen, Anders
2018-04-01
The Navy Precision Optical Interferometer (NPOI) is unique among interferometric observatories for its ability to position telescopes in an equally-spaced array configuration. This configuration is optimal for interferometric imaging because it allows the use of bootstrapping to track fringes on long baselines with a signal-to-noise ratio of less than one. When combined with coherent integration techniques, this can produce visibilities with acceptable SNR on baselines long enough to resolve features on the surfaces of stars. The stellar surface imaging project at NPOI combines the bootstrapping array configuration of the NPOI array, real-time fringe tracking, and baseline and wavelength bootstrapping with Earth rotation to provide dense coverage of the UV plane at a wide range of spatial frequencies. In this presentation, we provide an overview of the project and an update on the latest status and results from the project.
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Sample Reuse in Statistical Remodeling.
1987-08-01
as the jackknife and bootstrap, is an expansion of the functional, T(Fn), or of its distribution function or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report Frangos and Schucany demonstrated the small sample superiority of that approach over the proposals that take...higher order terms of an Edgeworth expansion into account. In a second report Frangos and Schucany (1987b) examined the small sample performance of
ERIC Educational Resources Information Center
Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao
2016-01-01
We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…
Guerrero, Erick G; Fenwick, Karissa; Kong, Yinfei
2017-11-14
Leadership style and specific organizational climates have emerged as critical mechanisms to implement targeted practices in organizations. Drawing from relevant theories, we propose that climate for implementation of cultural competence reflects how transformational leadership may enhance the organizational implementation of culturally responsive practices in health care organizations. Using multilevel data from 427 employees embedded in 112 addiction treatment programs collected in 2013, confirmatory factor analysis showed adequate fit statistics for our measure of climate for implementation of cultural competence (Cronbach's alpha = .88) and three outcomes: knowledge (Cronbach's alpha = .88), services (Cronbach's alpha = .86), and personnel (Cronbach's alpha = .86) practices. Results from multilevel path analyses indicate a positive relationship between employee perceptions of transformational leadership and climate for implementation of cultural competence (standardized indirect effect = .057, bootstrap p < .001). We also found a positive indirect effect between transformational leadership and each of the culturally competent practices: knowledge (standardized indirect effect = .006, bootstrap p = .004), services (standardized indirect effect = .019, bootstrap p < .001), and personnel (standardized indirect effect = .014, bootstrap p = .005). Findings contribute to implementation science. They build on leadership theory and offer evidence of the mediating role of climate in the implementation of cultural competence in addiction health service organizations.
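The indirect effects reported above come from multilevel path models; a single-level analogue is enough to show the bootstrap mechanics: estimate a (X→M) and b (M→Y, adjusting for X), resample cases with replacement, and read a percentile interval for the product ab. Everything below (variables, effect sizes, n) is an illustrative stand-in, not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(21)

n = 400
x = rng.normal(0, 1, n)                      # stand-in: transformational leadership
m = 0.4 * x + rng.normal(0, 1, n)            # stand-in: implementation climate
y = 0.3 * m + 0.1 * x + rng.normal(0, 1, n)  # stand-in: culturally responsive practices

def indirect_effect(x, m, y):
    """a*b from two OLS fits: M ~ X, then Y ~ M + X."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

ab_obs = indirect_effect(x, m, y)

B = 2000
boot = np.empty(B)
for r in range(B):
    i = rng.integers(0, n, size=n)           # resample cases, keeping rows intact
    boot[r] = indirect_effect(x[i], m[i], y[i])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Bootstrapping the product directly sidesteps the non-normal sampling distribution of ab, which is why bootstrap p-values and intervals are standard for mediation; the multilevel version additionally resamples (or models) clusters.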
Nixon, Richard M; Wonderling, David; Grieve, Richard D
2010-03-01
Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
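The two non-parametric estimators compared above are straightforward to sketch. Below is a minimal Python illustration (not the authors' trial code) of CLT versus bootstrap standard errors for the mean of a skewed cost variable:

```python
import math
import random
import statistics

def clt_se(costs):
    # Central limit theorem: SE of the mean = sample SD / sqrt(n)
    return statistics.stdev(costs) / math.sqrt(len(costs))

def bootstrap_se(costs, reps=2000, seed=1):
    # Non-parametric bootstrap: SD of the mean across resampled data sets
    rng = random.Random(seed)
    means = [statistics.fmean(rng.choices(costs, k=len(costs)))
             for _ in range(reps)]
    return statistics.stdev(means)
```

For moderate sample sizes the two estimates agree closely even under heavy skew, which mirrors the paper's finding.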
Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan
2016-01-01
Background The sensory belief that ‘light/low tar’ cigarettes are smoother can also influence the belief that ‘light/low tar’ cigarettes are less harmful. However, the ‘light’ concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one’s own brand of cigarettes on perceptions of harm. Objective The current study examines whether a smoker’s sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a ‘light/low tar’ cigarette and relative perceptions of harm among smokers in China. Methods Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Results Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of ‘light/low tar’ cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. Conclusions These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker’s natural associations between smoother sensations and lowered perceptions of harm. PMID:25370698
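The bootstrapping procedure used for the mediational analysis can be sketched as below. This is a simplified percentile-bootstrap version (the study reports bias-corrected intervals) run on simulated data; the regression setup and variable names are illustrative assumptions, not the ITC survey models.

```python
import numpy as np

def indirect_effect(x, m, y):
    # a-path: slope of mediator m on exposure x (simple regression)
    a = np.polyfit(x, m, 1)[0]
    # b-path: slope of outcome y on m, adjusting for x
    design = np.column_stack([np.ones_like(m), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

def percentile_ci(x, m, y, reps=2000, alpha=0.05, seed=0):
    # Naive percentile bootstrap CI for the indirect effect a*b
    rng = np.random.default_rng(seed)
    n, boots = len(x), []
    for _ in range(reps):
        idx = rng.integers(0, n, n)
        boots.append(indirect_effect(x[idx], m[idx], y[idx]))
    return tuple(np.quantile(boots, [alpha / 2, 1 - alpha / 2]))
```

An interval excluding zero indicates a significant indirect effect, which is the criterion applied in the abstract.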
Symmetry as Bias: Rediscovering Special Relativity
NASA Technical Reports Server (NTRS)
Lowry, Michael R.
1992-01-01
This paper describes a rational reconstruction of Einstein's discovery of special relativity, validated through an implementation: the Erlanger program. Einstein's discovery of special relativity revolutionized both the content of physics and the research strategy used by theoretical physicists. This research strategy entails a mutual bootstrapping process between a hypothesis space for biases, defined through different postulated symmetries of the universe, and a hypothesis space for physical theories. The invariance principle mutually constrains these two spaces. The invariance principle enables detecting when an evolving physical theory becomes inconsistent with its bias, and also when the biases for theories describing different phenomena are inconsistent. Structural properties of the invariance principle facilitate generating a new bias when an inconsistency is detected. After a new bias is generated, this principle facilitates reformulating the old, inconsistent theory by treating the latter as a limiting approximation. The structural properties of the invariance principle can be suitably generalized to other types of biases to enable primal-dual learning.
Correcting for Optimistic Prediction in Small Data Sets
Smith, Gordon C. S.; Seaman, Shaun R.; Wood, Angela M.; Royston, Patrick; White, Ian R.
2014-01-01
The C statistic is a commonly reported measure of screening test performance. Optimistic estimation of the C statistic is a frequent problem because of overfitting of statistical models in small data sets, and methods exist to correct for this issue. However, many studies do not use such methods, and those that do correct for optimism use diverse methods, some of which are known to be biased. We used clinical data sets (United Kingdom Down syndrome screening data from Glasgow (1991–2003), Edinburgh (1999–2003), and Cambridge (1990–2006), as well as Scottish national pregnancy discharge data (2004–2007)) to evaluate different approaches to adjustment for optimism. We found that sample splitting, cross-validation without replication, and leave-1-out cross-validation produced optimism-adjusted estimates of the C statistic that were biased and/or associated with greater absolute error than other available methods. Cross-validation with replication, bootstrapping, and a new method (leave-pair-out cross-validation) all generated unbiased optimism-adjusted estimates of the C statistic and had similar absolute errors in the clinical data set. Larger simulation studies confirmed that all 3 methods performed similarly with 10 or more events per variable, or when the C statistic was 0.9 or greater. However, with lower events per variable or lower C statistics, bootstrapping tended to be optimistic but with lower absolute and mean squared errors than both methods of cross-validation. PMID:24966219
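The bootstrap optimism correction evaluated here (alongside the cross-validation variants) follows a standard recipe: refit the model on each bootstrap sample, compare its performance on that sample with its performance on the original data, and subtract the average gap from the apparent C statistic. The sketch below uses a linear scoring model and simulated data as illustrative assumptions, not the Down syndrome screening models:

```python
import numpy as np

def c_statistic(score, y):
    # Probability a random event outranks a random non-event (ties count 1/2)
    pos, neg = score[y == 1], score[y == 0]
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def fit_score(X, y):
    # Toy scoring model: linear least squares on the binary outcome
    design = np.column_stack([np.ones(len(X)), X])
    beta = np.linalg.lstsq(design, y, rcond=None)[0]
    return lambda Xnew: np.column_stack([np.ones(len(Xnew)), Xnew]) @ beta

def optimism_corrected_c(X, y, reps=200, seed=0):
    rng = np.random.default_rng(seed)
    apparent = c_statistic(fit_score(X, y)(X), y)
    optimism = []
    for _ in range(reps):
        idx = rng.integers(0, len(y), len(y))
        model = fit_score(X[idx], y[idx])
        # optimism = bootstrap-sample performance minus original-data performance
        optimism.append(c_statistic(model(X[idx]), y[idx]) -
                        c_statistic(model(X), y))
    return apparent - np.mean(optimism)
```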
Phylogenomics provides strong evidence for relationships of butterflies and moths.
Kawahara, Akito Y; Breinholt, Jesse W
2014-08-07
Butterflies and moths constitute some of the most popular and charismatic insects. Lepidoptera include approximately 160 000 described species, many of which are important model organisms. Previous studies on the evolution of Lepidoptera did not confidently place butterflies, and many relationships among superfamilies in the megadiverse clade Ditrysia remain largely uncertain. We generated a molecular dataset with 46 taxa, combining 33 new transcriptomes with 13 available genomes, transcriptomes and expressed sequence tags (ESTs). Using HaMStR with a Lepidoptera-specific core-orthologue set of single copy loci, we identified 2696 genes for inclusion into the phylogenomic analysis. Nucleotides and amino acids of the all-gene, all-taxon dataset yielded nearly identical, well-supported trees. Monophyly of butterflies (Papilionoidea) was strongly supported, and the group included skippers (Hesperiidae) and the enigmatic butterfly-moths (Hedylidae). Butterflies were placed sister to the remaining obtectomeran Lepidoptera, and the latter was grouped with greater than or equal to 87% bootstrap support. Establishing confident relationships among the four most diverse macroheteroceran superfamilies was previously challenging, but we recovered 100% bootstrap support for the following relationships: ((Geometroidea, Noctuoidea), (Bombycoidea, Lasiocampoidea)). We present the first robust, transcriptome-based tree of Lepidoptera that strongly contradicts historical placement of butterflies, and provide an evolutionary framework for genomic, developmental and ecological studies on this diverse insect order. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Maternal depression and trait anger as risk factors for escalated physical discipline.
Shay, Nicole L; Knutson, John F
2008-02-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory-Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline.
BOOTSTRAPPING THE CORONAL MAGNETIC FIELD WITH STEREO: UNIPOLAR POTENTIAL FIELD MODELING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aschwanden, Markus J.; Sandman, Anne W., E-mail: aschwanden@lmsal.co
We investigate the recently quantified misalignment of α_mis ≈ 20°-40° between the three-dimensional geometry of stereoscopically triangulated coronal loops observed with STEREO/EUVI (in four active regions (ARs)) and theoretical (potential or nonlinear force-free) magnetic field models extrapolated from photospheric magnetograms. We develop an efficient method of bootstrapping the coronal magnetic field by forward fitting a parameterized potential field model to the STEREO-observed loops. The potential field model consists of a number of unipolar magnetic charges that are parameterized by decomposing a photospheric magnetogram from the Michelson Doppler Imager. The forward-fitting method yields a best-fit magnetic field model with a reduced misalignment of α_PF ≈ 13°-20°. We also evaluate stereoscopic measurement errors and find a contribution of α_SE ≈ 7°-12°, which constrains the residual misalignment to α_NP ≈ 11°-17°, which is likely due to the nonpotentiality of the ARs. The residual misalignment angle, α_NP, of the potential field due to nonpotentiality is found to correlate with the soft X-ray flux of the AR, which implies a relationship between electric currents and plasma heating.
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-11-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
Qian, Jinping P.; Garofalo, Andrea M.; Gong, Xianzu Z.; ...
2017-03-20
Recent EAST/DIII-D joint experiments on the high poloidal beta (β_P) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H_98y2 ~ 1.6) to higher plasma current, for lower q_95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results. Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high β_P discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at β_N ~ 2.9 and q_95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Furthermore, results reported in this paper suggest that the DIII-D high β_P scenario could be a candidate for ITER steady state operation.
Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter
2011-04-13
The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that has both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
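The recommended Welch U test follows directly from its textbook definition. The sketch below computes the statistic and the Welch-Satterthwaite degrees of freedom; the normal-approximation p-value is a simplifying assumption standing in for the exact t-distribution tail, adequate for moderate samples:

```python
import math
import statistics

def welch_u_test(a, b):
    # Welch's t statistic: difference of means over the unpooled standard error
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    # Two-sided p-value via the normal approximation 2*(1 - Phi(|t|))
    p = math.erfc(abs(t) / math.sqrt(2))
    return t, df, p
```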
The effect of sheared toroidal rotation on pressure driven magnetic islands in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegna, C. C.
2016-05-15
The impact of sheared toroidal rotation on the evolution of pressure driven magnetic islands in tokamak plasmas is investigated using a resistive magnetohydrodynamics model augmented by a neoclassical Ohm's law. Particular attention is paid to the asymptotic matching data as the Mercier indices are altered in the presence of sheared flow. Analysis of the nonlinear island Grad-Shafranov equation shows that sheared flows tend to amplify the stabilizing pressure/curvature contribution to pressure driven islands in toroidal tokamaks relative to the island bootstrap current contribution. As such, sheared toroidal rotation tends to reduce saturated magnetic island widths.
Bootstrapping and Maintaining Trust in the Cloud
2016-03-16
of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that... Implementation: We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web... bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data
Reduced Power Laser Designation Systems
2008-06-20
200 kΩ, R1 = R3 = 60 kΩ, and R2 = R4 = 2 kΩ yields an overall transimpedance gain of 200K × 30 × 30 = 180 MV/A. Figure 3. Three-stage photodiode amplifier... transistor circuit for bootstrap buffering of the input stage, comparing the noise performance of the candidate amplifier designs, selecting the two-transistor bootstrap design as the circuit of choice, and comparing the performance of this circuit against that of a basic transconductance amplifier
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow identification of environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results show that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, the percentage of non-revenue water and customer density were found to influence the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
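The bootstrap-based probabilistic sensitivity analysis described above can be sketched as the estimation of one point on a cost-effectiveness acceptability curve: the probability that the incremental net benefit is positive at a given willingness-to-pay. The two-arm structure and data below are invented for illustration and are not the H. pylori model:

```python
import random
import statistics

def ceac_point(costs_t, eff_t, costs_c, eff_c, wtp, reps=2000, seed=7):
    # Resample patient-level (cost, effect) pairs within each arm and count
    # replicates with positive incremental net benefit INB = wtp*dE - dC.
    rng = random.Random(seed)
    arm_t = list(zip(costs_t, eff_t))
    arm_c = list(zip(costs_c, eff_c))
    wins = 0
    for _ in range(reps):
        bt = rng.choices(arm_t, k=len(arm_t))
        bc = rng.choices(arm_c, k=len(arm_c))
        d_cost = statistics.fmean(c for c, _ in bt) - statistics.fmean(c for c, _ in bc)
        d_eff = statistics.fmean(e for _, e in bt) - statistics.fmean(e for _, e in bc)
        wins += wtp * d_eff - d_cost > 0
    return wins / reps
```

Sweeping `wtp` over a grid traces out the full acceptability curve.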
Sabir, Jamal S M; Abo-Aba, Salah; Bafeel, Sameera; Zari, Talal A; Edris, Sherif; Shokry, Ahmed M; Atef, Ahmed; Gadalla, Nour O; Ramadan, Ahmed M; Al-Kordy, Magdy A; El-Domyati, Fotouh M; Jansen, Robert K; Bahieldin, Ahmed
2014-01-01
Date palm is the most economically important plant in the Middle East due to its nutritionally valuable fruit. The development of accurate DNA fingerprints to characterize cultivars and the detection of genetic diversity are of great value for breeding programs. The present study explores the usefulness of ISSR and AFLP molecular markers to detect relationships among 10 date palm (Phoenix dactylifera L.) cultivars from Saudi Arabia. Thirteen ISSR primers and six AFLP primer combinations were examined. The level of polymorphism among cultivars for ISSRs ranged from 20% to 100% with an average of 85%. Polymorphism levels for AFLPs ranged from 63% to 84% with an average of 76%. The total number of cultivar-specific markers was 241, 208 of which were generated from AFLP analysis. AJWA cultivar had the highest number of cultivar-specific ISSR markers, whereas DEK, PER, SUK-Q, SHA and MOS-H cultivars had the lowest. RAB and SHA cultivars had the most and least AFLP cultivar-specific markers, respectively. The highest pairwise similarity indices for ISSRs, AFLPs and combined markers were 84% between DEK (female) and PER (female), 81% between SUK-Q (male) and RAB (male), and 80% between SUK-Q (male) and RAB (male), respectively. The lowest similarity indices were 65% between TAB (female) and SUK-Q (male), 67% between SUK-A (female) and SUK-Q (male), and 67% between SUK-A (female) and SUK-Q (male). Cultivars of the same sex had higher pairwise similarities than those between cultivars of different sex. The Neighbor-Joining (NJ) tree generated from the ISSR dataset was not well resolved and bootstrap support for resolved nodes in the tree was low. AFLP and combined data generated completely resolved trees with high levels of bootstrap support. In conclusion, AFLP and ISSR approaches enabled discrimination among 10 date palm cultivars from Saudi Arabia, which will provide valuable information for future improvement of this important crop.
Seasonal comparisons of sea ice concentration estimates derived from SSM/I, OKEAN, and RADARSAT data
Belchansky, Gennady I.; Douglas, David C.
2002-01-01
The Special Sensor Microwave Imager (SSM/I) microwave satellite radiometer and its predecessor SMMR are primary sources of information for global sea ice and climate studies. However, comparisons of SSM/I, Landsat, AVHRR, and ERS-1 synthetic aperture radar (SAR) have shown substantial seasonal and regional differences in their estimates of sea ice concentration. To evaluate these differences, we compared SSM/I estimates of sea ice coverage derived with the NASA Team and Bootstrap algorithms to estimates made using RADARSAT, and OKEAN-01 satellite sensor data. The study area included the Barents Sea, Kara Sea, Laptev Sea, and adjacent parts of the Arctic Ocean, during October 1995 through October 1999. Ice concentration estimates from spatially and temporally near-coincident imagery were calculated using independent algorithms for each sensor type. The OKEAN algorithm implemented the satellite's two-channel active (radar) and passive microwave data in a linear mixture model based on the measured values of brightness temperature and radar backscatter. The RADARSAT algorithm utilized a segmentation approach of the measured radar backscatter, and the SSM/I ice concentrations were derived at National Snow and Ice Data Center (NSIDC) using the NASA Team and Bootstrap algorithms. Seasonal and monthly differences between SSM/I, OKEAN, and RADARSAT ice concentrations were calculated and compared. Overall, total sea ice concentration estimates derived independently from near-coincident RADARSAT, OKEAN-01, and SSM/I satellite imagery demonstrated mean differences of less than 5.5% (S.D.<9.5%) during the winter period. Differences between the SSM/I NASA Team and the SSM/I Bootstrap concentrations were no more than 3.1% (S.D.<5.4%) during this period. RADARSAT and OKEAN-01 data both yielded higher total ice concentrations than the NASA Team and the Bootstrap algorithms. The Bootstrap algorithm yielded higher total ice concentrations than the NASA Team algorithm. 
Total ice concentrations derived from OKEAN-01 and SSM/I satellite imagery were highly correlated during winter, spring, and fall, with mean differences of less than 8.1% (S.D.<15%) for the NASA Team algorithm, and less than 2.8% (S.D.<13.8%) for the Bootstrap algorithm. Respective differences between SSM/I NASA Team and SSM/I Bootstrap total concentrations were less than 5.3% (S.D.<6.9%). Monthly mean differences between SSM/I and OKEAN differed annually by less than 6%, with smaller differences primarily in winter. The NASA Team and Bootstrap algorithms underestimated the total sea ice concentrations relative to the RADARSAT ScanSAR no more than 3.0% (S.D.<9%) and 1.2% (S.D.<7.5%) during cold months, and no more than 12% and 7% during summer, respectively. ScanSAR tended to estimate higher ice concentrations for ice concentrations greater than 50%, when compared to SSM/I during all months. ScanSAR underestimated total sea ice concentration by 2% compared to the OKEAN-01 algorithm during cold months, and gave an overestimation by 2% during spring and summer months. Total NASA Team and Bootstrap sea ice concentration estimates derived from coincident SSM/I and OKEAN-01 data demonstrated mean differences of no more than 5.3% (S.D.<7%), 3.1% (S.D.<5.5%), 2.0% (S.D.<5.5%), and 7.3% (S.D.<10%) for fall, winter, spring, and summer periods, respectively. Large disagreements were observed between the OKEAN and NASA Team results in spring and summer for estimates of the first-year (FY) and multiyear (MY) age classes. The OKEAN-01 algorithm and data tended to estimate, on average, lower concentrations of young or FY ice and higher concentrations of total and MY ice for all months and seasons. Our results contribute to the growing body of documentation about the levels of disparity obtained when seasonal sea ice concentrations are estimated using various types of satellite data and algorithms.
Quantifying the risk of extreme aviation accidents
NASA Astrophysics Data System (ADS)
Das, Kumer Pial; Dey, Asim Kumer
2016-12-01
Air travel is considered a safe means of transportation. But when aviation accidents do occur, they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes; the worst four caused 298, 239, 162, and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from a future aviation accident. The fitted model is compared with some of its competitors. The uncertainty in the inferences is quantified using simulated aviation accident series, generated by bootstrap resampling and Monte Carlo simulations.
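The bootstrap-resampling step used to quantify uncertainty can be sketched in isolation (the generalized Pareto fit itself is omitted here): resample the accident series and recompute an empirical high quantile of per-accident fatalities. The data below are made up for illustration:

```python
import random
import statistics

def high_quantile(fatalities, q=0.95):
    # Empirical q-quantile of per-accident fatality counts
    cuts = statistics.quantiles(sorted(fatalities), n=100, method='inclusive')
    return cuts[int(q * 100) - 1]

def bootstrap_quantile_ci(fatalities, q=0.95, reps=2000, alpha=0.05, seed=11):
    # Percentile bootstrap interval for the q-quantile
    rng = random.Random(seed)
    est = sorted(high_quantile(rng.choices(fatalities, k=len(fatalities)), q)
                 for _ in range(reps))
    return est[int(alpha / 2 * reps)], est[int((1 - alpha / 2) * reps) - 1]
```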
A cluster bootstrap for two-loop MHV amplitudes
Golden, John; Spradlin, Marcus
2015-02-02
We apply a bootstrap procedure to two-loop MHV amplitudes in planar N=4 super-Yang-Mills theory. We argue that the mathematically most complicated part (the Λ²B₂ coproduct component) of the n-particle amplitude is uniquely determined by a simple cluster algebra property together with a few physical constraints (dihedral symmetry, analytic structure, supersymmetry, and well-defined collinear limits). Finally, we present a concise, closed-form expression which manifests these properties for all n.
Wrappers for Performance Enhancement and Oblivious Decision Graphs
1995-09-01
always select all relevant features. We test different search engines to search the space of feature subsets and introduce compound operators to speed... distinct instances from the original dataset appearing in the test set is thus 0.632m. The ε0i accuracy estimate is derived by using bootstrap sample... i for training and the rest of the instances for testing. Given a number b, the number of bootstrap samples, let ε0i be the accuracy estimate for
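The 0.632 bootstrap estimator this snippet refers to weights the resubstitution accuracy against the held-out accuracy ε0, since on average only 63.2% of the distinct instances appear in each bootstrap sample. A minimal sketch, with a majority-class "model" standing in for the induction algorithms tested in the report:

```python
import random

def majority_rule(train, test):
    # Toy learner: predict the majority label of the training sample
    labels = [lbl for _, lbl in train]
    pred = max(set(labels), key=labels.count)
    return sum(lbl == pred for _, lbl in test) / len(test)

def bootstrap_632(train_eval, data, b=50, seed=3):
    rng = random.Random(seed)
    resub = train_eval(data, data)  # apparent (resubstitution) accuracy
    eps0 = []
    for _ in range(b):
        sample = rng.choices(data, k=len(data))
        held_out = [d for d in data if d not in sample]  # ~36.8% of instances
        if held_out:
            eps0.append(train_eval(sample, held_out))
    return 0.368 * resub + 0.632 * (sum(eps0) / len(eps0))
```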
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on an experimental within design with 32 cells and 33 participants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven
Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
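The c statistic reported above is Harrell's concordance index. The sketch below is not the authors' code: the exponential survival model, the sample size, and the simplified handling of censoring are invented for illustration. It computes the index and bootstraps a confidence interval for it:

```python
import numpy as np

def c_statistic(time, event, risk):
    """Harrell's concordance index: among usable pairs, the fraction where
    the patient with the higher predicted risk has the shorter survival."""
    concordant, usable = 0.0, 0
    for i in range(len(time)):
        if not event[i]:
            continue  # a pair must be anchored by an observed event
        for j in range(len(time)):
            if time[j] > time[i]:
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / usable

rng = np.random.default_rng(0)
n = 120
risk = rng.normal(size=n)                    # hypothetical linear predictor
time = rng.exponential(scale=np.exp(-risk))  # higher risk -> earlier failure
event = rng.random(n) < 0.8                  # ~20% censored (simplified)

c_hat = c_statistic(time, event, risk)
boot = [c_statistic(time[idx], event[idx], risk[idx])
        for idx in (rng.integers(0, n, n) for _ in range(100))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"c = {c_hat:.2f}, 95% bootstrap CI ({lo:.2f}, {hi:.2f})")
```

A c of 0.5 is chance-level discrimination and 1.0 is perfect; the paper's 0.62 sits in the modest range typical of survival models built from routine clinical variables.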
PROGRESS IN THE PEELING-BALLOONING MODEL OF ELMS: TOROIDAL ROTATION AND 3D NONLINEAR DYNAMICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
SNYDER,P.B; WILSON,H.R; XU,X.Q
2004-06-01
Understanding the physics of the H-Mode pedestal and edge localized modes (ELMs) is very important to next-step fusion devices for two primary reasons: (1) The pressure at the top of the edge barrier (''pedestal height'') strongly impacts global confinement and fusion performance, and (2) large ELMs lead to localized transient heat loads on material surfaces that may constrain component lifetimes. The development of the peeling-ballooning model has shed light on these issues by positing a mechanism for ELM onset and constraints on the pedestal height. The mechanism involves instability of ideal coupled ''peeling-ballooning'' modes driven by the sharp pressure gradient and consequent large bootstrap current in the H-mode edge. It was first investigated in the local, high-n limit [1], and later quantified for non-local, finite-n modes in general toroidal geometry [2,3]. Important aspects are that a range of wavelengths may potentially be unstable, with intermediate n's (n ≈ 3-30) generally limiting in high performance regimes, and that stability bounds are strongly sensitive to shape [Fig. 1(a)], and to collisionality (i.e. temperature and density) [4] through the bootstrap current. The development of efficient MHD stability codes such as ELITE [3,2] and MISHKA [5] has allowed detailed quantification of peeling-ballooning stability bounds (e.g. [6]) and extensive and largely successful comparisons with observation (e.g. [2,6-9]). These previous calculations are ideal, static, and linear. Here we extend this work to incorporate the impact of sheared toroidal rotation, and the non-ideal, nonlinear dynamics which must be studied to quantify ELM size and heat deposition on material surfaces.
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in the food and pharmaceutical industries, yet there is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on the generalized additive model (GAM) and the bootstrap has not been used for fault diagnosis in fermentation processes, much less in the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters, using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate, and carbon dioxide evolution rate captured 99.6 % of the variance in glutamate production during the fermentation process. The bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of a fault in the fermentation process, but also its end, when the fermentation conditions returned to normal. The proposed approach required only a small sample set from normal fermentation experiments to establish the model, and thereafter only online recorded fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way to diagnose faults in the fermentation process of glutamate with small sample sets.
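The approach reduces to fitting a smoother on normal-run data, bootstrapping the fit to obtain a 95 % band, and flagging observations that leave the band. A minimal sketch, with a polynomial smoother standing in for the GAM and all curves and numbers invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Glutamate production" versus fermentation time for one normal run; a
# degree-5 polynomial stands in for the fitted GAM smoother (assumption).
t = np.linspace(0, 40, 80)                      # fermentation time (h)
y_true = 50 / (1 + np.exp(-(t - 20) / 4))       # underlying growth curve
obs = y_true + rng.normal(0, 1.5, size=t.size)  # noisy normal-run data

B = 500
preds = np.empty((B, t.size))
for b in range(B):
    idx = rng.integers(0, t.size, t.size)       # resample (t, obs) pairs
    preds[b] = np.polyval(np.polyfit(t[idx], obs[idx], 5), t)

lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)  # 95% bootstrap band

# Online fault diagnosis: flag time points where a new run leaves the band.
new_run = y_true.copy()
new_run[50:] -= 8.0                             # simulated fault after ~25 h
fault = (new_run < lo) | (new_run > hi)
print(f"faults flagged at {fault.sum()} of {t.size} time points")
```

The band here reflects fit uncertainty only, which is a simplification; the fault both appears and disappears in the flags, mirroring the start-and-end detection described in the abstract.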
Wang, Jung-Han; Abdel-Aty, Mohamed; Wang, Ling
2017-07-01
Many studies have used different methods, for example empirical Bayes before-after methods, to obtain accurate estimates of crash modification factors (CMFs). Each makes different assumptions about the crash count that would have occurred had there been no treatment. Another major assumption is that multiple sites share the same true CMF: the CMF at an individual intersection is randomly drawn from a normally distributed population of CMFs at all intersections. Since CMFs are non-zero values, the population of all CMFs might not follow a normal distribution, and even if it does, the true mean of CMFs at some intersections may differ from that at others. Therefore, a bootstrap method based on before-after empirical Bayes theory was proposed to estimate CMFs without making distributional assumptions. This bootstrap procedure has the added benefit of producing a measure of CMF stability. Furthermore, based on the bootstrapped CMF, a new CMF precision rating method was proposed to evaluate the reliability of CMFs. This study chose 29 urban four-legged intersections, whose control was converted from stop to signal, as treated sites. Meanwhile, 124 urban four-legged stop-controlled intersections were selected as reference sites. At first, different safety performance functions (SPFs) were applied to five crash categories, and it was found that each crash category had a different optimal SPF form. Then, the CMFs of these five crash categories were estimated using the bootstrap empirical Bayes method. The results showed that signalization significantly decreased Angle+Left-Turn crashes, and its CMF had the highest precision, whereas the CMF for Rear-End crashes was unreliable. For KABCO, KABC, and KAB crashes, the CMFs proved reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. Copyright © 2017 Elsevier Ltd.
All rights reserved.
NASA Astrophysics Data System (ADS)
Keylock, C. J.
2017-03-01
An algorithm is described that can generate random variants of a time series while preserving the probability distribution of original values and the pointwise Hölder regularity. Thus, it preserves the multifractal properties of the data. Our algorithm is similar in principle to well-known algorithms based on the preservation of the Fourier amplitude spectrum and original values of a time series. However, it is underpinned by a dual-tree complex wavelet transform rather than a Fourier transform. Our method, which we term the iterated amplitude adjusted wavelet transform, can be used to generate bootstrapped versions of multifractal data, and because it preserves the pointwise Hölder regularity but not the local Hölder regularity, it can be used to test hypotheses concerning the presence of oscillating singularities in a time series, an important feature of turbulence and econophysics data. Because the locations of the data values are randomized with respect to the multifractal structure, hypotheses about their mutual coupling can be tested, which is important for the velocity-intermittency structure of turbulence and self-regulating processes.
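The wavelet machinery is beyond a short example, but the iteration the method builds on can be shown with its classical Fourier-based ancestor (the iterated amplitude adjusted Fourier transform), which alternates between enforcing the amplitude spectrum and enforcing the original value distribution:

```python
import numpy as np

def iaaft(x, n_iter=100, seed=0):
    """Iterated Amplitude Adjusted Fourier Transform surrogate: the classical
    Fourier-based ancestor of the wavelet method described above. Returns a
    shuffled series keeping both the Fourier amplitude spectrum (approximately)
    and the exact distribution of values of x."""
    rng = np.random.default_rng(seed)
    amp = np.abs(np.fft.rfft(x))    # target Fourier amplitudes
    sorted_x = np.sort(x)           # target value distribution
    s = rng.permutation(x)          # start from a random shuffle
    for _ in range(n_iter):
        # step 1: enforce the amplitude spectrum, keeping current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(amp * np.exp(1j * phases), n=len(x))
        # step 2: enforce the original values by rank mapping
        ranks = np.argsort(np.argsort(s))
        s = sorted_x[ranks]
    return s

x = np.cumsum(np.random.default_rng(42).normal(size=256))  # a random walk
s = iaaft(x)
print(np.allclose(np.sort(s), np.sort(x)))  # values preserved exactly
```

The wavelet version replaces the Fourier step with a dual-tree complex wavelet transform, which is what lets it preserve pointwise Hölder regularity rather than only the amplitude spectrum.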
Morales-Bayuelo, Alejandro; Ayazo, Hernan; Vivas-Reyes, Ricardo
2010-10-01
Comparative molecular similarity indices analysis (CoMSIA) and comparative molecular field analysis (CoMFA) were performed on a series of bicyclo[4.1.0]heptane derivatives as melanin-concentrating hormone receptor R1 antagonists (MCHR1 antagonists). Molecular superimposition of antagonists on the template structure was performed by the database alignment method. A statistically significant model was established on sixty-five molecules and validated by a test set of ten molecules. The CoMSIA model yielded the best predictive model with q(2) = 0.639, non-cross-validated R(2) of 0.953, F value of 92.802, bootstrapped R(2) of 0.971, standard error of prediction = 0.402, and standard error of estimate = 0.146, while the CoMFA model yielded q(2) = 0.680, non-cross-validated R(2) of 0.922, F value of 114.351, bootstrapped R(2) of 0.925, standard error of prediction = 0.364, and standard error of estimate = 0.180. CoMFA analysis maps were employed to generate a pseudo cavity for the LeapFrog calculation. The contour maps obtained from the 3D-QSAR studies were appraised for activity trends in the molecules analyzed. The results show the variability of the steric and electrostatic contributions that determine the activity of the MCHR1 antagonists. On this basis we propose new antagonists that may be more potent than those previously reported; these were designed by adding highly electronegative groups to the di(i-C(3)H(7))N- substituent of the bicyclo[4.1.0]heptanes, using the CoMFA model, which was also used for molecular design with the LeapFrog technique. The data generated from the present study will further help to design novel, potent, and selective MCHR1 antagonists. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
Elastic S-matrices in (1 + 1) dimensions and Toda field theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christe, P.; Mussardo, G.
Particular deformations of 2-D conformal field theory lead to integrable massive quantum field theories. These can be characterized by the relative scattering data. This paper proposes a general scheme for classifying the elastic nondegenerate S-matrix in (1 + 1) dimensions starting from the possible bootstrap processes and the spins of the conserved currents. Their identification with the S-matrix coming from the Toda field theory is analyzed. The authors discuss both cases of Toda field theory constructed with the simply-laced Dynkin diagrams and the nonsimply-laced ones. The authors present the results of the perturbative analysis and their geometrical interpretations.
Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.
Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K
2011-01-01
We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Uncertainty estimation of diffusion parameters from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameters uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
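The wild-bootstrap step itself is simple to sketch: refit the model to pseudo-data built from the fitted curve plus sign-flipped residuals. The example below uses a log-linear fit of a mono-exponential diffusion model with invented b-values and noise, and omits the paper's unscented-transform rescaling step:

```python
import numpy as np

rng = np.random.default_rng(3)

# Mono-exponential diffusion model S(b) = S0 * exp(-b * ADC), fit log-linearly.
# All values are illustrative, not taken from the paper.
b = np.array([0., 50., 100., 200., 400., 600., 800.])  # b-values, s/mm^2
S0_true, adc_true = 1000.0, 1.5e-3
S = S0_true * np.exp(-b * adc_true) * (1 + rng.normal(0, 0.03, b.size))

def fit_adc(signal):
    slope, intercept = np.polyfit(b, np.log(signal), 1)
    return -slope  # ADC estimate from the log-linear slope

adc_hat = fit_adc(S)
log_fit = np.polyval(np.polyfit(b, np.log(S), 1), b)
resid = np.log(S) - log_fit

# Wild bootstrap: flip each residual with a random sign (Rademacher weights),
# preserving heteroscedasticity without assuming exchangeable residuals.
boot = []
for _ in range(1000):
    v = rng.choice([-1.0, 1.0], size=b.size)
    S_star = np.exp(log_fit + v * resid)
    boot.append(fit_adc(S_star))
se = np.std(boot)
print(f"ADC = {adc_hat:.2e} +/- {se:.1e} mm^2/s")
```

Because each pseudo-signal reuses its own residual, voxels with larger residuals automatically yield wider ADC uncertainty, which is the property the wild bootstrap is chosen for.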
López, Erick B; Yamashita, Takashi
2017-02-01
This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009 to 2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the intake frequencies of five kinds of vegetables. Acculturation was measured by the degree of English language use at home. A path model with a bootstrapping technique was employed for the mediation analysis. A significant partial mediation relationship was identified. Greater acculturation was associated with higher income and, in turn, greater vegetable consumption [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)]. At the same time, greater acculturation was directly associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. These findings on income as a mediator of the acculturation-dietary behavior relationship inform intervention programs and policy changes to address health disparities by race/ethnicity.
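A bootstrapped mediation test of this kind resamples the data and recomputes the indirect effect (the product of the two path coefficients) each time. A minimal sketch with simulated stand-ins for the survey variables; the study used bias-corrected intervals, while this example uses plain percentile intervals for brevity:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Simulated variables standing in for the survey measures (illustrative only):
# acculturation -> income -> vegetable consumption, plus a direct path.
acc = rng.normal(size=n)
income = 0.4 * acc + rng.normal(size=n)
veg = 0.5 * income - 0.3 * acc + rng.normal(size=n)

def indirect_effect(a_var, m_var, y_var):
    a = np.polyfit(a_var, m_var, 1)[0]               # path a: acc -> income
    X = np.column_stack([np.ones_like(a_var), m_var, a_var])
    b = np.linalg.lstsq(X, y_var, rcond=None)[0][1]  # path b: income -> veg | acc
    return a * b

est = indirect_effect(acc, income, veg)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(acc[idx], income[idx], veg[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {est:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
# mediation is supported when the interval excludes zero
```

Here the true indirect effect is 0.4 x 0.5 = 0.2, so the interval excludes zero, the analogue of the study's significant income-mediated path.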
How to bootstrap a human communication system.
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs. © 2013 Cognitive Science Society, Inc.
Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for a more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. PMID:26396090
NASA Astrophysics Data System (ADS)
Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.
2013-12-01
Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to lacking information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers. We capture the dependence of model output on these multipliers with the expansion-based reduced model. We then combined the aPC with bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty.
The usually high computational costs of accurate filtering become very feasible for our suggested aPC-based calibration framework. However, the power of aPC-based Bayesian updating strongly depends on the accuracy of prior information. In the current study, the prior assumptions on the model parameters were not satisfactory and strongly underestimate the reservoir pressure. Thus, the aPC-based response surface used in Bootstrap filtering is fitted to a distant and poorly chosen region within the parameter space. Thanks to the iterative procedure suggested in [2] we overcome this drawback with small computational costs. The iteration successively improves the accuracy of the expansion around the current estimation of the posterior distribution. The final result is a calibrated model of the site that can be used for further studies, with an excellent match to the data. References [1] Oladyshkin S. and Nowak W. Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion. Reliability Engineering and System Safety, 106:179-190, 2012. [2] Oladyshkin S., Class H., Nowak W. Bayesian updating via Bootstrap filtering combined with data-driven polynomial chaos expansions: methodology and application to history matching for carbon dioxide storage in geological formations. Computational Geosciences, 17 (4), 671-687, 2013.
Kolmogorov-Smirnov test for spatially correlated data
Olea, R.A.; Pawlowsky-Glahn, V.
2009-01-01
The Kolmogorov-Smirnov test is a convenient method for investigating whether two underlying univariate probability distributions can be regarded as undistinguishable from each other or whether an underlying probability distribution differs from a hypothesized distribution. Application of the test requires that the sample be unbiased and the outcomes be independent and identically distributed, conditions that are violated in several degrees by spatially continuous attributes, such as topographical elevation. A generalized form of the bootstrap method is used here for the purpose of modeling the distribution of the statistic D of the Kolmogorov-Smirnov test. The innovation is in the resampling, which in the traditional formulation of bootstrap is done by drawing from the empirical sample with replacement presuming independence. The generalization consists of preparing resamplings with the same spatial correlation as the empirical sample. This is accomplished by reading the value of unconditional stochastic realizations at the sampling locations, realizations that are generated by simulated annealing. The new approach was tested by two empirical samples taken from an exhaustive sample closely following a lognormal distribution. One sample was a regular, unbiased sample while the other one was a clustered, preferential sample that had to be preprocessed. Our results show that the p-value for the spatially correlated case is always larger than the p-value of the statistic in the absence of spatial correlation, which is in agreement with the fact that the information content of an uncorrelated sample is larger than the one for a spatially correlated sample of the same size. © Springer-Verlag 2008.
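The classical (independence) form of this bootstrap is easy to sketch; the paper's innovation is to replace the resampling step with spatially correlated realizations generated by simulated annealing. Baseline version, with invented data:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(11)

def ks_d(x, cdf):
    """Two-sided Kolmogorov-Smirnov statistic D for a sample against a CDF."""
    xs = np.sort(x)
    n = xs.size
    F = cdf(xs)
    return max((np.arange(1, n + 1) / n - F).max(),
               (F - np.arange(n) / n).max())

normal_cdf = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / 2 ** 0.5)))
sample = rng.normal(size=100)
d_obs = ks_d(sample, normal_cdf)          # D against the hypothesized N(0,1)

# Classical bootstrap null distribution of D: draw from the empirical sample
# with replacement, presuming independence, and compare to the empirical CDF.
# The paper's generalization replaces this resampling step with realizations
# that carry the sample's spatial correlation.
ecdf = lambda v: np.searchsorted(np.sort(sample), v, side="right") / sample.size
d_null = np.array([ks_d(sample[rng.integers(0, 100, 100)], ecdf)
                   for _ in range(1000)])
p_value = (d_null >= d_obs).mean()
print(f"D = {d_obs:.3f}, bootstrap p-value = {p_value:.3f}")
```

With spatially correlated resamples, the null distribution of D widens, which is why the paper finds systematically larger p-values than this independence baseline.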
Fluoxetine and imipramine: are there differences in cost-utility for depression in primary care?
Serrano-Blanco, Antoni; Suárez, David; Pinto-Meza, Alejandra; Peñarrubia, Maria T; Haro, Josep Maria
2009-02-01
Depressive disorders generate severe personal burden and high economic costs. Cost-utility analyses of the different therapeutical options are crucial to policy-makers and clinicians. Previous cost-utility studies, comparing selective serotonin reuptake inhibitors and tricyclic antidepressants, have used modelling techniques or have not included indirect costs in the economic analyses. To determine the cost-utility of fluoxetine compared with imipramine for treating depressive disorders in primary care. A 6-month randomized prospective naturalistic study comparing fluoxetine with imipramine was conducted in three primary care centres in Spain. One hundred and three patients requiring antidepressant treatment for a DSM-IV depressive disorder were included in the study. Patients were randomized either to fluoxetine (53 patients) or to imipramine (50 patients) treatment. Patients were treated with antidepressants according to their general practitioner's usual clinical practice. Outcome measures were the quality of life tariff of the European Quality of Life Questionnaire: EuroQoL-5D (five domains), direct costs, indirect costs and total costs. Subjects were evaluated at the beginning of treatment and after 1, 3 and 6 months. Incremental cost-utility ratios (ICUR) were obtained. To address uncertainty in the ICUR's sampling distribution, non-parametric bootstrapping was carried out. Taking into account adjusted total costs and incremental quality of life gained, imipramine dominated fluoxetine with 81.5% of the bootstrap replications in the dominance quadrant. Imipramine seems to be a better cost-utility antidepressant option for treating depressive disorders in primary care.
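The bootstrap step can be sketched as resampling both arms and counting how often the incremental cost-utility pair falls in the dominance quadrant (cheaper and more effective). All numbers below are invented for the sketch, not trial data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50  # patients per arm; distributions and parameters are illustrative

cost_flx = rng.gamma(4.0, 200.0, n)   # fluoxetine arm: mean cost ~800
cost_imi = rng.gamma(4.0, 150.0, n)   # imipramine arm: mean cost ~600
qaly_flx = rng.normal(0.60, 0.12, n)  # utilities over the follow-up period
qaly_imi = rng.normal(0.70, 0.12, n)

B = 2000
dominant = 0
for _ in range(B):
    i, j = rng.integers(0, n, n), rng.integers(0, n, n)
    d_cost = cost_imi[j].mean() - cost_flx[i].mean()  # incremental cost
    d_qaly = qaly_imi[j].mean() - qaly_flx[i].mean()  # incremental utility
    if d_cost < 0 and d_qaly > 0:                     # cheaper AND more QALYs
        dominant += 1
print(f"{100 * dominant / B:.1f}% of replications in the dominance quadrant")
```

The study's 81.5 % figure is exactly this kind of quantity: the share of bootstrap replications in which imipramine both costs less and yields more utility, sidestepping the ill-behaved sampling distribution of the ICUR ratio itself.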
Using In Vitro High-Throughput Screening Data for Predicting ...
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values and risk screening values. We aim to use computational toxicology and quantitative high throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. By coupling qHTS data with adverse outcome pathways (AOPs) we can use ontologies to make predictions about potential hazards and to identify those assays which are sufficient to infer these same hazards. Once those assays are identified, we can use bootstrap natural spline-based metaregression to integrate the evidence across multiple replicates or assays (if a combination of assays are together necessary to be sufficient). In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene (B[k]F) may induce DNA damage and steatosis using qHTS data and two separate AOPs. We also demonstrate how bootstrap natural spline-based metaregression can be used to integrate the data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an internal point of departure of 0.751µM and risk-specific concentrations of 0.378µM for both 1:1,000 and 1:10,000 additive risk for B[k]F induced DNA damage based on the p53 assay. Based on the available evidence, we
Bootstrapping quarks and gluons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity ±1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Li, Wen; Zhao, Li-Zhong; Ma, Dong-Wang; Wang, De-Zheng; Shi, Lei; Wang, Hong-Lei; Dong, Mo; Zhang, Shu-Yi; Cao, Lei; Zhang, Wei-Hua; Zhang, Xi-Peng; Zhang, Qing-Huai; Yu, Lin; Qin, Hai; Wang, Xi-Mo; Chen, Sam Li-Sheng
2018-05-01
We aimed to predict colorectal cancer (CRC) based on the demographic features and clinical correlates of personal symptoms and signs from Tianjin community-based CRC screening data. A total of 891,199 residents who were aged 60 to 74 and were screened in 2012 were enrolled. The Lasso logistic regression model was used to identify the predictors for CRC. Predictive validity was assessed by the receiver operating characteristic (ROC) curve. A bootstrapping method was also performed to validate the prediction model. CRC was best predicted by a model that included age, sex, education level, occupation, diarrhea, constipation, colon mucosa and bleeding, gallbladder disease, a stressful life event, family history of CRC, and a positive fecal immunochemical test (FIT). The area under the curve (AUC) for the questionnaire with a FIT was 84% (95% CI: 82%-86%), followed by 76% (95% CI: 74%-79%) for a FIT alone, and 73% (95% CI: 71%-76%) for the questionnaire alone. With 500 bootstrap replications, the estimated optimism (<0.005) indicated good discrimination in the validation of the prediction model. A risk prediction model for CRC based on a series of symptoms and signs related to enteric diseases, in combination with a FIT, was developed from the first round of screening. The results of the current study are useful for increasing the awareness of high-risk subjects and for individual-risk-guided invitations or strategies to achieve mass screening for CRC.
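Bootstrap validation of this kind estimates the model's optimism: the average gap between a bootstrap model's apparent AUC and its AUC on the original data. A sketch with simulated data, using a simple linear score in place of the Lasso logistic model (an assumption made to keep the example self-contained):

```python
import numpy as np

def auc(y, score):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(score)
    ranks = np.empty(len(score))
    ranks[order] = np.arange(1, len(score) + 1)
    pos = y == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

rng = np.random.default_rng(9)
n, p = 400, 8
X = rng.normal(size=(n, p))
beta = np.array([0.8, -0.5, 0.4, 0, 0, 0, 0, 0])   # only 3 true predictors
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ beta)))).astype(int)

def fit_score(Xt, yt, Xe):
    coef = np.linalg.lstsq(Xt, yt, rcond=None)[0]  # linear-score stand-in
    return Xe @ coef

apparent = auc(y, fit_score(X, y, X))
# Optimism: (AUC of bootstrap model on its own sample) minus
# (AUC of the same model on the original data), averaged over replications.
optimism = np.mean([
    auc(y[i], fit_score(X[i], y[i], X[i])) - auc(y, fit_score(X[i], y[i], X))
    for i in (rng.integers(0, n, n) for _ in range(200))
])
print(f"apparent AUC {apparent:.3f}, optimism-corrected {apparent - optimism:.3f}")
```

A tiny optimism, like the study's <0.005 over 500 replications, indicates the apparent AUC is barely inflated by overfitting.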
BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation
Kiefer, Christina; Fehlmann, Tobias; Backes, Christina
2017-01-01
Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498
Classifier performance prediction for computer-aided diagnosis using a limited dataset.
Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir
2008-04-01
In a practical classifier design problem, the true population is generally unknown and the available sample is finite-sized. A common approach is to use a resampling technique to estimate the performance of the classifier that will be trained with the available sample. We conducted a Monte Carlo simulation study to compare the ability of the different resampling techniques in training the classifier and predicting its performance under the constraint of a finite-sized sample. The true population for the two classes was assumed to be multivariate normal distributions with known covariance matrices. Finite sets of sample vectors were drawn from the population. The true performance of the classifier is defined as the area under the receiver operating characteristic curve (AUC) when the classifier designed with the specific sample is applied to the true population. We investigated methods based on the Fukunaga-Hayes and the leave-one-out techniques, as well as three different types of bootstrap methods, namely, the ordinary, 0.632, and 0.632+ bootstrap. The Fisher's linear discriminant analysis was used as the classifier. The dimensionality of the feature space was varied from 3 to 15. The sample size n2 from the positive class was varied between 25 and 60, while the number of cases from the negative class was either equal to n2 or 3n2. Each experiment was performed with an independent dataset randomly drawn from the true population. Using a total of 1000 experiments for each simulation condition, we compared the bias, the variance, and the root-mean-squared error (RMSE) of the AUC estimated using the different resampling techniques relative to the true AUC (obtained from training on a finite dataset and testing on the population). 
Our results indicated that, under the study conditions, there can be a large difference in the RMSE obtained using different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Under this type of conditions, the 0.632 and 0.632+ bootstrap methods have the lowest RMSE, indicating that the difference between the estimated and the true performances obtained using the 0.632 and 0.632+ bootstrap will be statistically smaller than those obtained using the other three resampling methods. Of the three bootstrap methods, the 0.632+ bootstrap provides the lowest bias. Although this investigation is performed under some specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited dataset.
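The 0.632 estimator itself is a weighted average of the apparent (resubstitution) AUC and the out-of-bag bootstrap AUC. A sketch under the study's setup (Fisher's linear discriminant on small multivariate normal samples), with invented dimensions and effect size:

```python
import numpy as np

rng = np.random.default_rng(13)

def auc(y, s):
    """AUC as the fraction of positive-negative pairs ranked correctly."""
    pos, neg = s[y == 1], s[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

n, p = 60, 8                       # small sample, moderate dimensionality
mu = np.r_[0.8, np.zeros(p - 1)]   # class means differ along one direction
y = np.r_[np.ones(n // 2), np.zeros(n // 2)].astype(int)
X = rng.normal(size=(n, p)) + np.outer(y, mu)

def fisher_ld(Xt, yt, Xe):
    """Fisher's linear discriminant score (pooled within-class covariance)."""
    m1, m0 = Xt[yt == 1].mean(0), Xt[yt == 0].mean(0)
    Sw = np.cov(Xt[yt == 1].T) + np.cov(Xt[yt == 0].T)
    return Xe @ np.linalg.solve(Sw, m1 - m0)

app = auc(y, fisher_ld(X, y, X))             # apparent (resubstitution) AUC
oob_aucs = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    oob = np.setdiff1d(np.arange(n), idx)    # cases left out of the resample
    if len(np.unique(y[idx])) < 2 or len(np.unique(y[oob])) < 2:
        continue                             # skip degenerate resamples
    oob_aucs.append(auc(y[oob], fisher_ld(X[idx], y[idx], X[oob])))
auc632 = 0.368 * app + 0.632 * np.mean(oob_aucs)  # the 0.632 bootstrap estimate
print(f"apparent {app:.3f}  out-of-bag {np.mean(oob_aucs):.3f}  0.632 {auc632:.3f}")
```

The weighting pulls the optimistic apparent AUC toward the pessimistic out-of-bag AUC; the 0.632+ variant additionally adapts the weight to the degree of overfitting, which is why it showed the lowest bias in the study.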
Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E
2014-06-01
Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods that analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), both of which are stratified, clustered unequal-probability-of-selection sample designs.
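The core idea, resampling with posterior weights that also undo the design's unequal selection probabilities, can be sketched with a weighted Bayesian bootstrap; the stratification and clustering handled in the paper are omitted here, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(17)

# Toy finite population in which y drives its own selection probability, so
# the unweighted mean of a sample is biased upward (all numbers illustrative).
N = 20_000
y_pop = rng.normal(50.0, 10.0, N)
p_sel = y_pop - y_pop.min() + 1.0
p_sel /= p_sel.sum()                  # unequal selection probabilities
idx = rng.choice(N, size=1000, replace=False, p=p_sel)
y = y_pop[idx]
w = 1.0 / (N * p_sel[idx])            # design (inverse-probability) weights

# Bayesian bootstrap: draw Dirichlet(1,...,1) posterior weights and scale
# them by the design weights, so each draw behaves like a synthetic
# population that "inverts" the sample design.
means = []
for _ in range(500):
    g = rng.dirichlet(np.ones(y.size)) * w
    g /= g.sum()
    means.append(np.sum(g * y))
means = np.asarray(means)
print(f"naive mean {y.mean():.2f}  design-adjusted posterior mean "
      f"{means.mean():.2f}  true population mean {y_pop.mean():.2f}")
```

The naive sample mean stays biased, while the design-adjusted posterior draws center on the true population mean, which is the sense in which the synthetic populations can then be analyzed as simple random samples.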
Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.
2017-01-01
Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs. PMID:29200608
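The core resampling step behind this kind of synthetic-population generation can be sketched as a weighted Polya-urn expansion of the sample, where the survey weights set the initial urn masses. This is a heavily simplified stand-in for the full finite population Bayesian bootstrap, with all names hypothetical:

```python
import random

def synthetic_population(sample, weights, pop_size, seed=0):
    # Polya-urn sketch: draw units with probability proportional to current
    # mass; a drawn unit's mass grows by one, so early draws are reinforced.
    # Initial masses are the survey weights, which "inverts" the unequal
    # selection probabilities in the simplest possible way.
    rng = random.Random(seed)
    mass = list(weights)  # current urn masses, one per sampled unit
    pop = []
    for _ in range(pop_size - len(sample)):
        total = sum(mass)
        r = rng.uniform(0.0, total)
        acc = 0.0
        for i, m in enumerate(mass):
            acc += m
            if r <= acc:
                pop.append(sample[i])
                mass[i] += 1.0  # urn update
                break
    return list(sample) + pop
```

A real implementation would also respect strata and clusters; this sketch only conveys the weight-driven resampling idea.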
Kinetic effects on the currents determining the stability of a magnetic island in tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poli, E., E-mail: emanuele.poli@ipp.mpg.de; Bergmann, A.; Casson, F. J.
The role of the bootstrap and polarization currents for the stability of neoclassical tearing modes is investigated employing both a drift-kinetic and a gyrokinetic approach. The adiabatic response of the ions around the island separatrix implies, for island widths below or around the ion thermal banana width, density flattening for islands rotating at the ion diamagnetic frequency, while for islands rotating at the electron diamagnetic frequency the density is unperturbed and the only contribution to the neoclassical drive arises from electron temperature flattening. As for the polarization current, the full inclusion of finite-orbit-width effects in the calculation of the potential developing in a rotating island leads to a smoothing of the discontinuous derivatives exhibited by the analytic potential on which the polarization term used in the modeling is based. This leads to a reduction of the polarization-current contribution with respect to the analytic estimate, in line with other studies. Other contributions to the perpendicular ion current, related to the response of the particles around the island separatrix, are found to compete with or even dominate the polarization-current term for realistic island rotation frequencies.
Maternal Depression and Trait Anger as Risk Factors for Escalated Physical Discipline
Shay, Nicole L.; Knutson, John F.
2008-01-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory–Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with both the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline. PMID:18174347
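Bootstrap analysis of an indirect effect, as referenced here, generically resamples cases and recomputes the product of the a-path (predictor to mediator) and b-path (mediator to outcome) slopes, then reads off percentile limits. A simplified sketch, using simple rather than covariate-adjusted slopes for the b-path, so it is not the authors' exact model:

```python
import random
import statistics

def indirect_effect_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    # Percentile-bootstrap CI for the indirect effect a*b in a simple
    # mediation chain x -> m -> y. NOTE: a full mediation model would use
    # the partial slope of y on m controlling for x; this sketch does not.
    def slope(u, v):
        mu, mv = statistics.fmean(u), statistics.fmean(v)
        num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
        den = sum((a - mu) ** 2 for a in u)
        return num / den

    rng = random.Random(seed)
    n = len(x)
    ests = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]  # resample cases
        xb = [x[i] for i in idx]
        mb = [m[i] for i in idx]
        yb = [y[i] for i in idx]
        ests.append(slope(xb, mb) * slope(mb, yb))  # a-path * b-path
    ests.sort()
    lo = ests[int((alpha / 2) * n_boot)]
    hi = ests[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

The effect is declared significant when the percentile interval excludes zero, which is how such analyses establish mediation without assuming normality of the product term.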
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Xian-Qu; Zhang, Rui-Bin; Meng, Guo
2016-07-15
The destabilization of ideal internal kink modes by trapped fast particles in tokamak plasmas with a “shoulder”-like equilibrium current is investigated. It is found that the energetic-particle branch of the mode is unstable under the driving of fast-particle precession drifts and corresponds to a precessional fishbone. The mode has a low stability threshold and is more easily excited than the conventional precessional fishbone. This differs from earlier studies of the same equilibrium, in which the magnetohydrodynamic (MHD) branch of the mode is stable. Furthermore, the stability and characteristic frequency of the mode are analyzed by solving the dispersion relation and comparing with the conventional fishbone. The results suggest that an equilibrium with a locally flattened q-profile, which may be modified by localized current drive (or bootstrap current, etc.), is prone to the onset of the precessional fishbone branch of the mode.
A compact human-powered energy harvesting system
NASA Astrophysics Data System (ADS)
Rao, Yuan; McEachern, Kelly M.; Arnold, David P.
2013-12-01
This paper presents a fully functional, self-sufficient body-worn energy harvesting system for passively capturing energy from human motion, with the long-term vision of supplying power to portable, wearable, or even implanted electronic devices. The system requires no external power supplies and can bootstrap from zero-state-of-charge to generate electrical energy from walking, jogging and cycling; convert the induced ac voltage to a dc voltage; and then boost and regulate the dc voltage to charge a Li-ion-polymer battery. Tested under normal human activities (walking, jogging, cycling) when worn on different parts of the body, the 70 cm3 system is shown to charge a 3.7 V rechargeable battery at charge rates ranging from 33 μW to 234 μW.
A new ophiovirus is associated with blueberry mosaic disease.
Thekke-Veetil, Thanuja; Ho, Thien; Keller, Karen E; Martin, Robert R; Tzanetakis, Ioannis E
2014-08-30
Blueberry mosaic disease (BMD) was first described more than 60 years ago and is caused by a yet unidentified graft transmissible agent. A combination of traditional methods and next generation sequencing disclosed the presence of a new ophiovirus in symptomatic plants. The virus was detected in all BMD samples collected from several production areas of North America and was thus named blueberry mosaic associated virus. Phylogenetic analysis, supported by high bootstrap values, places the virus within the family Ophioviridae. The genome organization resembles that of citrus psorosis virus, the type member of the genus Ophiovirus. The implications of this discovery in BMD control and blueberry virus certification schemes are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Early Childhood Adversity and Pregnancy Outcomes
Smith, Megan V.; Gotman, Nathan; Yonkers, Kimberly A.
2016-01-01
Objectives To examine the association between adverse childhood experiences (ACEs) and pregnancy outcomes; to explore mediators of this association including psychiatric illness and health habits. Methods Exposure to ACEs was determined by the Early Trauma Inventory Self Report Short Form; psychiatric diagnoses were generated by the Composite International Diagnostic Interview administered in a cohort of 2303 pregnant women. Linear regression and structural equation modeling bootstrapping approaches tested for multiple mediators. Results Each additional ACE decreased birth weight by 16.33 g and decreased gestational age by 0.063. Smoking was the strongest mediator of the effect on gestational age. Conclusions ACEs have an enduring effect on maternal reproductive health, as manifested by mothers’ delivery of offspring that were of reduced birth weight and shorter gestational age. PMID:26762511
Concept Innateness, Concept Continuity, and Bootstrapping
Carey, Susan
2011-01-01
The commentators raised issues relevant to all three important theses of The Origin of Concepts (TOOC). Some questioned the very existence of innate representational primitives, and others questioned my claims about their richness and whether they should be thought of as concepts. Some questioned the existence of conceptual discontinuity in the course of knowledge acquisition and others argued that discontinuity is much more common than portrayed in TOOC. Some raised issues with my characterization of Quinian bootstrapping, and others questioned the dual factor theory of concepts motivated by my picture of conceptual development. PMID:23264705
Crossing symmetry in alpha space
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; van Rees, Balt C.
2017-11-01
We initiate the study of the conformal bootstrap using Sturm-Liouville theory, specializing to four-point functions in one-dimensional CFTs. We do so by decomposing conformal correlators using a basis of eigenfunctions of the Casimir which are labeled by a complex number α. This leads to a systematic method for computing conformal block decompositions. Analyzing bootstrap equations in alpha space turns crossing symmetry into an eigenvalue problem for an integral operator K. The operator K is closely related to the Wilson transform, and some of its eigenfunctions can be found in closed form.
Direct measurement of fast transients by using boot-strapped waveform averaging
NASA Astrophysics Data System (ADS)
Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung
2018-03-01
An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and the signal-to-noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime of Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with known values.
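The underlying folding-and-averaging idea, set aside the digital-cavity trigger condition, can be sketched as binning samples by phase within the known repetition period: many repetitions build up one finely sampled, noise-averaged waveform. Names and parameters below are illustrative:

```python
import math

def folded_average(samples, fs, period, n_bins):
    # Fold a repetitive signal sampled at rate fs into one period.
    # Each sample lands in a phase bin; averaging over repetitions raises
    # the effective sampling rate (n_bins per period) and the SNR together.
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for k, v in enumerate(samples):
        phase = (k / fs) % period               # time within the repetition
        b = int(phase / period * n_bins) % n_bins
        sums[b] += v
        counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]
```

Because the sampling clock and the signal period are incommensurate in general, successive repetitions populate different phase bins, which is what boosts the effective sampling rate above fs.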
NASA Astrophysics Data System (ADS)
Stroeve, Julienne C.; Jenouvrier, Stephanie; Campbell, G. Garrett; Barbraud, Christophe; Delord, Karine
2016-08-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent as well as seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the distribution of MIZ, consolidated pack ice and coastal polynyas in the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent record for assessing the proportion of the sea ice cover that is covered by each of these ice categories. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, and applies the same thresholds to the sea ice concentrations to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the seasonal cycle in the MIZ and pack ice is generally similar between both algorithms, yet the NASA Team algorithm has on average twice the MIZ area and half the consolidated pack ice area of the Bootstrap algorithm. Trends also differ, with the Bootstrap algorithm suggesting statistically significant trends towards increased pack ice area and no statistically significant trends in the MIZ. The NASA Team algorithm on the other hand indicates statistically significant positive trends in the MIZ during spring. Potential coastal polynya area and amount of broken ice within the consolidated ice pack are also larger in the NASA Team algorithm. The timing of maximum polynya area may differ by as much as 5 months between algorithms.
These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
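Applying fixed concentration thresholds to classify a concentration field into open water, MIZ and consolidated pack ice is straightforward; the 15% and 80% cut-offs below are the commonly used values in the sea ice literature, shown here only as illustrative assumptions rather than the study's exact choices:

```python
def classify_ice(conc, miz_lo=0.15, pack_thresh=0.80):
    # Threshold a sea ice concentration field (fractions in [0, 1]):
    #   conc <  miz_lo      -> open water
    #   miz_lo <= conc < pack_thresh -> marginal ice zone (MIZ)
    #   conc >= pack_thresh -> consolidated pack ice
    labels = []
    for c in conc:
        if c < miz_lo:
            labels.append("open")
        elif c < pack_thresh:
            labels.append("miz")
        else:
            labels.append("pack")
    return labels
```

Since the two algorithms retrieve different concentrations from the same brightness temperatures, identical thresholds can still yield very different MIZ/pack partitions, which is the study's central point.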
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, J.; Tolson, B.
2017-12-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA.
The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an efficient way. The appealing feature of this new technique is the necessity of no further model evaluation and therefore enables checking of already processed sensitivity results. This is one step towards reliable and transferable, published sensitivity results.
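The bootstrap baseline that MVA avoids can be sketched as resampling already computed elementary effects to get a confidence interval on a sensitivity index such as the Morris mu* (mean absolute elementary effect). No further model runs are needed, but the resampling itself can be costly for large outputs; all names here are illustrative:

```python
import random
import statistics

def bootstrap_mu_star(effects, n_boot=1000, alpha=0.05, seed=0):
    # Percentile-bootstrap CI for mu* = mean(|elementary effects|).
    # Narrow intervals suggest the SA has converged for this parameter;
    # wide intervals suggest more model runs are needed.
    rng = random.Random(seed)
    n = len(effects)
    stats = []
    for _ in range(n_boot):
        res = [effects[rng.randrange(n)] for _ in range(n)]
        stats.append(statistics.fmean(abs(e) for e in res))
    stats.sort()
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```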
Development of welding emission factors for Cr and Cr(VI) with a confidence level.
Serageldin, Mohamed; Reeves, David W
2009-05-01
Knowledge of the emission rate and release characteristics is necessary for estimating pollutant fate and transport. Because emission measurements at a facility's fence line are generally not readily available, environmental agencies in many countries are using emission factors (EFs) to indicate the quantity of certain pollutants released into the atmosphere from operations such as welding. The amount of fumes and metals generated from a welding process is dependent on many parameters, such as electrode composition, voltage, and current. Because test reports on fume generation provide different levels of detail, a common approach was used to give a test report a quality rating on the basis of several highly subjective criteria; however, weighted average EFs generated in this way are not meant to reflect data precision or to be used for a refined risk analysis. The 95% upper confidence limit (UCL) of the unknown population mean was used in this study to account for the uncertainty in the EF test data. Several parametric UCLs were computed and compared for multiple welding EFs associated with several mild, stainless, and alloy steels. Also, several nonparametric statistical methods, including several bootstrap procedures, were used to compute 95% UCLs. For the nonparametric methods, a distribution for calculating the mean, standard deviation, and other statistical parameters for a dataset does not need to be assumed. There were instances when the sample size was small and instances when EFs for an electrode/process combination were not found. Those two points are addressed in this paper. Finally, this paper is an attempt to deal with the uncertainty in the value of a mean EF for an electrode/process combination that is based on test data from several laboratories. Welding EFs developed with a defined level of confidence may be used as input parameters for risk assessment.
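One of the nonparametric procedure families the paper compares, a percentile-bootstrap 95% upper confidence limit (UCL) of a mean emission factor, can be sketched as follows; it assumes no distribution for the data, which is the appeal noted in the abstract:

```python
import random
import statistics

def ucl95_bootstrap(data, n_boot=2000, seed=0):
    # Nonparametric percentile bootstrap: resample the emission-factor
    # test data with replacement, collect the resampled means, and take
    # the 95th percentile as the UCL of the unknown population mean.
    rng = random.Random(seed)
    n = len(data)
    means = []
    for _ in range(n_boot):
        means.append(statistics.fmean(data[rng.randrange(n)] for _ in range(n)))
    means.sort()
    return means[int(0.95 * n_boot)]
```

For small samples, as the abstract notes, such bootstrap UCLs can be unstable, which is why several bootstrap variants and parametric UCLs are compared.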
Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas
2016-05-01
To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using a multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in Euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative (including surgical and anaesthetic time, overhead, disposable materials) and postoperative costs [including ward stay, high dependency unit (HDU) or intensive care unit (ICU) and variable costs associated with management of complications] were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total costs after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) <60% predicted value (P = 0.02, bootstrap 63%) and chronic obstructive pulmonary disease (COPD; P = 0.035, bootstrap 57%). The following model was developed to estimate the total costs: 10 523 + 1894 × COPD + 2376 × DLCO < 60%. The comparison between predicted and observed costs was repeated in 1000 bootstrapped samples to verify the stability of the model. The two values were not different (P > 0.05) in 86% of the samples. A hypothetical patient with COPD and DLCO less than 60% would cost €4270 more than a patient without COPD and with higher DLCO values (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. 
This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
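The reported bootstrap percentages above (e.g. "bootstrap 63%") express how often a variable survives selection across resamples of the patients. A simplified illustration of that stability idea, using a plain mean-cost-difference check rather than the paper's multivariable linear regression; all names are hypothetical:

```python
import random
import statistics

def bootstrap_stability(costs, flag, n_boot=1000, seed=0):
    # Fraction of bootstrap resamples in which patients with a risk
    # factor (e.g. COPD) have higher mean cost than those without.
    # The paper's criterion is retention in a multivariable model;
    # this mean-difference proxy only conveys the resampling logic.
    rng = random.Random(seed)
    n = len(costs)
    hits = 0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        with_f = [costs[i] for i in idx if flag[i]]
        without = [costs[i] for i in idx if not flag[i]]
        if with_f and without and statistics.fmean(with_f) > statistics.fmean(without):
            hits += 1
    return hits / n_boot
```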
Roche, Anne I; Kroska, Emily B; Miller, Michelle L; Kroska, Sydney K; O'Hara, Michael W
2018-03-22
Childhood trauma is associated with a variety of risky, unhealthy, or problem behaviors. The current study aimed to explore experiential avoidance and mindfulness processes as mechanisms through which childhood trauma and problem behavior are linked in a college sample. The sample consisted of college-aged young adults recruited November-December, 2016 (N = 414). Participants completed self-report measures of childhood trauma, current problem behavior, experiential avoidance, and mindfulness processes. Bootstrapped mediation analyses examined the mechanistic associations of interest. Mediation analyses indicated that experiential avoidance was a significant mediator of the association between childhood trauma and problem behavior. Additionally, multiple mediation analyses indicated that specific mindfulness facets-act with awareness and nonjudgment of inner experience-significantly mediated the same association. Interventions for college students who have experienced childhood trauma might profitably target mechanisms such as avoidance and mindfulness in order to minimize engagement in problem behavior.
Anomalous dimensions of spinning operators from conformal symmetry
NASA Astrophysics Data System (ADS)
Gliozzi, Ferdinando
2018-01-01
We compute, to the first non-trivial order in the ɛ-expansion of a perturbed scalar field theory, the anomalous dimensions of an infinite class of primary operators with arbitrary spin ℓ = 0, 1, . . . , including as a particular case the weakly broken higher-spin currents, using only constraints from conformal symmetry. Following the bootstrap philosophy, no reference is made to any Lagrangian, equations of motion or coupling constants. Even the space dimensions d are left free. The interaction is implicitly turned on through the local operators by letting them acquire anomalous dimensions. When matching certain four-point and five-point functions with the corresponding quantities of the free field theory in the ɛ → 0 limit, no free parameter remains. It turns out that only the expected discrete d values are permitted and the ensuing anomalous dimensions reproduce known results for the weakly broken higher-spin currents and provide new results for the other spinning operators.
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
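The robust location estimator behind the Welch-type and modified parametric bootstrap procedures mentioned above is the trimmed mean, which discards a fixed proportion of observations from each tail before averaging. A minimal sketch with the conventional 20% trimming:

```python
def trimmed_mean(xs, prop=0.2):
    # Drop the lowest and highest `prop` fraction of observations,
    # then average the remaining middle portion. Outliers in either
    # tail (a common source of non-normality) no longer distort the
    # location estimate.
    xs = sorted(xs)
    g = int(prop * len(xs))  # number trimmed from each tail
    core = xs[g:len(xs) - g]
    return sum(core) / len(core)
```

Pairing trimmed means with Winsorized variances, as in the Welch test with robust estimators, is what yields the favourable Type I error control reported here.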
Machine vs. human translation of SNOMED CT terms.
Schulz, Stefan; Bernhardt-Melischnig, Johannes; Kreuzthaler, Markus; Daumke, Philipp; Boeker, Martin
2013-01-01
In the context of past and current SNOMED CT translation projects we compare three kinds of SNOMED CT translations from English to German by: (t1) professional medical translators; (t2) a free Web-based machine translation service; (t3) medical students. 500 SNOMED CT fully specified names from the (English) International release were randomly selected. Based on this, German translations t1, t2, and t3 were generated. A German and an Austrian physician rated the translations for linguistic correctness and content fidelity. Kappa for inter-rater reliability was 0.4 for linguistic correctness and 0.23 for content fidelity. Average ratings of linguistic correctness did not differ significantly between human translation scenarios. Content fidelity was rated slightly better for student translators compared to professional translators. Comparing machine to human translation, the linguistic correctness differed about 0.5 scale units in favour of the human translation and about 0.25 regarding content fidelity, equally in favour of the human translation. The results demonstrate that low-cost translation solutions of medical terms may produce surprisingly good results. Although we would not recommend low-cost translation for producing standardized preferred terms, this approach can be useful for creating additional language-specific entry terms. This may serve several important use cases. We also recommend testing this method to bootstrap a crowdsourcing process, by which term translations are gathered, improved, maintained, and rated by the user community.
Frey, H Christopher; Zhao, Yuchao
2004-11-15
Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using MLE estimators for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges are as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
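The parametric bootstrap step for a fully detected data set can be sketched as fitting a variability distribution by MLE and repeatedly simulating synthetic data sets from it to quantify uncertainty in the mean; the lognormal choice and all names below are illustrative assumptions (the paper's censored-data variant additionally randomizes values below detection limits):

```python
import math
import random
import statistics

def parametric_bootstrap_mean_ci(data, n_boot=2000, alpha=0.05, seed=0):
    # Fit a lognormal to inter-unit variability by MLE (mean and
    # population SD of the log data), then simulate n_boot synthetic
    # samples of the same size and take percentile limits of their means.
    logs = [math.log(x) for x in data]
    mu = statistics.fmean(logs)
    sigma = statistics.pstdev(logs)  # MLE scale for the lognormal
    rng = random.Random(seed)
    n = len(data)
    means = []
    for _ in range(n_boot):
        sample = [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]
        means.append(statistics.fmean(sample))
    means.sort()
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```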
Bootstrapping non-commutative gauge theories from L∞ algebras
NASA Astrophysics Data System (ADS)
Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter
2018-05-01
Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, that governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on its appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to the second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is discussed, as well.
A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon
Drummond, J. M.; Papathanasiou, G.; Spradlin, M.
2015-03-16
Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4, 7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7 ∥ 6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n > 6.
NASA Technical Reports Server (NTRS)
Yoshikawa, H. H.; Madison, I. B.
1971-01-01
This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.
Dmitriev, Egor V; Khomenko, Georges; Chami, Malik; Sokolov, Anton A; Churilova, Tatyana Y; Korotaev, Gennady K
2009-03-01
The absorption of sunlight by oceanic constituents significantly contributes to the spectral distribution of the water-leaving radiance. Here it is shown that current parameterizations of absorption coefficients do not apply to the optically complex waters of the Crimea Peninsula. Based on in situ measurements, parameterizations of phytoplankton, nonalgal, and total particulate absorption coefficients are proposed. Their performance is evaluated using a log-log regression combined with a low-pass filter and the nonlinear least-square method. Statistical significance of the estimated parameters is verified using the bootstrap method. The parameterizations are relevant for chlorophyll a concentrations ranging from 0.45 up to 2 mg/m(3).
Use of high order, periodic orbits in the PIES code
NASA Astrophysics Data System (ADS)
Monticello, Donald; Reiman, Allan
2010-11-01
We have implemented a version of the PIES code (Princeton Iterative Equilibrium Solver; A. Reiman et al 2007 Nucl. Fusion 47 572) that uses high order periodic orbits to select the surfaces on which straight magnetic field line coordinates will be calculated. The use of high order periodic orbits has increased the robustness and speed of the PIES code. We now have a more uniform treatment of in-phase and out-of-phase islands. This new version has better convergence properties and works well with a full Newton scheme. We now have the ability to shrink islands using a bootstrap-like current, and this includes the m=1 island in tokamaks.
Exploration of high harmonic fast wave heating on the National Spherical Torus Experiment
NASA Astrophysics Data System (ADS)
Wilson, J. R.; Bell, R. E.; Bernabei, S.; Bitter, M.; Bonoli, P.; Gates, D.; Hosea, J.; LeBlanc, B.; Mau, T. K.; Medley, S.; Menard, J.; Mueller, D.; Ono, M.; Phillips, C. K.; Pinsker, R. I.; Raman, R.; Rosenberg, A.; Ryan, P.; Sabbagh, S.; Stutman, D.; Swain, D.; Takase, Y.; Wilgen, J.
2003-05-01
High harmonic fast wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [M. Ono, S. M. Kaye, S. Neumeyer et al., in Proceedings of the 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. An rf heating system has been installed on the NSTX to explore the physics of HHFW heating, current drive via rf waves and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode discharges with a large fraction of bootstrap current and to control the plasma current profile during the early stages of the discharge.
Improvement of Current Drive Efficiency in Projected FNSF Discharges
NASA Astrophysics Data System (ADS)
Prater, R.; Chan, V.; Garofalo, A.
2012-10-01
The Fusion Nuclear Science Facility - Advanced Tokamak (FNSF-AT) is envisioned as a facility that uses the tokamak approach to address the development of the AT path to fusion and fusion's energy objectives. It uses copper coils for a compact device with high βN and moderate power gain. The major radius is 2.7 m and central toroidal field is 5.44 T. Achieving the required confinement and stability at βN˜3.7 requires a current profile with negative central shear and qmin>1. Off-axis Electron Cyclotron Current Drive (ECCD), in addition to high bootstrap current fraction, can help support this current profile. Using the applied EC frequency and launch location as free parameters, a systematic study has been carried out to optimize the ECCD in the range ρ= 0.5-0.7. Using a top launch, making use of a large toroidal component to the launch direction, adjusting the vertical launch angle so that the rays propagate nearly parallel to the resonance, and adjusting the frequency for optimum total current give a high dimensionless efficiency of 0.44 for a broad ECCD profile peaked at ρ=0.7, and the driven current is 17 kA/MW for n20= 2.1 and Te= 10.3 keV locally.
Creation of second order magnetic barrier inside chaos created by NTMs in the ASDEX UG
NASA Astrophysics Data System (ADS)
Ali, Halima; Punjabi, Alkesh
2012-10-01
Understanding and stabilization of neoclassical tearing modes (NTMs) in tokamaks is an important problem. For low temperature plasmas, tearing modes are believed to be mainly driven by the current density gradient. For collisionless plasmas, even when the plasma is stable to classical tearing modes, helical reduction in bootstrap current in the O-point of an island can destabilize NTMs when an initial island is seeded by other global MHD instabilities or when microturbulence triggers the transition from a linear to nonlinear instability. The onset of NTMs leads to the most serious beta limit in the ASDEX UG tokamak [O. Gubner et al 2005 NF 39 1321]. The important NTMs in the ASDEX UG are (m,n)=(3,2)+(4,3)+(1,1). Realistic parameterization of these NTMs and the safety factor in ASDEX UG are given in [O. Dumbrajs et al 2005 POP 12 1107004]. We use a symplectic map in magnetic coordinates for the ASDEX UG to integrate field lines in the presence of the NTMs. We add a second order control term [H. Ali and A. Punjabi 2007 PPCF 49 1565] to this ASDEX UG field line Hamiltonian to create an invariant magnetic surface inside the chaos generated by the NTMs. The relative strength, robustness, and resilience of this barrier are studied to ascertain the most desirable noble barrier in the ASDEX UG with NTMs. We present preliminary results of this work, and discuss its implications with regard to magnetic transport barriers for increasing strength of magnetic perturbations. This work is supported by the grants DE-FG02-01ER54624 and DE-FG02-04ER54793.
NASA Astrophysics Data System (ADS)
Hasegawa, Chika; Nakayama, Yu
2018-03-01
In this paper, we compute the two-point function of the lowest dimensional scalar operator in the critical ϕ4 theory on 4 − 𝜖 dimensional real projective space by three different methods. The first uses conventional perturbation theory, the second imposes the cross-cap bootstrap equation, and the third solves the Schwinger-Dyson equation under the assumption of conformal invariance. We find that the three methods lead to mutually consistent results, but each has its own advantage.
On critical exponents without Feynman diagrams
NASA Astrophysics Data System (ADS)
Sen, Kallol; Sinha, Aninda
2016-11-01
In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend Polyakov's pioneering 1974 work, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4 − ɛ dimensions up to O(ɛ²). AS dedicates this work to the loving memory of his mother.
Iliesiu, Luca; Kos, Filip; Poland, David; ...
2016-03-17
We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. Finally, we also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
Reliability of reservoir firm yield determined from the historical drought of record
Archfield, S.A.; Vogel, R.M.
2005-01-01
The firm yield of a reservoir is typically defined as the maximum yield that could have been delivered without failure during the historical drought of record. In the future, reservoirs will experience droughts that are either more or less severe than the historical drought of record. The question addressed here is what the reliability of such systems will be when operated at the firm yield. To address this question, we examine the reliability of 25 hypothetical reservoirs sited across five locations in the central and western United States. These locations provided a continuous 756-month streamflow record spanning the same time interval. The firm yield of each reservoir was estimated from the historical drought of record at each location. To determine the steady-state monthly reliability of each firm-yield estimate, 12,000-month synthetic records were generated using the moving-blocks bootstrap method. Bootstrapping was repeated 100 times for each reservoir to obtain an average steady-state monthly reliability R, the number of months the reservoir did not fail divided by the total months. Values of R were greater than 0.99 for 60 percent of the study reservoirs; the other 40 percent ranged from 0.95 to 0.98. Estimates of R were highly correlated with both the level of development (ratio of firm yield to average streamflow) and average lag-1 monthly autocorrelation. Together these two predictors explained 92 percent of the variability in R, with the level of development alone explaining 85 percent of the variability. Copyright ASCE 2005.
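The moving-blocks bootstrap used above is straightforward to sketch: overlapping blocks of the historical record are resampled with replacement to build a long synthetic record that preserves short-range serial correlation, and reliability is then the fraction of months the reservoir meets its yield. The streamflow series, block length, and reservoir parameters below are invented placeholders, not the study's data:

```python
import random

def moving_blocks_bootstrap(series, block_len, out_len, rng=random):
    """Resample overlapping blocks of a time series to build a synthetic record."""
    blocks = [series[i:i + block_len] for i in range(len(series) - block_len + 1)]
    out = []
    while len(out) < out_len:
        out.extend(rng.choice(blocks))
    return out[:out_len]

def monthly_reliability(inflow, yield_per_month, capacity):
    """Fraction of months a reservoir meets a fixed yield (simple mass balance)."""
    storage, ok = capacity, 0
    for q in inflow:
        storage = min(capacity, storage + q) - yield_per_month
        if storage >= 0:
            ok += 1
        else:
            storage = 0  # failure month: reservoir empties
    return ok / len(inflow)

random.seed(1)
history = [random.gauss(10, 3) for _ in range(756)]  # stand-in for a 756-month record
synthetic = moving_blocks_bootstrap(history, block_len=24, out_len=12000)
R = monthly_reliability(synthetic, yield_per_month=8.0, capacity=60.0)
```

Repeating the resampling (the study uses 100 repetitions) and averaging R gives the steady-state monthly reliability estimate.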
Yu, Ling-Yuan; Chen, Zhen-Zhen; Zheng, Fang-Qiang; Shi, Ai-Ju; Guo, Ting-Ting; Yeh, Bao-Hua; Chi, Hsin; Xu, Yong-Yu
2013-02-01
The life table of the green lacewing, Chrysopa pallens (Rambur), was studied at 22 °C, a photoperiod of 15:9 (L:D) h, and 80% relative humidity in the laboratory. The raw data were analyzed using the age-stage, two-sex life table. The intrinsic rate of increase (r), the finite rate of increase (λ), the net reproduction rate (R0), and the mean generation time (T) of Ch. pallens were 0.1258 d⁻¹, 1.1340 d⁻¹, 241.4 offspring and 43.6 d, respectively. For the estimation of the means, variances, and SEs of the population parameters, we compared the jackknife and bootstrap techniques. Although similar values of the means and SEs were obtained with both techniques, significant differences were observed in the frequency distribution and variances of all parameters. The jackknife technique will result in a zero net reproductive rate upon the omission of a male, an immature death, or a nonreproductive female. This result represents, however, a contradiction because an intrinsic rate of increase exists in this situation. Therefore, we suggest that the jackknife technique should not be used for the estimation of population parameters. In predator-prey interactions, the nonpredatory egg and pupal stages of the predator are time refuges for the prey, and the pest population can grow during these times. In this study, a population projection based on the age-stage, two-sex life table is used to determine the optimal interval between releases to fill the predation gaps and maintain the predatory capacity of the control agent.
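The jackknife/bootstrap contrast described above can be illustrated on a toy statistic. The leave-one-out jackknife produces exactly n pseudo-estimates (so omitting a single non-reproducing individual can drive a replicate's net reproductive rate to zero), while the bootstrap draws many resamples with replacement. The offspring counts below are invented, not the Ch. pallens data:

```python
import random
import statistics

def jackknife_se(data, stat):
    """Leave-one-out jackknife mean and standard error of a statistic."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    m = statistics.fmean(loo)
    var = (n - 1) / n * sum((v - m) ** 2 for v in loo)
    return m, var ** 0.5

def bootstrap_se(data, stat, B=2000, rng=random):
    """Bootstrap mean and standard error from B resamples with replacement."""
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(B)]
    return statistics.fmean(reps), statistics.stdev(reps)

# Invented offspring counts: zeros are males or immature deaths,
# positive values are reproductive females.
offspring = [0, 0, 0, 120, 250, 310, 0, 180, 0, 275]
R0 = statistics.fmean  # net reproductive rate as per-individual mean offspring

random.seed(2)
jk_mean, jk_se = jackknife_se(offspring, R0)
bs_mean, bs_se = bootstrap_se(offspring, R0)
```

Even when the two techniques agree on means and SEs, as here, the frequency distributions of the replicates differ, which is the paper's point.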
Uddameri, Venkatesh; Singaraju, Sreeram; Hernandez, E Annette
2018-02-21
Seasonal and cyclic trends in nutrient concentrations at four agricultural drainage ditches were assessed using a dataset generated from a multivariate, multiscale, multiyear water quality monitoring effort in the agriculturally dominant Lower Rio Grande Valley (LRGV) River Watershed in South Texas. An innovative bootstrap sampling-based power analysis procedure was developed to evaluate the ability of Mann-Whitney and Noether tests to discern trends and to guide future monitoring efforts. The Mann-Whitney U test was able to detect significant changes between summer and winter nutrient concentrations at sites with lower depths and unimpeded flows. Pollutant dilution, non-agricultural loadings, and in-channel flow structures (weirs) masked the effects of seasonality. The detection of cyclical trends using the Noether test was highest in the presence of vegetation mainly for total phosphorus and oxidized nitrogen (nitrite + nitrate) compared to dissolved phosphorus and reduced nitrogen (total Kjeldahl nitrogen-TKN). Prospective power analysis indicated that while increased monitoring can lead to higher statistical power, the effect size (i.e., the total number of trend sequences within a time-series) had a greater influence on the Noether test. Both Mann-Whitney and Noether tests provide complementary information on seasonal and cyclic behavior of pollutant concentrations and are affected by different processes. The results from these statistical tests when evaluated in the context of flow, vegetation, and in-channel hydraulic alterations can help guide future data collection and monitoring efforts. The study highlights the need for long-term monitoring of agricultural drainage ditches to properly discern seasonal and cyclical trends.
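A bootstrap sampling-based power analysis of the kind described can be sketched as follows: resample the observed seasonal data with replacement at a candidate monitoring intensity and count how often the test rejects. The Mann-Whitney test here uses the plain normal approximation without tie correction, and the concentration values are invented, not the LRGV data:

```python
import math
import random

def mann_whitney_z(x, y):
    """Normal-approximation z for the Mann-Whitney U statistic (no tie correction)."""
    u = sum((xi > yi) + 0.5 * (xi == yi) for xi in x for yi in y)
    m, n = len(x), len(y)
    mu, sd = m * n / 2, math.sqrt(m * n * (m + n + 1) / 12)
    return (u - mu) / sd

def bootstrap_power(summer, winter, n_per_season, B=1000, z_crit=1.96, rng=random):
    """Fraction of resampled monitoring campaigns in which the test rejects."""
    hits = 0
    for _ in range(B):
        x = [rng.choice(summer) for _ in range(n_per_season)]
        y = [rng.choice(winter) for _ in range(n_per_season)]
        hits += abs(mann_whitney_z(x, y)) > z_crit
    return hits / B

random.seed(3)
summer = [random.gauss(1.8, 0.5) for _ in range(40)]  # hypothetical TKN, mg/L
winter = [random.gauss(1.2, 0.5) for _ in range(40)]
power = bootstrap_power(summer, winter, n_per_season=20)
```

Sweeping n_per_season shows how much extra monitoring buys additional power, which is how such curves can guide future data collection.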
Teixeira, Andreia Sofia; Monteiro, Pedro T; Carriço, João A; Ramirez, Mário; Francisco, Alexandre P
2015-01-01
Trees, including minimum spanning trees (MSTs), are commonly used in phylogenetic studies. But, for the research community, it may be unclear that the presented tree is just a hypothesis, chosen from among many possible alternatives. In this scenario, it is important to quantify our confidence in both the trees and the branches/edges included in such trees. In this paper, we address this problem for MSTs by introducing a new edge betweenness metric for undirected and weighted graphs. This spanning edge betweenness metric is defined as the fraction of equivalent MSTs where a given edge is present. The metric provides a per edge statistic that is similar to that of the bootstrap approach frequently used in phylogenetics to support the grouping of taxa. We provide methods for the exact computation of this metric based on the well known Kirchhoff's matrix tree theorem. Moreover, we implement and make available a module for the PHYLOViZ software and evaluate the proposed metric concerning both effectiveness and computational performance. Analysis of trees generated using multilocus sequence typing data (MLST) and the goeBURST algorithm revealed that the space of possible MSTs in real data sets is extremely large. Selection of the edge to be represented using bootstrap could lead to unreliable results since alternative edges are present in the same fraction of equivalent MSTs. The choice of the MST to be presented, results from criteria implemented in the algorithm that must be based in biologically plausible models.
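The spanning edge betweenness metric can be illustrated by brute force on a toy graph; the paper's exact method via Kirchhoff's matrix-tree theorem scales to real data sets, but enumeration makes the definition concrete. The graph below is invented, chosen with tied weights so several equivalent MSTs exist:

```python
from itertools import combinations

def is_spanning_tree(n, edges):
    """Union-find check that the edges connect all n vertices without a cycle."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v, _ in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # cycle
        parent[ru] = rv
    return True  # n-1 acyclic edges on n vertices form a spanning tree

def spanning_edge_betweenness(n, edges):
    """Fraction of minimum spanning trees that contain each edge."""
    trees = [t for t in combinations(edges, n - 1) if is_spanning_tree(n, t)]
    w_min = min(sum(w for *_, w in t) for t in trees)
    msts = [t for t in trees if sum(w for *_, w in t) == w_min]
    return {e: sum(e in t for t in msts) / len(msts) for e in edges}

# Triangle of equal-weight edges plus a bridge: three equivalent MSTs.
edges = [(0, 1, 1), (1, 2, 1), (0, 2, 1), (2, 3, 2)]
seb = spanning_edge_betweenness(4, edges)
```

Here the bridge (2, 3) appears in every MST (betweenness 1.0), while each triangle edge appears in only 2 of the 3 equivalent MSTs, exactly the ambiguity the metric is designed to expose.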
Cost-effectiveness of Collaborative Care for Depression in Human Immunodeficiency Virus Clinics
Fortney, John C; Gifford, Allen L; Rimland, David; Monson, Thomas; Rodriguez-Barradas, Maria C.; Pyne, Jeffrey M
2015-01-01
Objective To examine the cost-effectiveness of the HITIDES intervention. Design Randomized controlled effectiveness and implementation trial comparing depression collaborative care with enhanced usual care. Setting Three Veterans Health Administration (VHA) HIV clinics in the Southern US. Subjects 249 HIV-infected patients completed the baseline interview; 123 were randomized to the intervention and 126 to usual care. Intervention HITIDES consisted of an off-site HIV depression care team that delivered up to 12 months of collaborative care. The intervention used a stepped-care model for depression treatment and specific recommendations were based on the Texas Medication Algorithm Project and the VA/Department of Defense Depression Treatment Guidelines. Main outcome measure(s) Quality-adjusted life years (QALYs) were calculated using the 12-Item Short Form Health Survey, the Quality of Well Being Scale, and by converting depression-free days to QALYs. The base case analysis used outpatient, pharmacy, patient, and intervention costs. Cost-effectiveness was calculated using incremental cost effectiveness ratios (ICERs) and net health benefit (NHB). ICER distributions were generated using nonparametric bootstrap with replacement sampling. Results The HITIDES intervention was more effective and cost-saving compared to usual care in 78% of bootstrapped samples. The intervention NHB was positive and therefore deemed cost-effective using an ICER threshold of $50,000/QALY. Conclusions In HIV clinic settings this intervention was more effective and cost-saving compared to usual care. Implementation of off-site depression collaborative care programs in specialty care settings may be a strategy that not only improves outcomes for patients, but also maximizes the efficient use of limited healthcare resources. PMID:26102447
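The nonparametric bootstrap with replacement used to generate the ICER distribution can be sketched as resampling patients within each arm and recording the incremental cost and QALY difference per replicate. All per-patient values below are invented placeholders, not the HITIDES data:

```python
import random
import statistics

def bootstrap_icer_pairs(cost_int, qaly_int, cost_uc, qaly_uc, B=2000, rng=random):
    """Resample patients with replacement in each arm; return (ΔC, ΔE) pairs."""
    pairs = []
    for _ in range(B):
        ci = [rng.choice(cost_int) for _ in cost_int]
        qi = [rng.choice(qaly_int) for _ in qaly_int]
        cu = [rng.choice(cost_uc) for _ in cost_uc]
        qu = [rng.choice(qaly_uc) for _ in qaly_uc]
        pairs.append((statistics.fmean(ci) - statistics.fmean(cu),
                      statistics.fmean(qi) - statistics.fmean(qu)))
    return pairs

random.seed(4)
# Hypothetical per-patient costs ($) and QALYs, intervention vs. usual care.
cost_int = [random.gauss(9000, 2000) for _ in range(123)]
qaly_int = [random.gauss(0.72, 0.10) for _ in range(123)]
cost_uc = [random.gauss(9500, 2000) for _ in range(126)]
qaly_uc = [random.gauss(0.68, 0.10) for _ in range(126)]

pairs = bootstrap_icer_pairs(cost_int, qaly_int, cost_uc, qaly_uc)
# Fraction of replicates where the intervention saves money AND adds QALYs.
dominant = sum(dc < 0 and de > 0 for dc, de in pairs) / len(pairs)
```

The "more effective and cost-saving in 78% of bootstrapped samples" figure in the abstract is exactly this kind of dominance fraction over the replicate cloud.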
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
NASA Astrophysics Data System (ADS)
Gebhardt, Katharina; Knebelsberger, Thomas
2015-09-01
We morphologically analyzed 79 cephalopod specimens from the North and Baltic Seas belonging to 13 separate species. Another 29 specimens showed morphological features of either Alloteuthis media or Alloteuthis subulata or were found to be in between. Reliable identification features to distinguish between A. media and A. subulata are currently not available. The analysis of the DNA barcoding region of the COI gene revealed intraspecific distances (uncorrected p) ranging from 0 to 2.13 % (average 0.1 %) and interspecific distances between 3.31 and 22 % (average 15.52 %). All species formed monophyletic clusters in a neighbor-joining analysis and were supported by bootstrap values of ≥99 %. All COI haplotypes belonging to the 29 Alloteuthis specimens were grouped in one cluster. Neither COI nor 18S rDNA sequences helped to distinguish between the different Alloteuthis morphotypes. For species identification purposes, we recommend the use of COI, as it showed higher bootstrap support of species clusters and less amplification and sequencing failure compared to 18S. Our data strongly support the assumption that the genus Alloteuthis is only represented by a single species, at least in the North Sea. It remained unclear whether this species is A. subulata or A. media. All COI sequences including important metadata were uploaded to the Barcode of Life Data Systems and can be used as reference library for the molecular identification of more than 50 % of the cephalopod fauna known from the North and Baltic Seas.
Benedetto, Umberto; Raja, Shahzad G
2014-11-01
The effectiveness of the routine retrosternal placement of a gentamicin-impregnated collagen sponge (GICS) implant before sternotomy closure is currently a matter of some controversy. We aimed to develop a scoring system to guide decision making for the use of GICS to prevent deep sternal wound infection. Fast backward elimination on predictors, including GICS, was performed using the Lawless and Singhal method. The scoring system was reported as a partial nomogram that can be used to manually obtain predicted individual risk of deep sternal wound infection from the regression model. Bootstrapping validation of the regression models was performed. The final populations consisted of 8750 adult patients undergoing cardiac surgery through full sternotomy during the study period. A total of 329 patients (3.8%) received GICS implant. The overall incidence of deep sternal wound infection was lower among patients who received GICS implant (0.6%) than patients who did not (2.01%) (P=.02). A nomogram to predict the individual risk for deep sternal wound infection was developed that included the use of GICS. Bootstrapping validation confirmed a good discriminative power of the models. The scoring system provides an impartial assessment of the decision-making process for clinicians to establish if GICS implant is effective in reducing the risk for deep sternal wound infection in individual patients undergoing cardiac surgery through full sternotomy. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Qian, J. P.; Garofalo, A. M.; Gong, X. Z.; Ren, Q. L.; Ding, S. Y.; Solomon, W. M.; Xu, G. S.; Grierson, B. A.; Guo, W. F.; Holcomb, C. T.; McClenaghan, J.; McKee, G. R.; Pan, C. K.; Huang, J.; Staebler, G. M.; Wan, B. N.
2017-05-01
Recent EAST/DIII-D joint experiments on the high poloidal beta (β_P) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H98y2 ~ 1.6) to higher plasma current, for lower q95 ⩽ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results (Garofalo et al, IAEA 2014; Gong et al 2014 IAEA Int. Conf. on Fusion Energy). Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high β_P discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at β_N ~ 2.9 and q95 ~ 7. With optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Results reported in this paper suggest that the DIII-D high β_P scenario could be a candidate for ITER steady state operation.
NASA Astrophysics Data System (ADS)
Stroeve, Julienne; Jenouvrier, Stephanie
2016-04-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore mapping their spatial extent, seasonal and interannual variability is essential for understanding how current and future changes in these biological active regions may impact the Antarctic marine ecosystem. Knowledge of the distribution of different ice types to the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent data record for assessing different ice types. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depends strongly on what sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal the NASA Team algorithm has on average twice the MIZ and half the consolidated pack ice area as the Bootstrap algorithm. Polynya area is also larger in the NASA Team algorithm, and the timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
Statistical behavior of ten million experimental detection limits
NASA Astrophysics Data System (ADS)
Voigtman, Edward; Abraham, Kevin T.
2011-02-01
Using a lab-constructed laser-excited fluorimeter, together with bootstrapping methodology, the authors have generated many millions of experimental linear calibration curves for the detection of rhodamine 6G tetrafluoroborate in ethanol solutions. The detection limits computed from them are in excellent agreement with both previously published theory and with comprehensive Monte Carlo computer simulations. Currie decision levels and Currie detection limits, each in the theoretical, chemical content domain, were found to be simply scaled reciprocals of the non-centrality parameter of the non-central t distribution that characterizes univariate linear calibration curves that have homoscedastic, additive Gaussian white noise. Accurate and precise estimates of the theoretical, content domain Currie detection limit for the experimental system, with 5% (each) probabilities of false positives and false negatives, are presented.
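The content-domain Currie quantities have a simple closed form in the special case where the calibration noise is known rather than estimated (the paper's non-central t treatment covers the estimated-sigma case). This sketch uses that known-sigma simplification, with invented calibration numbers:

```python
def currie_limits(slope, sigma_blank, z_alpha=1.645, z_beta=1.645):
    """Content-domain Currie decision level L_C and detection limit L_D for a
    linear calibration with homoscedastic additive Gaussian white noise,
    assuming the noise sigma is KNOWN (a simplification of the paper's setup).
    Default z values give 5% probabilities of false positives and negatives."""
    l_c = z_alpha * sigma_blank / slope              # decision level
    l_d = (z_alpha + z_beta) * sigma_blank / slope   # detection limit
    return l_c, l_d

# Hypothetical fluorimeter calibration: signal = slope * concentration + blank.
l_c, l_d = currie_limits(slope=2.5e4, sigma_blank=0.02)
```

With equal false-positive and false-negative probabilities the detection limit is exactly twice the decision level, and both are scaled reciprocals of the calibration slope, mirroring the reciprocal scaling reported in the abstract.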
Patient satisfaction after pulmonary resection for lung cancer: a multicenter comparative analysis.
Pompili, Cecilia; Brunelli, Alessandro; Rocco, Gaetano; Salvi, Rosario; Xiumé, Francesco; La Rocca, Antonello; Sabbatini, Armando; Martucci, Nicola
2013-01-01
Patient satisfaction reflects the perception of the customer about the level of quality of care received during the episode of hospitalization. To compare the levels of satisfaction of patients submitted to lung resection in two different thoracic surgical units. Prospective analysis of 280 consecutive patients submitted to pulmonary resection for neoplastic disease in two centers (center A: 139 patients; center B: 141 patients; 2009-2010). Patients' satisfaction was assessed at discharge through the EORTC-InPatSat32 module, a 32-item, multi-scale self-administered anonymous questionnaire. Each scale (ranging from 0 to 100 in score) was compared between the two units. Multivariable regression and bootstrap were used to verify factors associated with the patients' general satisfaction (dependent variable). Patients from unit B reported a higher general satisfaction (91.5 vs. 88.3, p = 0.04), mainly due to a significantly higher satisfaction in the doctor-related scales (doctors' technical skill: p = 0.001; doctors' interpersonal skill: p = 0.008; doctors' availability: p = 0.005, and doctors information provision: p = 0.0006). Multivariable regression analysis and bootstrap confirmed that level of care in unit B (p = 0.006, bootstrap frequency 60%) along with lower level of education of the patient population (p = 0.02, bootstrap frequency 62%) were independent factors associated with a higher general patient satisfaction. We were able to show a different level of patient satisfaction in patients operated on in two different thoracic surgery units. A reduced level of patient satisfaction may trigger changes in the management policy of individual units in order to meet patients' expectations and improve organizational efficiency. Copyright © 2012 S. Karger AG, Basel.
Wilcox, Thomas P; Zwickl, Derrick J; Heath, Tracy A; Hillis, David M
2002-11-01
Four New World genera of dwarf boas (Exiliboa, Trachyboa, Tropidophis, and Ungaliophis) have been placed by many systematists in a single group (traditionally called Tropidophiidae). However, the monophyly of this group has been questioned in several studies. Moreover, the overall relationships among basal snake lineages, including the placement of the dwarf boas, are poorly understood. We obtained mtDNA sequence data for 12S, 16S, and intervening tRNA-val genes from 23 species of snakes representing most major snake lineages, including all four genera of New World dwarf boas. We then examined the phylogenetic position of these species by estimating the phylogeny of the basal snakes. Our phylogenetic analysis suggests that New World dwarf boas are not monophyletic. Instead, we find Exiliboa and Ungaliophis to be most closely related to sand boas (Erycinae), boas (Boinae), and advanced snakes (Caenophidea), whereas Tropidophis and Trachyboa form an independent clade that separated relatively early in snake radiation. Our estimate of snake phylogeny differs significantly in other ways from some previous estimates of snake phylogeny. For instance, pythons do not cluster with boas and sand boas, but instead show a strong relationship with Loxocemus and Xenopeltis. Additionally, uropeltids cluster strongly with Cylindrophis, and together are embedded in what has previously been considered the macrostomatan radiation. These relationships are supported by both bootstrapping (parametric and nonparametric approaches) and Bayesian analysis, although Bayesian support values are consistently higher than those obtained from nonparametric bootstrapping. Simulations show that Bayesian support values represent much better estimates of phylogenetic accuracy than do nonparametric bootstrap support values, at least under the conditions of our study. Copyright 2002 Elsevier Science (USA)
Explanation of Two Anomalous Results in Statistical Mediation Analysis.
Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P
2012-01-01
Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M, a, increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y, b, was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a. Implications of these findings are discussed.
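The percentile and bias-corrected bootstrap tests being compared can be sketched for the indirect effect a·b. This is a simplified illustration: b is taken from a simple regression of Y on M rather than the usual partial coefficient controlling for X, and the data are simulated, not the ATLAS data:

```python
import random
import statistics
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse standard normal CDF by bisection (stdlib-only)."""
    for _ in range(80):
        mid = (lo + hi) / 2
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def slope(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

def indirect_effect(data):
    """a*b for the chain X -> M -> Y (b from simple regression: a simplification)."""
    x, m, y = zip(*data)
    return slope(x, m) * slope(m, y)

def bootstrap_cis(data, B=2000, alpha=0.05, rng=random):
    """Percentile and bias-corrected (BC) bootstrap CIs for the indirect effect."""
    theta = indirect_effect(data)
    reps = sorted(indirect_effect([rng.choice(data) for _ in data]) for _ in range(B))
    pct = (reps[int(B * alpha / 2)], reps[int(B * (1 - alpha / 2)) - 1])
    # Bias correction: z0 measures how far the bootstrap distribution sits from theta.
    z0 = norm_ppf(sum(r < theta for r in reps) / B)
    lo = norm_cdf(2 * z0 + norm_ppf(alpha / 2))
    hi = norm_cdf(2 * z0 + norm_ppf(1 - alpha / 2))
    bc = (reps[min(B - 1, int(B * lo))], reps[min(B - 1, int(B * hi))])
    return pct, bc

random.seed(5)
data = []
for _ in range(50):
    x = random.gauss(0, 1)
    m = 0.4 * x + random.gauss(0, 1)  # a = 0.4
    y = 0.3 * m + random.gauss(0, 1)  # b = 0.3
    data.append((x, m, y))
pct_ci, bc_ci = bootstrap_cis(data)
```

The bias-correction shift of the percentile endpoints is what inflates Type I error in the small-sample, one-nonzero-path conditions the simulations identify.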
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required in grant applications and peer-reviewed publications but is also essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample sizes required to achieve 80% power, obtained by simulation under various sizes of the mediation effect, within-subject correlations, and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method, and the bootstrap method. Among the three methods, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by their relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation); a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice are also provided for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrap method have superior performance to Sobel's method; the product method is recommended in practice because it requires less computation time than bootstrapping. An R package has been developed for sample size determination by the product method in longitudinal mediation study design.
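The kind of power simulation underlying such sample size tables can be sketched as follows, here for a single-level mediation model and Sobel's test only (the paper's multilevel longitudinal setting and its bootstrap and product-distribution tests are more involved; all parameter values are illustrative):

```python
import numpy as np
from statistics import NormalDist

def sobel_power(n, a=0.3, b=0.3, n_sims=400, alpha=0.05, seed=0):
    """Monte Carlo estimate of the power of Sobel's test for the mediated
    effect a*b at sample size n (cross-sectional sketch)."""
    rng = np.random.default_rng(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        # a-path: OLS slope of M on X and its standard error
        a_hat = (x @ m) / (x @ x)
        se_a = np.sqrt(np.sum((m - a_hat * x) ** 2) / (n - 2) / (x @ x))
        # b-path: OLS slope of Y on M, controlling for X
        X = np.column_stack([np.ones(n), x, m])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        sigma2 = np.sum((y - X @ beta) ** 2) / (n - 3)
        se_b = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[2, 2])
        # Sobel's first-order standard error for the product a_hat*b_hat
        se_ab = np.sqrt(a_hat**2 * se_b**2 + beta[2]**2 * se_a**2)
        hits += abs(a_hat * beta[2]) / se_ab > z_crit
    return hits / n_sims
```

Tabulating this over a grid of n values and effect sizes, and reading off where power crosses 0.80, is the basic recipe the abstract describes; the multilevel version adds repeated measures per subject and an ICC parameter.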
Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe
2015-07-15
Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. 
All rights reserved.
NASA Astrophysics Data System (ADS)
Pankin, A. Y.; Rafiq, T.; Kritz, A. H.; Park, G. Y.; Snyder, P. B.; Chang, C. S.
2017-06-01
The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. The neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] is used in carrying out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. 
This provides a new mechanism for controlling the pedestal bootstrap current and the pedestal stability.
High-beta, steady-state hybrid scenario on DIII-D
Petty, C. C.; Kinsey, J. E.; Holcomb, C. T.; ...
2015-12-17
Here, the potential of the hybrid scenario (first developed as an advanced inductive scenario for high fluence) as a regime for high-beta, steady-state plasmas is demonstrated on the DIII-D tokamak. These experiments show that the beneficial characteristics of hybrids, namely safety factor ≥1 with low central magnetic shear, high stability limits and excellent confinement, are maintained when strong central current drive (electron cyclotron and neutral beam) is applied to increase the calculated non-inductive fraction to ≈100% (≈50% bootstrap current). The best discharges achieve a normalized beta of 3.4, an IPB98(y,2) confinement factor of 1.4, a surface loop voltage of 0.01 V, and nearly equal electron and ion temperatures at low collisionality. A zero-dimensional physics model shows that steady-state hybrid operation with Qfus ~ 5 is feasible in FDF and ITER. The advantage of the hybrid scenario as an Advanced Tokamak regime is that the external current drive can be deposited near the plasma axis where the efficiency is high; additionally, good alignment between the current drive and plasma current profiles is not necessary as the poloidal magnetic flux pumping self-organizes the current density profile in hybrids with an m/n=3/2 tearing mode.
Why do workaholics experience depression? A study with Chinese University teachers.
Nie, Yingzhi; Sun, Haitao
2016-10-01
This study focuses on the relationships of workaholism to job burnout and depression among university teachers. The direct and indirect (via job burnout) effects of workaholism on depression were investigated in 412 Chinese university teachers. Structural equation modeling and the bootstrap method were used. Results revealed that workaholism, job burnout, and depression were significantly correlated with each other. Structural equation modeling and the bootstrap test indicated that job burnout partially mediates the relationship between workaholism and depression. The findings shed some light on how workaholism influences depression and provide valuable evidence for the prevention of depression at work. © The Author(s) 2015.
Blank, Jos L T; van Hulst, Bart Laurents
2011-10-01
This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on explaining measured cost inefficiency in terms of hospital corporate governance. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that higher remuneration of the board, as well as higher remuneration of the supervisory board, does not imply better performance.
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
NASA Astrophysics Data System (ADS)
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-month) and mid-term (six-month) drought observations. The effects of the North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. The performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from the Konya closed basin in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results reveal that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
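The residual-based bootstrap for prediction intervals can be illustrated with a simple AR(1) working model: fit the model, centre the residuals, and re-draw them to build a predictive distribution for the next observation. This is a generic sketch under that assumed model, not the paper's PDSI model:

```python
import numpy as np

def residual_bootstrap_pi(series, n_boot=1000, coverage=0.95, seed=0):
    """Residual-based bootstrap prediction interval for the next value of a
    series under an AR(1) working model: y_t = c + phi*y_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    y = np.asarray(series, dtype=float)
    lagged, current = y[:-1], y[1:]
    # OLS fit of the AR(1) coefficients
    A = np.column_stack([np.ones(len(lagged)), lagged])
    (c, phi), *_ = np.linalg.lstsq(A, current, rcond=None)
    resid = current - (c + phi * lagged)
    resid = resid - resid.mean()  # centre residuals before resampling
    # one-step-ahead predictive distribution: point forecast + resampled errors
    preds = c + phi * y[-1] + rng.choice(resid, size=n_boot, replace=True)
    lo, hi = np.quantile(preds, [(1 - coverage) / 2, (1 + coverage) / 2])
    return float(lo), float(hi)
```

Multi-step horizons (the three- and six-month cases above) would iterate the recursion, drawing a fresh residual at each step of each bootstrap path.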
Spotorno O, Angel E; Córdova, Luis; Solari I, Aldo
2008-12-01
To identify and characterize Chilean samples of Trypanosoma cruzi and their association with hosts, the first 516 bp of the mitochondrial cytochrome b gene were sequenced from eight biological samples and phylogenetically compared with 20 other known American sequences. The molecular characterization of these 28 sequences in a maximum likelihood phylogram (-lnL = 1255.12, tree length = 180, consistency index = 0.79) allowed the robust identification (bootstrap % > 99) of three previously known discrete typing units (DTU): DTU IIb, IIa, and I. An apparently undescribed new sequence, found in four new Chilean samples, was detected and designated as DTU Ib; these sequences were separated by 24.7 differences from those of DTU I but robustly related to them (bootstrap % = 97 in 500 replicates), sharing 12 substitutions, among which four were nonsynonymous. The new DTU Ib was also robust (bootstrap % = 100) and characterized by 10 unambiguous substitutions, with a single nonsynonymous G to T change at site 409. The fact that two of these new sequences were found in parasites from a Chilean endemic caviomorph rodent, Octodon degus, and that they were closely related to the ancient DTU I, suggests old origins and a long association with caviomorph hosts.
Highton, R
1993-12-01
An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets: Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. The statistical support for the topology at each node of the UPGMA trees, as the number of loci increases, was determined by both the bootstrap and jackknife methods. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes with groupings that reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
Comparison of mode estimation methods and application in molecular clock analysis
NASA Technical Reports Server (NTRS)
Hedges, S. Blair; Shah, Prachi
2003-01-01
BACKGROUND: Distributions of time estimates in molecular clock studies are sometimes skewed or contain outliers. In those cases, the mode is a better estimator of the overall time of divergence than the mean or median. However, different methods are available for estimating the mode. We compared these methods in simulations to determine their strengths and weaknesses and further assessed their performance when applied to real data sets from a molecular clock study. RESULTS: We found that the half-range mode and robust parametric mode methods have a lower bias than other mode methods under a diversity of conditions. However, the half-range mode suffers from a relatively high variance and the robust parametric mode is more susceptible to bias by outliers. We determined that bootstrapping reduces the variance of both mode estimators. Application of the different methods to real data sets yielded results that were concordant with the simulations. CONCLUSION: Because the half-range mode is a simple and fast method, and produced less bias overall in our simulations, we recommend the bootstrapped version of it as a general-purpose mode estimator and suggest a bootstrap method for obtaining the standard error and 95% confidence interval of the mode.
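A hedged sketch of the half-range mode and its bootstrapped standard error, following the general idea of repeatedly halving the interval around the densest region (a simplified reading of the method, not the authors' code):

```python
import numpy as np

def half_range_mode(x):
    """Half-range mode (sketch): repeatedly keep the window of half the
    current range that contains the most points, then average the few
    points that remain."""
    x = np.sort(np.asarray(x, dtype=float))
    while len(x) > 2:
        w = (x[-1] - x[0]) / 2.0
        if w <= 0:
            break
        # number of points falling in each window [x[i], x[i] + w]
        counts = np.searchsorted(x, x + w, side='right') - np.arange(len(x))
        best = int(np.argmax(counts))
        x = x[best:best + counts[best]]
    return float(x.mean())

def bootstrapped_mode(x, n_boot=200, seed=0):
    """Bootstrap the mode estimator to obtain its standard error, as the
    paper recommends for reducing the half-range mode's variance."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    modes = np.array([half_range_mode(rng.choice(x, size=len(x), replace=True))
                      for _ in range(n_boot)])
    return float(modes.mean()), float(modes.std(ddof=1))
```

Because each window of half the range excludes at least one endpoint, the retained set strictly shrinks every iteration, so the loop terminates.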
Bootstrap percolation on spatial networks
NASA Astrophysics Data System (ADS)
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of networked activation processes, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on the spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links' lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, a mix of a second-order phase transition and a hybrid phase transition with two varying critical points; otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value of -1 is equal to or very close to the values found for many real online social networks, including LiveJournal, the HP Labs email network, the Belgian mobile phone network, etc. This work helps us better understand the self-organization of the spatial structure of online social networks in terms of its effective function for information spreading.
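The activation dynamics of bootstrap percolation is simple to state in code: seed nodes start active, and an inactive node activates once at least k of its neighbours are active. A minimal sketch on an adjacency-dict graph (the paper's spatial networks with power-law link lengths only change how the graph is built, not this rule):

```python
from collections import deque

def bootstrap_percolation(adj, seeds, k):
    """Bootstrap percolation on an undirected graph given as an adjacency
    dict: seeds start active; an inactive node activates once at least k of
    its neighbours are active. Returns the final active set."""
    active = set(seeds)
    n_active_nbrs = {v: 0 for v in adj}
    queue = deque(active)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in active:
                continue
            n_active_nbrs[v] += 1
            if n_active_nbrs[v] >= k:
                active.add(v)
                queue.append(v)
    return active
```

The order parameter in the abstract is then the size of the largest connected component within the returned active set, averaged over random seeds and graph realizations.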
2009-01-01
Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected and accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to confidence intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence intervals for ERR are comparable using bootstrap and Likelihood Ratio Test methods, and BCa 95% confidence intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and likelihood-based confidence intervals are broadly comparable, and ERR is strictly positive by bootstrap methods for all 5 cancers.
Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0 - 20 mSv and 5 - 500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
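The BCa interval named above adjusts the percentile method with a bias-correction term z0 and a jackknife-based acceleration a. A generic sketch for an arbitrary statistic (illustrative only; the paper applies such intervals to ERR estimates from Poisson regression, not to a simple mean):

```python
import numpy as np
from statistics import NormalDist

def bca_interval(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Bias-corrected and accelerated (BCa) bootstrap interval for stat(data)."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    n = len(data)
    est = stat(data)
    boots = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    nd = NormalDist()
    # bias correction: how far the bootstrap distribution sits from the estimate
    z0 = nd.inv_cdf((np.sum(boots < est) + 0.5) / (n_boot + 1))
    # acceleration: jackknife skewness of the statistic
    jack = np.array([stat(np.delete(data, i)) for i in range(n)])
    d = jack.mean() - jack
    a = np.sum(d**3) / (6.0 * np.sum(d**2) ** 1.5)

    def adj_level(z):
        # BCa-adjusted quantile level for nominal normal quantile z
        return nd.cdf(z0 + (z0 + z) / (1.0 - a * (z0 + z)))

    lo = np.quantile(boots, adj_level(nd.inv_cdf(alpha / 2)))
    hi = np.quantile(boots, adj_level(nd.inv_cdf(1 - alpha / 2)))
    return float(lo), float(hi)
```

When z0 = a = 0 this reduces exactly to the percentile interval, which is the useful sanity check when comparing the two methods as the abstract does.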
Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh
2014-10-01
Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Suppressing magnetic island growth by resonant magnetic perturbation
NASA Astrophysics Data System (ADS)
Yu, Q.; Günter, S.; Lackner, K.
2018-05-01
The effect of externally applied resonant magnetic perturbations (RMPs) on the growth of magnetic islands is investigated based on two-fluid equations. It is found that if the local bi-normal electron fluid velocity at the resonant surface is sufficiently large, static RMPs of the same helicity and of moderate amplitude can suppress the growth of magnetic islands in high-temperature plasmas. These islands would otherwise grow, driven by an unfavorable plasma current density profile and the bootstrap current perturbation. These results indicate that the error field can stabilize island growth if the error field amplitude is not too large and the local bi-normal electron fluid velocity is not too low. They also indicate that applied rotating RMPs with an appropriate frequency can be utilized to suppress island growth in high-temperature plasmas, even for a low bi-normal electron fluid velocity. A significant change in the local equilibrium plasma current density gradient caused by small-amplitude RMPs is found for realistic plasma parameters; this change is important for island stability and is expected to be even more important for fusion reactors with low plasma resistivity.
Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)
NASA Astrophysics Data System (ADS)
Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.
2017-10-01
Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking into account, in addition to real-time Motional Stark Effect (MSE) and magnetics data, real-time Thomson Scattering (TS) and real-time Charge Exchange Recombination (CER, still in development) data. Electron density and temperature are determined by TS, while ion density and pressure are determined using CER. Together with the temperature and density of neutrals, these form the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostics. The pedestal current density is estimated using Sauter's formula for the bootstrap current density. By comparing the behaviour of the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') from CAKE to those from magnetics-only reconstruction, it can be seen that using these diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.
Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas
NASA Astrophysics Data System (ADS)
Callen, J. D.; Hegna, C. C.; Beidler, M. T.
2017-10-01
Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.
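The threshold character of island growth captured by a modified Rutherford equation can be illustrated with a schematic one-term-per-drive ODE: a stabilizing classical tearing index Δ' competing with a destabilizing bootstrap term that rolls off below a small-island width w_d. All coefficients here are illustrative placeholders, not the comprehensive equation described in the abstract:

```python
def island_width_evolution(w0, delta_prime=-1.0, c_bs=2.0, w_d=0.5,
                           dt=0.01, t_end=20.0, w_min=1e-3):
    """Forward-Euler integration of a schematic modified Rutherford equation:
        dw/dt = delta_prime + c_bs * w / (w**2 + w_d**2)
    With delta_prime < 0 (classically stable) and a bootstrap drive c_bs,
    small seed islands decay while seeds above a threshold grow to a
    saturated width."""
    w = w0
    for _ in range(int(t_end / dt)):
        dwdt = delta_prime + c_bs * w / (w * w + w_d * w_d)
        w = max(w + dt * dwdt, w_min)  # keep the island width positive
    return w
```

With the default coefficients the two fixed points sit near w ≈ 0.13 (unstable threshold) and w ≈ 1.87 (saturation), which is the seed-island threshold behavior that makes NTM onset sensitive to triggering events such as ELM crashes.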
Sevick, Laura K; Santana, Maria-Jose; Ghali, William A; Clement, Fiona
2017-01-01
Objective To complete an economic evaluation within a randomised controlled trial (RCT) comparing the use of an electronic discharge communication tool (eDCT) with usual care. Setting Patients being discharged from a single tertiary care centre's internal medicine Medical Teaching Units. Participants Between January 2012 and December 2013, 1399 patients were randomised to a discharge mechanism. Forty-five patients were excluded from the economic evaluation as they did not have data for the index hospitalisation cost; 1354 patients contributed to the economic evaluation. Intervention An eDCT generated at discharge containing structured content on the reason for admission, details of the hospital stay, treatments received and follow-up care required. The control group was discharged via traditional dictation methods. Primary and secondary outcome measures The primary economic outcome was the cost per quality-adjusted life year (QALY) gained. Secondary outcomes included the cost per death avoided and the cost per readmission avoided. Results The average transcription cost was $C22.28 per patient, whereas the estimated cost of the eDCT was $C13.33 per patient. The cost per QALY gained was $C239 933 in the eDCT arm compared with usual care, due to the very small gains in effectiveness and the approximately $C800 difference in resource utilisation costs. The bootstrap analyses resulted in eDCT being more effective and more costly in 29.2% of samples, less costly and more effective in 29.2% of samples, less effective and more costly in 23.9% of samples and, finally, less costly and less effective in 17.7% of samples. Conclusions The eDCT reduced the per patient cost of generating discharge summaries. The bootstrap estimates demonstrate considerable uncertainty, supporting the finding of neutrality reported in the clinical component of the RCT.
The immediate transcription cost savings and previously documented provider and patient satisfaction may increase the impetus for organisations to invest in such systems, provided they have a foundation of eHealth infrastructure and readiness. Trial registration number NCT01402609. PMID:29247110
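The quadrant percentages reported above come from bootstrapping the joint distribution of incremental cost and effect on the cost-effectiveness plane. A hedged sketch, simplified to paired per-patient differences (the trial itself compared two independent arms, which would resample each arm separately):

```python
import numpy as np

def ce_plane_quadrants(delta_cost, delta_effect, n_boot=2000, seed=0):
    """Bootstrap the mean incremental cost and effect and report the fraction
    of replicates landing in each quadrant of the cost-effectiveness plane."""
    rng = np.random.default_rng(seed)
    delta_cost = np.asarray(delta_cost, dtype=float)
    delta_effect = np.asarray(delta_effect, dtype=float)
    n = len(delta_cost)
    quads = {(sc, se): 0 for sc in ('more costly', 'less costly')
             for se in ('more effective', 'less effective')}
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample patients with replacement
        dc = delta_cost[idx].mean()
        de = delta_effect[idx].mean()
        key = ('more costly' if dc > 0 else 'less costly',
               'more effective' if de > 0 else 'less effective')
        quads[key] += 1
    return {k: v / n_boot for k, v in quads.items()}
```

When the true differences are near zero, the replicates scatter across all four quadrants in roughly comparable fractions, which is exactly the pattern of uncertainty the abstract reports.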
Improved Design of Stellarator Coils for Current Carrying Plasmas
NASA Astrophysics Data System (ADS)
Drevlak, M.; Strumberger, E.; Hirshman, S.; Boozer, A.; Brooks, A.; Valanju, P.
1998-11-01
The method of automatic optimization [P. Merkel, Nucl. Fus. 27 (1987) 867; P. Merkel, M. Drevlak, Proc. 25th EPS Conf. on Cont. Fus. and Plas. Phys., Prague, in print] for the design of stellarator coils consists essentially of determining filaments such that the average relative field error, ∫ dS [(B_coil + B_j) · n]^2 / B_coil^2, is minimized on the prescribed plasma boundary. B_j is the magnetic field produced by the plasma currents of the given finite-β fixed-boundary equilibrium. For equilibria of the W7-X type, B_j can be neglected because of the reduced parallel plasma currents. This is not true for quasi-axisymmetric stellarator (QAS) configurations (A. Reiman et al., to be published) with large equilibrium and net plasma (bootstrap) currents. Although the coils for QAS exhibit low values of the field error, free-boundary calculations indicate that the shape of the plasma is usually not accurately reproduced, particularly when saddle coils are used. We investigate whether the surface reconstruction can be improved by introducing a modified measure of the field error based on a measure of the resonant components of the normal field.
Current/Pressure Profile Effects on Tearing Mode Stability in DIII-D Hybrid Discharges
NASA Astrophysics Data System (ADS)
Kim, K.; Park, J. M.; Murakami, M.; La Haye, R. J.; Na, Yong-Su
2015-11-01
It is important to understand the onset threshold and the evolution of tearing modes (TMs) for developing a high-performance steady-state fusion reactor. As an initial, basic comparison to determine TM onset, the measured plasma profiles (temperature, density, rotation) and the calculated current profiles were compared between a pair of discharges with and without an n=1 mode, based on the database of DIII-D hybrid plasmas. The profiles were not very different, but their details were analyzed to determine their characteristics, especially near the rational surface. The tearing stability index Δ', calculated with PEST3, tends to increase rapidly just before the n=1 mode onset for these cases. Model equilibria with parametrically varied pressure or current profiles, based on the reference discharge, are reconstructed to check the dependence of the onset on Δ' or on neoclassical effects such as the bootstrap current. Simulations of TMs with the modeled equilibria using resistive MHD codes will also be presented and compared with experiments to determine the sensitivity for predicting TM onset. Work supported by US DOE under DE-FC02-04ER54698 and DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Peng, Yueng Kay Martin
Transport theory for potato orbits in the region near the magnetic axis in an axisymmetric torus, such as tokamaks and spherical tori, is extended to the situation where the toroidal flow speed is of the order of the sonic speed, as observed in the National Spherical Torus Experiment [E. J. Synakowski, M. G. Bell, R. E. Bell et al., Nucl. Fusion 43, 1653 (2003)]. It is found that transport fluxes, such as the ion radial heat flux and the bootstrap current density, are modified by a factor of the order of the square of the toroidal Mach number. The consequences of the orbit squeezing are also presented. The theory is developed for parabolic (in radius r) plasma profiles. A method to apply the results of the theory in transport modeling is discussed.
Andres, Fanny; Castanier, Carole; Le Scanff, Christine
2014-02-01
The present study aims to explore the mediating effects of conscientiousness and alexithymia in the relationship between parental attachment style and alcohol use in a large sample of athletic young people. Participants included 434 French sport sciences students. Alcohol use, parental attachment style, conscientiousness and alexithymia were assessed. The hypotheses were tested by using regression and bootstrapping mediation analyses. Maternal insecure attachment style is positively associated with alcohol use. The current study highlights a multiple pathway in this relationship. The results reveal the mediating effect of low conscientiousness and alexithymia between maternal insecure attachment and alcohol use. Athletes' alcohol use seems to be the result of a complex association of underlying psychological factors. © 2013.
Estimation of urban runoff and water quality using remote sensing and artificial intelligence.
Ha, S R; Park, S Y; Park, D H
2003-01-01
Water quality and quantity of runoff are strongly dependent on landuse and landcover (LULC) criteria. In this study, we developed an improved parameter-estimation procedure for an environmental model using remote sensing (RS) and artificial intelligence (AI) techniques. Landsat TM multi-band (7 bands) and Korea Multi-Purpose Satellite (KOMPSAT) panchromatic data were selected for input data processing. We employed two kinds of artificial intelligence techniques, RBF-NN (radial-basis-function neural network) and ANN (artificial neural network), to classify LULC of the study area. A bootstrap resampling method, a statistical technique, was employed to generate the confidence intervals and distribution of the unit load. SWMM was used to simulate the urban runoff and water quality and was applied to the study watershed. The condition of urban flow and non-point contamination was simulated with rainfall-runoff and measured water quality data. The estimated total runoff, peak time, and pollutant generation varied considerably according to the classification accuracy and the percentile unit load applied. The proposed procedure can be applied efficiently to water quality and runoff simulation in a rapidly changing urban area.
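The bootstrap step described above, which attaches confidence intervals to the unit-load estimates, can be sketched as a percentile bootstrap of a mean. The function below is a generic illustration, not the authors' implementation:

```python
import random

def bootstrap_ci(samples, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the mean of `samples`,
    e.g. unit-load estimates derived from the classified LULC map."""
    rng = random.Random(seed)
    n = len(samples)
    # Resample with replacement, compute the mean each time, and sort.
    boot_means = sorted(
        sum(rng.choice(samples) for _ in range(n)) / n for _ in range(n_boot))
    lo = boot_means[int(n_boot * alpha / 2)]
    hi = boot_means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

The sorted replicate means also give the full bootstrap distribution of the unit load, as used in the study.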
Tam, Angela; Dansereau, Christian; Badhwar, AmanPreet; Orban, Pierre; Belleville, Sylvie; Chertkow, Howard; Dagher, Alain; Hanganu, Alexandru; Monchi, Oury; Rosa-Neto, Pedro; Shmuel, Amir; Breitner, John; Bellec, Pierre
2016-12-01
We present group eight resolutions of brain parcellations for clusters generated from resting-state functional magnetic resonance images for 99 cognitively normal elderly persons and 129 patients with mild cognitive impairment, pooled from four independent datasets. This dataset was generated as part of the following study: Common Effects of Amnestic Mild Cognitive Impairment on Resting-State Connectivity Across Four Independent Studies (Tam et al., 2015) [1]. The brain parcellations have been registered to both symmetric and asymmetric MNI brain templates and generated using a method called bootstrap analysis of stable clusters (BASC) (Bellec et al., 2010) [2]. We present two variants of these parcellations. One variant contains bihemisphereic parcels (4, 6, 12, 22, 33, 65, 111, and 208 total parcels across eight resolutions). The second variant contains spatially connected regions of interest (ROIs) that span only one hemisphere (10, 17, 30, 51, 77, 199, and 322 total ROIs across eight resolutions). We also present maps illustrating functional connectivity differences between patients and controls for four regions of interest (striatum, dorsal prefrontal cortex, middle temporal lobe, and medial frontal cortex). The brain parcels and associated statistical maps have been publicly released as 3D volumes, available in .mnc and .nii file formats on figshare and on Neurovault. Finally, the code used to generate this dataset is available on Github.
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, Juliane; Tolson, Bryan
2017-04-01
The increasing complexity and runtime of environmental models mean that calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. At best, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, may itself become computationally expensive for large model outputs and a high number of bootstrap samples. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independence of the convergence test, we applied it to three widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al. 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010) and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence test is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indexes of the aforementioned three methods are known.
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on model independence by testing the frugal method with the hydrologic model mHM (www.ufz.de/mhm), which has about 50 model parameters. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluations are required, which enables checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable published sensitivity results.
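For contrast with the frugal MVA approach, the conventional bootstrapping of a sensitivity index that the abstract describes as potentially expensive can be sketched as follows; mu* (the mean absolute elementary effect of the Morris method) is used here only as a representative index:

```python
import random

def mu_star(effects):
    """Morris screening statistic: mean absolute elementary effect."""
    return sum(abs(e) for e in effects) / len(effects)

def bootstrap_index_ci(effects, n_boot=1000, alpha=0.05, seed=0):
    """Conventional bootstrap of a sensitivity index: resample the
    elementary effects with replacement and recompute mu* each time."""
    rng = random.Random(seed)
    n = len(effects)
    reps = sorted(mu_star([rng.choice(effects) for _ in range(n)])
                  for _ in range(n_boot))
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]
```

Note the cost driver the authors point to: each of the n_boot replicates reprocesses the full set of model outputs, which MVA avoids entirely.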
Heating and current drive requirements towards steady state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, F. M.; Bonoli, P. T.; Kessel, C. E.; Batchelor, D. B.; Gorelenkova, M.; Harvey, B.; Petrov, Y.
2014-02-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by the magnetic geometry. Combinations of H/CD sources that maintain weakly reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that, with a trade-off between the EC equatorial and upper launchers, the formation and sustainment of quasi-steady-state ITBs could be demonstrated in ITER with the baseline heating configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current fall below the ITER target. Upgrades of the heating and current drive system in ITER, such as the use of Lower Hybrid current drive, could overcome these limitations, sustaining higher non-inductive current, better confinement, and more expanded ITBs that are ideal-MHD stable.
Integrated Scenario Modeling of NSTX Advanced Plasma Configurations
NASA Astrophysics Data System (ADS)
Kessel, Charles; Synakowski, Edward
2003-10-01
The Spherical Torus will provide an attractive fusion energy source if it can demonstrate the following major features: high elongation and triangularity, 100% non-inductive current with a credible path to high bootstrap fractions, non-solenoidal startup and current rampup, high beta with stabilization of RWM instabilities, and sufficiently high energy confinement. NSTX has specific experimental milestones to examine these features, and integrated scenario modeling is helping to understand how these configurations might be produced and what tools are needed to access this operating space. Simulations with the Tokamak Simulation Code (TSC), CURRAY, and JSOLVER/BALMSC/PEST2 have identified fully non-inductively sustained, high-beta plasmas that rely on strong plasma shaping accomplished with a PF coil modification, off-axis current drive from Electron Bernstein Waves (EBW), flexible on-axis heating and CD from High Harmonic Fast Wave (HHFW) and Neutral Beam Injection (NBI), and density control. Ideal MHD stability analysis shows that with wall stabilization through plasma rotation and/or RWM feedback coils, a beta of 40% is achievable, with 100% non-inductive current sustained for 4 current diffusion times. Experimental data and theory are combined to produce a best extrapolation to these regimes, which is continuously improved as the discharges approach these parameters and theoretical/computational methods expand. Further investigations and development for integrated scenario modeling on NSTX are discussed.
Comparison of fusion alpha performance in JET advanced scenario and H-mode plasmas
NASA Astrophysics Data System (ADS)
Asunta, O.; Kurki-Suonio, T.; Tala, T.; Sipilä, S.; Salomaa, R.; contributors, JET-EFDA
2008-12-01
Currently, plasmas with internal transport barriers (ITBs) appear to be the most likely candidates for steady-state scenarios in future fusion reactors. In such plasmas, the broad hot and dense region in the plasma core leads to high fusion gain, while the cool edge protects the integrity of the first wall. The economically desirable large bootstrap current fraction and low inductive current drive may, however, lead to degraded fast-ion confinement. In this work the confinement and heating profiles of fusion alphas were compared between H-mode and ITB plasmas in realistic JET geometry. The work was carried out using the Monte Carlo-based guiding-center-following code ASCOT. For the same plasma current, the ITB discharges were found to produce four to eight times more fusion power than a comparable ELMy H-mode discharge. Unfortunately, the alpha-particle losses were also larger (~16%) than in the H-mode discharge (7%). In the H-mode discharges, alpha power was deposited to the plasma symmetrically around the magnetic axis, whereas in the current-hole discharge the power was spread out to a larger volume in the plasma centre. This was due to wider particle orbits, and to the magnetic structure allowing for a broader hot region in the centre.
Exploration of High Harmonic Fast Wave Heating on the National Spherical Torus Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.R. Wilson; R.E. Bell; S. Bernabei
2003-02-11
High Harmonic Fast Wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high-beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [Ono, M., Kaye, S.M., Neumeyer, S., et al., Proceedings, 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. A radio-frequency (rf) heating system has been installed on NSTX to explore the physics of HHFW heating and current drive via rf waves, and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition, HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode (high-confinement mode) discharges with a large fraction of bootstrap current, and to control the plasma current profile during the early stages of the discharge.
Burgoon, Lyle D; Druwe, Ingrid L; Painter, Kyle; Yost, Erin E
2017-02-01
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high-throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline-based meta-regression can be used to integrate data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk-specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta-regression, may allow risk assessors to identify points of departure and risk-specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods. © 2016 Society for Risk Analysis.
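A simplified illustration of deriving a point of departure (POD) from bootstrapped concentration-response replicates: the sketch below uses linear interpolation in place of the natural-spline meta-regression named above, and synthetic replicate curves rather than the qHTS assay data:

```python
import random

def pod_from_curve(concs, resps, benchmark):
    """Linearly interpolate the concentration at which the response
    first crosses the benchmark level; None if it never does."""
    for (c0, r0), (c1, r1) in zip(zip(concs, resps), zip(concs[1:], resps[1:])):
        if r0 < benchmark <= r1:
            t = (benchmark - r0) / (r1 - r0)
            return c0 + t * (c1 - c0)
    return None

def bootstrap_pod(replicates, concs, benchmark, n_boot=500, seed=1):
    """Resample assay replicates with replacement, average each draw into
    one concentration-response curve, and collect the resulting PODs."""
    rng = random.Random(seed)
    pods = []
    for _ in range(n_boot):
        sample = [rng.choice(replicates) for _ in range(len(replicates))]
        mean_curve = [sum(r[i] for r in sample) / len(sample)
                      for i in range(len(concs))]
        pod = pod_from_curve(concs, mean_curve, benchmark)
        if pod is not None:
            pods.append(pod)
    pods.sort()
    return pods[len(pods) // 2]  # median POD across bootstrap replicates
```

The spread of the collected PODs, rather than just the median, is what supports risk-specific concentration estimates of the kind reported above.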
Paleomagnetic constraints on deformation of superfast-spread oceanic crust exposed at Pito Deep Rift
NASA Astrophysics Data System (ADS)
Horst, A. J.; Varga, R. J.; Gee, J. S.; Karson, J. A.
2011-12-01
The uppermost oceanic crust produced at the superfast-spreading (~142 km Ma⁻¹ full spreading rate) southern East Pacific Rise (EPR) during the Gauss Chron is exposed in a tectonic window along the northeastern wall of the Pito Deep Rift. Paleomagnetic analysis of fully oriented dike (62) and gabbro (5) samples from two adjacent study areas yields bootstrapped mean remanence directions of 38.9° ± 8.1°, -16.7° ± 15.6°, n = 23 (Area A) and 30.4° ± 8.0°, -25.1° ± 12.9°, n = 44 (Area B), both significantly distinct from the geocentric axial dipole expected direction at 23°S. Regional tectonics and outcrop-scale structural data, combined with the bootstrapped remanence directions, constrain models that involve a sequence of three rotations restoring dikes to subvertical orientations, related to (1) inward tilting of crustal blocks during spreading (Area A = 11°, Area B = 22°), (2) clockwise, vertical-axis rotation of the Easter Microplate (A = 46°, B = 44°), and (3) block tilting at Pito Deep Rift (A = 21°, B = 10°). These data support a structural model for accretion at the southern EPR in which outcrop-scale faulting and block rotation accommodate spreading-related subaxial subsidence that is generally less than that observed in crust generated at a fast spreading rate exposed at Hess Deep Rift. These data also support previous estimates of the clockwise rotation of crust adjacent to the Easter Microplate. The dike samples' natural remanent magnetization (NRM) has an arithmetic mean of 5.96 ± 3.76 A/m, which suggests that dikes contribute significantly to observed magnetic anomalies in fast- to superfast-spread crust.
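Bootstrapped mean remanence directions of the kind quoted above are typically obtained by vector-averaging declination/inclination pairs and resampling the site directions; a generic sketch (not the authors' code):

```python
import math
import random

def mean_direction(decs, incs):
    """Vector-average declination/inclination pairs (degrees) on the
    unit sphere and return the mean (declination, inclination)."""
    x = y = z = 0.0
    for d, i in zip(decs, incs):
        dr, ir = math.radians(d), math.radians(i)
        x += math.cos(ir) * math.cos(dr)
        y += math.cos(ir) * math.sin(dr)
        z += math.sin(ir)
    r = math.sqrt(x * x + y * y + z * z)
    dec = math.degrees(math.atan2(y, x)) % 360.0
    inc = math.degrees(math.asin(z / r))
    return dec, inc

def bootstrap_mean_directions(decs, incs, n_boot=500, seed=3):
    """Resample the sample directions with replacement and recompute the
    mean direction each time; the spread gives the quoted uncertainties."""
    rng = random.Random(seed)
    n = len(decs)
    out = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        out.append(mean_direction([decs[k] for k in idx],
                                  [incs[k] for k in idx]))
    return out
```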
Masters, J C; Anthony, N M; de Wit, M J; Mitchell, A
2005-08-01
Major aspects of lorisid phylogeny and systematics remain unresolved, despite several studies (involving morphology, histology, karyology, immunology, and DNA sequencing) aimed at elucidating them. Our study is the first to investigate the evolution of this enigmatic group using molecular and morphological data for all four well-established genera: Arctocebus, Loris, Nycticebus, and Perodicticus. Data sets consisting of 386 bp of 12S rRNA, 535 bp of 16S rRNA, and 36 craniodental characters were analyzed separately and in combination, using maximum parsimony and maximum likelihood. Outgroups, consisting of two galagid taxa (Otolemur and Galagoides) and a lemuroid (Microcebus), were also varied. The morphological data set yielded a paraphyletic lorisid clade with the robust Nycticebus and Perodicticus grouped as sister taxa, and the galagids allied with Arctocebus. All molecular analyses, whether maximum parsimony (MP) or maximum likelihood (ML), that included Microcebus as an outgroup rendered a paraphyletic lorisid clade, with one exception: the 12S + 16S data set analyzed with ML. The position of the galagids in these paraphyletic topologies was inconsistent, however, and bootstrap values were low. Exclusion of Microcebus generated a monophyletic Lorisidae with Asian and African subclades; bootstrap values for all three clades in the total-evidence tree were over 90%. We estimated mean genetic distances for lemuroids vs. lorisoids, lorisids vs. galagids, and Asian vs. African lorisids as a guide to relative divergence times. We present information regarding a temporary land bridge that linked the two now widely separated regions inhabited by lorisids, which may explain their distribution. Finally, we make taxonomic recommendations based on our results. (c) 2005 Wiley-Liss, Inc.
ASTRAL-R score predicts non-recanalisation after intravenous thrombolysis in acute ischaemic stroke.
Vanacker, Peter; Heldner, Mirjam R; Seiffge, David; Mueller, Hubertus; Eskandari, Ashraf; Traenka, Christopher; Ntaios, George; Mosimann, Pascal J; Sztajzel, Roman; Mendes Pereira, Vitor; Cras, Patrick; Engelter, Stefan; Lyrer, Philippe; Fischer, Urs; Lambrou, Dimitris; Arnold, Marcel; Michel, Patrik
2015-05-01
Intravenous thrombolysis (IVT) as treatment in acute ischaemic strokes may be insufficient to achieve recanalisation in certain patients. Predicting the probability of non-recanalisation after IVT may have the potential to influence patient selection towards more aggressive management strategies. We aimed to derive and internally validate a predictive score for post-thrombolytic non-recanalisation, using clinical and radiological variables. From thrombolysis registries of four Swiss academic stroke centres (Lausanne, Bern, Basel and Geneva), patients were selected with large arterial occlusion on acute imaging and with repeated arterial assessment at 24 hours. Based on a logistic regression analysis, an integer-based score for each covariate of the fitted multivariate model was generated. Performance of the integer-based predictive model was assessed by bootstrapping the available data and by cross-validation (delete-d method). In 599 thrombolysed strokes, five variables were identified as independent predictors of absence of recanalisation: Acute glucose > 7 mmol/l (A), significant extracranial vessel STenosis (ST), decreased Range of visual fields (R), large Arterial occlusion (A) and decreased Level of consciousness (L). All variables were weighted 1, except for (L), which obtained 2 points based on the β-coefficients on the logistic scale. ASTRAL-R scores of 0, 3 and 6 corresponded to non-recanalisation probabilities of 18, 44 and 74%, respectively. Predictive ability showed an AUC of 0.66 (95% CI 0.61-0.70) when using the bootstrap and 0.66 (0.63-0.68) when using delete-d cross-validation. In conclusion, the 5-item ASTRAL-R score moderately predicts non-recanalisation at 24 hours in thrombolysed ischaemic strokes. If its performance can be confirmed by external validation and its clinical usefulness can be proven, the score may influence patient selection for more aggressive revascularisation strategies in routine clinical practice.
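Because every item is weighted 1 point except decreased level of consciousness (2 points), the score itself is a simple sum; a sketch of that scoring rule (variable names are illustrative):

```python
def astral_r_score(glucose_gt_7, extracranial_stenosis, reduced_visual_fields,
                   large_occlusion, decreased_consciousness):
    """Sum the five ASTRAL-R items described above: each item scores
    1 point if present, except decreased level of consciousness (2 points)."""
    return (int(glucose_gt_7) + int(extracranial_stenosis)
            + int(reduced_visual_fields) + int(large_occlusion)
            + 2 * int(decreased_consciousness))
```

The resulting totals range from 0 to 6, matching the score values for which non-recanalisation probabilities are quoted above.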
Chu, Hui-May; Ette, Ene I
2005-09-02
This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
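The pseudoprofile idea, drawing one tissue and one plasma observation per time point to assemble full profiles whose AUCs give a ratio, can be sketched as below. This is a simplified illustration of the resampling logic, not the authors' 2-phase algorithm:

```python
import random

def trapezoid_auc(times, concs):
    """Area under the concentration-time curve by the trapezoid rule."""
    return sum((t1 - t0) * (c0 + c1) / 2
               for t0, t1, c0, c1 in zip(times, times[1:], concs, concs[1:]))

def bootstrap_auc_ratio(times, tissue, plasma, n_boot=1000, seed=7):
    """`tissue[i]` and `plasma[i]` hold the observed concentrations at
    time point i (one per subject). Each replicate draws one value per
    time point to build a pseudoprofile, then forms AUC_tissue/AUC_plasma."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        t_prof = [rng.choice(tissue[i]) for i in range(len(times))]
        p_prof = [rng.choice(plasma[i]) for i in range(len(times))]
        ratios.append(trapezoid_auc(times, t_prof) / trapezoid_auc(times, p_prof))
    ratios.sort()
    return ratios[len(ratios) // 2]  # median bootstrap ratio
```

The sorted replicate ratios also provide the uncertainty and variability measures that, as noted above, the naïve averaging approach cannot supply.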
Aref-Eshghi, Erfan; Oake, Justin; Godwin, Marshall; Aubrey-Bassler, Kris; Duke, Pauline; Mahdavian, Masoud; Asghari, Shabnam
2017-03-01
The objective of this study was to define the optimal algorithm to identify patients with dyslipidemia using electronic medical records (EMRs). EMRs of patients attending primary care clinics in St. John's, Newfoundland and Labrador (NL), Canada during 2009-2010 were studied to determine the best algorithm for identification of dyslipidemia. Six algorithms containing three components, dyslipidemia ICD coding, lipid-lowering medication use, and abnormal laboratory lipid levels, were tested against a gold standard, defined as the existence of any of the three criteria. Linear discriminant analysis and bootstrapping were performed following sensitivity/specificity testing and receiver operating characteristic (ROC) curve analysis. Two validating datasets, NL records of 2011-2014 and Canada-wide records of 2010-2012, were used to replicate the results. Relative to the gold standard, combining laboratory data with lipid-lowering medication consumption yielded the highest sensitivity (99.6%), NPV (98.1%), Kappa agreement (0.98), and area under the curve (AUC, 0.998). The linear discriminant analysis for this combination resulted in an error rate of 0.15 and an Eigenvalue of 1.99, and the bootstrapping led to AUC: 0.998, 95% confidence interval: 0.997-0.999, Kappa: 0.99. This algorithm in the first validating dataset yielded a sensitivity of 97%, Negative Predictive Value (NPV) = 83%, Kappa = 0.88, and AUC = 0.98. These figures for the second validating dataset were 98%, 93%, 0.95, and 0.99, respectively. Combining laboratory data with lipid-lowering medication consumption within the EMR is the best algorithm for detecting dyslipidemia. These results can generate standardized information systems for dyslipidemia and other chronic disease investigations using EMRs.
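The sensitivity, NPV, and kappa figures used to compare the algorithms above all follow from a 2x2 confusion table against the gold standard; a minimal sketch:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, negative predictive value, and Cohen's kappa
    from the cells of a 2x2 confusion table."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    npv = tn / (tn + fn)
    po = (tp + tn) / n  # observed agreement
    # Chance agreement from the marginal totals of the table.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return sensitivity, npv, kappa
```

Applied to each candidate algorithm's confusion table, these three numbers (plus specificity and AUC) drive the comparison reported in the abstract.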
Minimally-Invasive Neural Interface for Distributed Wireless Electrocorticogram Recording Systems
Chang, Sun-Il; Park, Sung-Yun; Yoon, Euisik
2018-01-01
This paper presents a minimally-invasive neural interface for distributed wireless electrocorticogram (ECoG) recording systems. The proposed interface equips all necessary components for ECoG recording, such as the high performance front-end integrated circuits, a fabricated flexible microelectrode array, and wireless communication inside a miniaturized custom-made platform. The multiple units of the interface systems can be deployed to cover a broad range of the target brain region and transmit signals via a built-in intra-skin communication (ISCOM) module. The core integrated circuit (IC) consists of 16-channel, low-power push-pull double-gated preamplifiers, in-channel successive approximation register analog-to-digital converters (SAR ADC) with a single-clocked bootstrapping switch and a time-delayed control unit, an ISCOM module for wireless data transfer through the skin instead of a power-hungry RF wireless transmitter, and a monolithic voltage/current reference generator to support the aforementioned analog and mixed-signal circuit blocks. The IC was fabricated using 250 nm CMOS processes in an area of 3.2 × 0.9 mm2 and achieved the low-power operation of 2.5 µW per channel. Input-referred noise was measured as 5.62 µVrms for 10 Hz to 10 kHz and ENOB of 7.21 at 31.25 kS/s. The implemented system successfully recorded multi-channel neural activities in vivo from a primate and demonstrated modular expandability using the ISCOM with power consumption of 160 µW. PMID:29342103
Verma, Sushma; Singh, Shweta; Sharma, Suresh; Tewari, S K; Roy, R K; Goel, A K; Rana, T S
2015-04-01
Curcuma longa L., commonly known as turmeric, is one of the economically and medicinally important plant species. It is predominantly cultivated in tropical and subtropical countries. India is the largest producer and exporter of turmeric in the world, followed by China, Indonesia, Bangladesh and Thailand. In the present study, Directed Amplification of Minisatellite DNA (DAMD) and Inter Simple Sequence Repeat (ISSR) methods were used to estimate the genetic variability in indigenous turmeric germplasm. Cumulative data analysis for DAMD (15) and ISSR (13) markers resulted in 478 fragments, of which 392 were polymorphic, revealing 82% polymorphism across the turmeric genotypes. The wide range of pairwise genetic distances (0.03-0.59) across the genotypes revealed that they are genetically quite diverse. The UPGMA dendrogram generated using the cumulative data showed significant relationships amongst the genotypes. All 29 genotypes studied grouped into two clusters, irrespective of their geographical affiliations, with 100% bootstrap support, except for a few genotypes, suggesting considerable diversity amongst the genotypes. These results suggest that the current collection of turmeric genotypes preserves the vast majority of natural variation. The results further demonstrate the efficiency and reliability of DAMD and ISSR markers in determining genetic diversity and relationships among indigenous turmeric germplasm. DAMD and ISSR profiling have identified diverse turmeric genotypes, which could be further utilized in various genetic improvement programmes, including conventional as well as marker-assisted breeding, towards the development of new and desirable turmeric genotypes.
Using next-generation transcriptome sequencing to predict an ectomycorrhizal metabolome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, P. E.; Sreedasyam, A.; Trivedi, G
Mycorrhizae, symbiotic interactions between soil fungi and tree roots, are ubiquitous in terrestrial ecosystems. The fungi contribute phosphorus, nitrogen and nutrients mobilized from organic matter in the soil, and in return the fungus receives photosynthetically derived carbohydrates. This union of plant and fungal metabolisms is the mycorrhizal metabolome. Understanding this symbiotic relationship at a molecular level provides important contributions to the understanding of forest ecosystems and global carbon cycling. We generated next-generation short-read transcriptomic sequencing data from fully formed ectomycorrhizae between Laccaria bicolor and aspen (Populus tremuloides) roots. The transcriptomic data were used to identify statistically significantly expressed gene models using a bootstrap-style approach, and these expressed genes were mapped to specific metabolic pathways. Integration of expressed genes that code for metabolic enzymes and the set of expressed membrane transporters generates a predictive model of the ectomycorrhizal metabolome. The generated model of the mycorrhizal metabolome predicts that the specific compounds glycine, glutamate, and allantoin are synthesized by L. bicolor and that these compounds or their metabolites may be used for the benefit of aspen in exchange for the photosynthetically derived sugars fructose and glucose. The analysis illustrates an approach to generate testable biological hypotheses to investigate the complex molecular interactions that drive ectomycorrhizal symbiosis. These models are consistent with experimental environmental data and provide insight into the molecular exchange processes for organisms in this complex ecosystem. The method used here for predicting metabolomic models of mycorrhizal systems from deep RNA sequencing data can be generalized and is broadly applicable to transcriptomic data derived from complex systems.
Chang, Ling-Yin; Wu, Wen-Chi; Wu, Chi-Chen; Lin, Linen Nymphas; Yen, Lee-Lan; Chang, Hsing-Yi
2017-01-01
Peer victimization in children and adolescents is a serious public health concern. Growing evidence exists for negative consequences of peer victimization, but research has mostly been short-term, and little is known about the mechanisms that moderate and mediate the impacts of peer victimization on subsequent antisocial behavior. The current study aimed to examine the longitudinal relationship between peer victimization in adolescence and antisocial behavior in young adulthood, and to determine whether sleep problems influence this relationship. In total, 2006 adolescents participated in a prospective study from 2009 to 2013. The moderating role of sleep problems was examined by testing the significance of the interaction between peer victimization and sleep problems. The mediating role of sleep problems was tested by using bootstrapping mediational analyses. All analyses were conducted using SAS 9.3 software. We found that peer victimization during adolescence was positively and significantly associated with antisocial behavior in young adulthood (β = 0.10, p < 0.0001). This association was mediated, but not moderated, by sleep problems. Specifically, peer victimization first increased levels of sleep problems, which in turn elevated the risk of antisocial behavior (indirect effect: 0.01, 95% bootstrap confidence interval: 0.004, 0.021). These findings imply that sleep problems may operate as a potential mechanism through which peer victimization during adolescence leads to increases in antisocial behavior in young adulthood. Prevention and intervention programs that target sleep problems may yield benefits for decreasing antisocial behavior in adolescents who have been victimized by peers. Copyright © 2016 Elsevier Ltd. All rights reserved.
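The mediation mechanism above (victimization raises sleep problems, which in turn raise antisocial behavior) is tested with a bootstrapped indirect effect. A minimal sketch of the percentile-bootstrap indirect effect on simulated data; all coefficients and the sample size are made up for illustration, and this is not the study's SAS analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: x = victimization, m = sleep problems (mediator),
# y = antisocial behavior. Path coefficients are arbitrary assumptions.
n = 500
x = rng.normal(size=n)
m = 0.4 * x + rng.normal(size=n)
y = 0.3 * m + 0.1 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b: a is the slope of m ~ x; b is the m-coefficient of y ~ m + x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

# Percentile bootstrap of the indirect effect.
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Mediation is inferred when the bootstrap interval excludes zero, as in the abstract's reported interval (0.004, 0.021).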
mBEEF-vdW: Robust fitting of error estimation density functionals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes
2016-06-15
Here, we propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.
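The bootstrap 0.632 estimator that this work generalizes can be illustrated in its classic form: mix the optimistic resubstitution error with the pessimistic leave-one-out bootstrap error. A sketch of Efron's 0.632 rule on a toy least-squares problem (not the paper's hierarchical, geometric-mean variant; all data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data as a stand-in for the paper's training sets.
n = 200
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, -0.5, 0.2]) + 0.3 * rng.normal(size=n)

def fit(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

err_train = mse(fit(X, y), X, y)  # apparent (resubstitution) error

# Leave-one-out bootstrap error: evaluate each resampled fit only on the
# points that the bootstrap draw left out.
B = 200
errs = []
for _ in range(B):
    idx = rng.integers(0, n, n)
    out = np.setdiff1d(np.arange(n), idx)  # points not drawn this round
    if out.size == 0:
        continue
    w = fit(X[idx], y[idx])
    errs.append(mse(w, X[out], y[out]))
err_boot = float(np.mean(errs))

# Efron's 0.632 rule blends the two estimates.
err_632 = 0.368 * err_train + 0.632 * err_boot
print(err_train, err_boot, err_632)
```

The blend corrects for the downward bias of the training error and the upward bias of the out-of-bag error.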
Efficient statistical tests to compare Youden index: accounting for contingency correlation.
Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan
2015-04-30
The Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both one-sample and two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. Moreover, a paired-sample test on the Youden index is currently unavailable. This article develops efficient statistical inference procedures for one-sample, independent-sample, and paired-sample tests on the Youden index by accounting for contingency correlation, namely the associations between sensitivity and specificity and between paired samples typically represented in contingency tables. For the one-sample and two-independent-sample tests, the variances are estimated by the delta method, and the statistical inference is based on the central limit theorem; these are then verified by bootstrap estimates. For the paired-sample test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large-sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
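The Youden index J = sensitivity + specificity − 1 and a bootstrap check of its sampling variability can be sketched from a hypothetical 2×2 table. This is a parametric binomial bootstrap within each disease group, not the paper's delta-method derivation, and all counts are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2x2 diagnostic table (rows: diseased / healthy,
# columns: test positive / negative). Counts are illustrative.
tp, fn = 90, 10   # diseased
fp, tn = 20, 80   # healthy

def youden(tp, fn, fp, tn):
    se = tp / (tp + fn)
    sp = tn / (tn + fp)
    return se + sp - 1

j_hat = youden(tp, fn, fp, tn)

# Bootstrap by resampling outcomes within each disease group,
# which keeps the group sizes fixed.
B = 5000
j_boot = np.empty(B)
for i in range(B):
    tp_b = rng.binomial(tp + fn, tp / (tp + fn))
    tn_b = rng.binomial(tn + fp, tn / (tn + fp))
    j_boot[i] = youden(tp_b, (tp + fn) - tp_b, (tn + fp) - tn_b, tn_b)

lo, hi = np.percentile(j_boot, [2.5, 97.5])
print(f"J = {j_hat:.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```

Note that this simple scheme treats sensitivity and specificity as independent; the paper's contribution is precisely to account for their association in one-sample and paired designs.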
Understanding London's Water Supply Tradeoffs When Scheduling Interventions Under Deep Uncertainty
NASA Astrophysics Data System (ADS)
Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.
2015-12-01
Water supply planning in many major world cities faces several challenges associated with, but not limited to, climate change, population growth and insufficient land availability for infrastructure development. Long-term plans to maintain the supply-demand balance and ecosystem services require careful consideration of uncertainties associated with future conditions. The current approach for London's water supply planning utilizes least-cost optimization of future intervention schedules with limited consideration of uncertainty. Recently, the focus of long-term plans has shifted from least-cost performance alone to the robustness and resilience of the system. Identifying robust scheduling of interventions requires optimizing over a statistically representative sample of stochastic inputs, which may be computationally difficult to achieve. In this study we optimize schedules using an ensemble of plausible scenarios and assess how manipulating that ensemble influences the different Pareto-approximate intervention schedules. We investigate how a major stress event's location in time, as well as the optimization problem formulation, influences the Pareto-approximate schedules. A bootstrapping method that respects the non-stationary trend of climate change scenarios and ensures the even distribution of the major stress event in the scenario ensemble is proposed. Different bootstrapped hydrological scenario ensembles are assessed using many-objective scenario optimization of London's future water supply and demand intervention scheduling. However, such a "fixed" scheduling-of-interventions approach does not aim to embed flexibility or adapt effectively as the future unfolds. Alternatively, making decisions based on observations of conditions as they occur could help planners who prefer adaptive planning. We show how rules to guide the implementation of interventions based on observations may result in more flexible strategies.
Jones, Adam G
2015-11-01
Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
HSX as an example of a resilient non-resonant divertor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, A.; Boozer, A. H.; Hegna, C. C.
2017-03-16
This study provides an initial description of the resilient divertor properties of quasi-symmetric (QS) stellarators, using the HSX (Helically Symmetric eXperiment) configuration as a test case. Divertors in high-performance QS stellarators will need to be resilient to changes in plasma configuration that arise from the evolution of plasma pressure profiles and bootstrap currents. Resiliency is tested by examining the changes in strike point patterns, obtained from field line following, that arise due to configurational changes. A low strike point variation under large configuration changes corresponds to high resiliency. The HSX edge displays resilient properties with configuration changes arising from (1) the wall position, (2) the plasma current, and (3) the external coils. The resilient behavior is lost if large edge islands intersect the wall structure. The resilient edge properties are corroborated by heat flux calculations from fully 3-D plasma simulations using EMC3-EIRENE. Additionally, the strike point patterns are found to correspond to high-curvature regions of magnetic flux surfaces.
Exploration of spherical torus physics in the NSTX device
NASA Astrophysics Data System (ADS)
Ono, M.; Kaye, S. M.; Peng, Y.-K. M.; Barnes, G.; Blanchard, W.; Carter, M. D.; Chrzanowski, J.; Dudek, L.; Ewig, R.; Gates, D.; Hatcher, R. E.; Jarboe, T.; Jardin, S. C.; Johnson, D.; Kaita, R.; Kalish, M.; Kessel, C. E.; Kugel, H. W.; Maingi, R.; Majeski, R.; Manickam, J.; McCormack, B.; Menard, J.; Mueller, D.; Nelson, B. A.; Nelson, B. E.; Neumeyer, C.; Oliaro, G.; Paoletti, F.; Parsells, R.; Perry, E.; Pomphrey, N.; Ramakrishnan, S.; Raman, R.; Rewoldt, G.; Robinson, J.; Roquemore, A. L.; Ryan, P.; Sabbagh, S.; Swain, D.; Synakowski, E. J.; Viola, M.; Williams, M.; Wilson, J. R.; NSTX Team
2000-03-01
The National Spherical Torus Experiment (NSTX) is being built at Princeton Plasma Physics Laboratory to test the fusion physics principles for the spherical torus concept at the MA level. The NSTX nominal plasma parameters are R0 = 85 cm, a = 67 cm, R/a >= 1.26, Bt = 3 kG, Ip = 1 MA, q95 = 14, elongation κ <= 2.2, triangularity δ <= 0.5 and a plasma pulse length of up to 5 s. The plasma heating/current drive tools are high harmonic fast wave (6 MW, 5 s), neutral beam injection (5 MW, 80 keV, 5 s) and coaxial helicity injection. Theoretical calculations predict that NSTX should provide exciting possibilities for exploring a number of important new physics regimes, including very high plasma β, naturally high plasma elongation, high bootstrap current fraction, absolute magnetic well and high pressure driven sheared flow. In addition, the NSTX programme plans to explore fully non-inductive plasma startup as well as a dispersive scrape-off layer for heat and particle flux handling.
A high-efficiency low-voltage CMOS rectifier for harvesting energy in implantable devices.
Hashemi, S Saeid; Sawan, Mohamad; Savaria, Yvon
2012-08-01
We present, in this paper, a new full-wave CMOS rectifier dedicated to wirelessly powered low-voltage biomedical implants. It uses bootstrapped capacitors to reduce the effective threshold voltage of selected MOS switches, achieving a significant increase in overall power efficiency and a low voltage drop. The rectifier is therefore well suited to applications with low-voltage power supplies and large load currents. The rectifier topology does not require complex circuit design. The highest voltages available in the circuit are used to drive the gates of selected transistors in order to reduce leakage current and lower their channel on-resistance while maintaining high transconductance. The proposed rectifier was fabricated using the standard TSMC 0.18 μm CMOS process. When connected to a sinusoidal source of 3.3 V peak amplitude, it improves the overall power efficiency by 11% compared with the best recently published results, given by a gate cross-coupled-based structure.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
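The proposed approach can be sketched outside bmem (which is an R package): simulate many datasets under assumed mediation paths and count how often a percentile-bootstrap test of the indirect effect a*b rejects. The sample size, path coefficients, and replication counts below are illustrative assumptions kept small for speed:

```python
import numpy as np

rng = np.random.default_rng(3)

def indirect(x, m, y):
    """a*b from OLS fits of m ~ x and y ~ m + x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    return a * np.linalg.lstsq(X, y, rcond=None)[0][1]

def boot_test(x, m, y, B=200):
    """Reject if the 95% percentile-bootstrap CI for a*b excludes zero."""
    n = len(x)
    est = np.empty(B)
    for i in range(B):
        idx = rng.integers(0, n, n)
        est[i] = indirect(x[idx], m[idx], y[idx])
    lo, hi = np.percentile(est, [2.5, 97.5])
    return lo > 0 or hi < 0

# Monte Carlo power: fraction of simulated datasets where the test rejects.
n, a, b, nsim = 100, 0.4, 0.4, 100
hits = 0
for _ in range(nsim):
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)
    y = b * m + rng.normal(size=n)
    if boot_test(x, m, y):
        hits += 1
power = hits / nsim
print(f"estimated power ~ {power:.2f}")
```

Nonnormal data can be studied by swapping the `rng.normal` error draws for skewed or heavy-tailed distributions, which is the scenario the bmem package targets.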
SOCIAL COMPETENCE AND PSYCHOLOGICAL VULNERABILITY: THE MEDIATING ROLE OF FLOURISHING.
Uysal, Recep
2015-10-01
This study examined whether flourishing mediated the relationship between social competence and psychological vulnerability. Participants were 259 university students (147 women, 112 men; M age = 21.3 yr., SD = 1.7) who completed the Turkish versions of the Perceived Social Competence Scale, the Flourishing Scale, and the Psychological Vulnerability Scale. Mediation models were tested using the bootstrapping method to examine indirect effects. Consistent with the hypotheses, the results indicated a positive relationship between social competence and flourishing, and a negative relationship between social competence and psychological vulnerability. Results of the bootstrapping method revealed that flourishing significantly mediated the relationship between social competence and psychological vulnerability. The significance and limitations of the results are discussed.
Yang, Yi-Feng
2014-02-01
This paper discusses the effects of transformational leadership on cooperative conflict resolution (management) by evaluating several alternative models related to the mediating role of job satisfaction and change commitment. Samples of data from customer service personnel in Taiwan were analyzed. Based on the bootstrap sample technique, an empirical study was carried out to yield the best fitting model. The procedure of hierarchical nested model analysis was used, incorporating the methods of bootstrapping mediation, PRODCLIN2, and structural equation modeling (SEM) comparison. The analysis suggests that leadership that promotes integration (change commitment) and provides inspiration and motivation (job satisfaction), in the proper order, creates the means for cooperative conflict resolution.
Yang, Yi-Feng
2016-08-01
This study discusses the influence of transformational leadership on job satisfaction through assessing six alternative models related to the mediators of leadership trust and change commitment, utilizing a data sample (N = 341; M age = 32.5 yr., SD = 5.2) of service promotion personnel in Taiwan. The bootstrap sampling technique was used to select the better-fitting model. The tool of hierarchical nested model analysis was applied, along with the approaches of bootstrapping mediation, PRODCLIN2, and structural equation modeling comparison. The results overall demonstrate that leadership is important and that leadership role identification (trust) and workgroup cohesiveness (commitment) form an ordered serial relationship. © The Author(s) 2016.
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained (i) from the Gaussian density stemming from the asymptotic normality theorem of maximum likelihood and (ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database, covering a large region of 100,000 km² in southern France with contrasted rainfall regimes, in order to be able to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that (i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, (ii) the posterior and bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and (iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Our recommendation therefore goes towards the use of the Bayesian framework to compute uncertainty.
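The frequentist bootstrap route compared above can be sketched with a deliberately simplified model: a method-of-moments Gumbel fit on synthetic annual maxima, with a percentile bootstrap for a return level. This is not the paper's simple-scaling GEV model fitted by maximum likelihood and MCMC, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic annual maxima of hourly rainfall (mm), stand-in for station data.
n = 40
maxima = rng.gumbel(loc=30.0, scale=8.0, size=n)

def gumbel_fit(x):
    """Method-of-moments Gumbel fit (simpler than the paper's estimators)."""
    scale = np.std(x) * np.sqrt(6) / np.pi
    loc = np.mean(x) - 0.5772 * scale  # 0.5772 = Euler-Mascheroni constant
    return loc, scale

def return_level(x, T):
    """T-year return level of the fitted Gumbel distribution."""
    loc, scale = gumbel_fit(x)
    return loc - scale * np.log(-np.log(1 - 1 / T))

T = 50
z_hat = return_level(maxima, T)

# Percentile bootstrap: refit on resampled maxima.
boot = np.array([return_level(maxima[rng.integers(0, n, n)], T)
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"{T}-yr level ~ {z_hat:.1f} mm, 95% CI [{lo:.1f}, {hi:.1f}]")
```

The paper's point (iii) is that for large T such bootstrap intervals can become unreasonable, which motivates the Bayesian posterior densities instead.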
Confidence intervals for correlations when data are not normal.
Bishara, Anthony J; Hittner, James B
2017-02-01
With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, the rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval, for example leading to a 95% confidence interval that had actual coverage as low as 68%. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoiding the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in supplementary materials.
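The two baseline methods compared above, the Fisher z' interval and an ordinary percentile bootstrap, can be sketched side by side on nonnormal (lognormal-margin) data. The sample size and latent correlation are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Correlated latent normals pushed through exp() to create skewed margins,
# mimicking the normality violations studied in the paper.
n = 300
z1 = rng.normal(size=n)
z2 = 0.5 * z1 + np.sqrt(1 - 0.25) * rng.normal(size=n)
x, y = np.exp(z1), np.exp(z2)

r = np.corrcoef(x, y)[0, 1]

# Fisher z' interval (assumes bivariate normality).
zr = np.arctanh(r)
se = 1 / np.sqrt(n - 3)
fisher = np.tanh([zr - 1.96 * se, zr + 1.96 * se])

# Percentile bootstrap interval (no normality assumption).
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = np.corrcoef(x[idx], y[idx])[0, 1]
perc = np.percentile(boot, [2.5, 97.5])

print("Fisher z':", np.round(fisher, 3), " bootstrap:", np.round(perc, 3))
```

On skewed data the two intervals can differ noticeably in width and placement, which is the kind of discrepancy the simulation study quantifies as coverage error.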
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra
2017-07-25
Low-cost flight-based hyperspectral imaging systems have the potential to provide important information for ecosystem and environmental studies as well as aid in land management. To realize this potential, methods must be developed to provide large-area surface reflectance data allowing for temporal data sets at the mesoscale. This paper describes a bootstrap method of producing a large-area, radiometrically referenced hyperspectral data set using the Landsat surface reflectance (LaSRC) data product as a reference target. The bootstrap method uses standard hyperspectral processing techniques that are extended to remove uneven illumination conditions between flight passes, allowing for radiometrically self-consistent data after mosaicking. Through selective spectral and spatial resampling, LaSRC data are used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from two hyperspectral flights over the same managed agricultural and unmanaged range land covering approximately 5.8 km², acquired on June 21, 2014 and June 24, 2015, are presented. Finally, data from a flight over agricultural land collected on June 6, 2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation.
Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.
Lee, Sunbok; Lei, Man-Kit; Brody, Gene H
2015-06-01
Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of a crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes of more than 500 to be able to provide sufficiently narrow confidence intervals to identify the location of the crossover point. © 2015 APA, all rights reserved.
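The crossover point and one of the six compared methods, the percentile bootstrap, can be sketched directly: in the model y = b0 + b1·x + b2·g + b3·x·g with binary g, the two simple regression lines cross where b2 + b3·x = 0, i.e. x* = −b2/b3. All coefficients and the sample size below are made up:

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated interaction: the two groups' lines cross at x = 1 by design.
n = 400
x = rng.normal(size=n)
g = rng.integers(0, 2, n).astype(float)
y = 0.5 + 0.2 * x + 0.6 * g - 0.6 * x * g + 0.5 * rng.normal(size=n)

def crossover(x, g, y):
    """Crossover x* = -b2/b3 from the OLS fit of y ~ 1 + x + g + x*g."""
    X = np.column_stack([np.ones_like(x), x, g, x * g])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return -b[2] / b[3]

x_hat = crossover(x, g, y)

# Percentile-bootstrap confidence interval for the crossover location.
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = crossover(x[idx], g[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"crossover ~ {x_hat:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

When b3 is close to zero the ratio −b2/b3 becomes unstable, which is the source of the "abnormally wide" intervals the study warns about.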
Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.
2010-01-01
Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
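The bootstrapping procedure described above can be sketched as follows: hold each species' total abundance fixed, redistribute its counts across sampling periods in proportion to overall per-period effort (the stationary null), and compare the observed heterogeneity of species trends with the null distribution. The species-by-year count matrix and the test statistic (variance of log-abundance slopes) are illustrative choices, not the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy species-by-year counts (rows: 5 species, cols: 10 annual surveys).
# One species increases, one decreases, three are stable.
years = np.arange(10)
counts = np.array([
    np.round(np.linspace(5, 30, 10)),
    np.round(np.linspace(30, 5, 10)),
    np.full(10, 15.0), np.full(10, 20.0), np.full(10, 10.0),
]).astype(int)

def trend_heterogeneity(c):
    """Variance across species of the slopes of log(count+1) vs. year."""
    slopes = [np.polyfit(years, np.log(row + 1.0), 1)[0] for row in c]
    return float(np.var(slopes))

obs = trend_heterogeneity(counts)

# Null model: stationary abundances with temporally varying sampling
# probabilities proportional to total effort per year.
p = counts.sum(axis=0) / counts.sum()
null = np.empty(1000)
for b in range(null.size):
    sim = np.array([np.bincount(rng.choice(10, size=int(row.sum()), p=p),
                                minlength=10) for row in counts])
    null[b] = trend_heterogeneity(sim)

p_value = float(np.mean(null >= obs))
print(f"observed heterogeneity {obs:.4f}, null p = {p_value:.3f}")
```

A small p-value, as here, rejects random sampling from a stationary species abundance distribution, mirroring the heterogeneous trends the authors report for both assemblages.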
Modality specificity and integration in working memory: Insights from visuospatial bootstrapping.
Allen, Richard J; Havelka, Jelena; Falcon, Thomas; Evans, Sally; Darling, Stephen
2015-05-01
The question of how meaningful associations between verbal and spatial information might be utilized to facilitate working memory performance is potentially highly instructive for models of memory function. The present study explored how separable processing capacities within specialized domains might each contribute to this, by examining the disruptive impacts of simple verbal and spatial concurrent tasks on young adults' recall of visually presented digit sequences encountered either in a single location or within a meaningful spatial "keypad" configuration. The previously observed advantage for recall in the latter condition (the "visuospatial bootstrapping effect") consistently emerged across 3 experiments, indicating use of familiar spatial information in boosting verbal memory. The magnitude of this effect interacted with concurrent activity; articulatory suppression during encoding disrupted recall to a greater extent when digits were presented in single locations (Experiment 1), while spatial tapping during encoding had a larger impact on the keypad condition and abolished the visuospatial bootstrapping advantage (Experiment 2). When spatial tapping was performed during recall (Experiment 3), no task by display interaction was observed. Outcomes are discussed within the context of the multicomponent model of working memory, with a particular emphasis on cross-domain storage in the episodic buffer (Baddeley, 2000). © 2015 APA, all rights reserved.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
Heating and current drive requirements towards steady state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, Francesca; Kessel, Charles; Bonoli, Paul; Batchelor, Donald; Harvey, Bob
2013-10-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) to reach adequate fusion gain at typical currents of 9 MA. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of external sources that maintain weakly reversed shear profiles with ρ(qmin) ≥ 0.5 are the focus of this work. Simulations indicate that, with a trade-off of the EC equatorial and upper launcher, the formation and sustainment of ITBs could be demonstrated with the baseline configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current (6.2 MA) are below the target. Upgrades of the heating and current drive system, such as the use of Lower Hybrid current drive, could overcome these limitations. With 30 MW of coupled LH power in the flattop and operating at the Greenwald density, plasmas can sustain ~9 MA and achieve Q ~ 4. Work supported by the US Department of Energy under DE-AC02-CH0911466.
NASA Astrophysics Data System (ADS)
Serinaldi, Francesco; Kilsby, Chris G.
2013-06-01
The information contained in hyetographs and hydrographs is often synthesized by using key properties such as the peak or maximum value Xp, volume V, duration D, and average intensity I. These variables play a fundamental role in hydrologic engineering as they are used, for instance, to define design hyetographs and hydrographs as well as to model and simulate the rainfall and streamflow processes. Given their inherent variability and the empirical evidence of the presence of a significant degree of association, such quantities have been studied as correlated random variables suitable to be modeled by multivariate joint distribution functions. The advent of copulas in geosciences simplified the inference procedures allowing for splitting the analysis of the marginal distributions and the study of the so-called dependence structure or copula. However, the attention paid to the modeling task has overlooked a more thorough study of the true nature and origin of the relationships that link Xp,V,D, and I. In this study, we apply a set of ad hoc bootstrap algorithms to investigate these aspects by analyzing the hyetographs and hydrographs extracted from 282 daily rainfall series from central eastern Europe, three 5 min rainfall series from central Italy, 80 daily streamflow series from the continental United States, and two sets of 200 simulated universal multifractal time series. Our results show that all the pairwise dependence structures between Xp,V,D, and I exhibit some key properties that can be reproduced by simple bootstrap algorithms that rely on a standard univariate resampling without resort to multivariate techniques. 
Therefore, the strong similarities between the observed dependence structures and the agreement between the observed and bootstrap samples suggest the existence of a numerical generating mechanism based on the superposition of the effects of sampling data at finite time steps and the process of summing realizations of independent random variables over random durations. We also show that the pairwise dependence structures are weakly dependent on the internal patterns of the hyetographs and hydrographs, meaning that the temporal evolution of the rainfall and runoff events marginally influences the mutual relationships of Xp,V,D, and I. Finally, our findings point out that subtle and often overlooked deterministic relationships between the properties of the event hyetographs and hydrographs exist. Confusing these relationships with genuine stochastic relationships can lead to an incorrect application of multivariate distributions and copulas and to misleading results.
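The generating mechanism proposed above, summing realizations of independent random variables over random durations sampled at finite time steps, is easy to reproduce numerically. A minimal sketch under assumed distributions (exponential per-step depths, uniform integer durations), showing that strong Xp-V and V-D dependence emerges with no built-in multivariate structure:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_events(n_events=2000):
    """Build events by summing i.i.d. positive per-step increments over a
    random duration, then extract peak Xp, volume V, duration D, intensity I."""
    D = rng.integers(1, 15, size=n_events)          # durations (time steps)
    Xp, V = np.empty(n_events), np.empty(n_events)
    for i, d in enumerate(D):
        x = rng.exponential(scale=1.0, size=d)      # independent step depths
        Xp[i], V[i] = x.max(), x.sum()
    I = V / D                                       # average intensity
    return Xp, V, D, I

Xp, V, D, I = simulate_events()

def rank_corr(a, b):
    """Crude rank correlation (Spearman-like, ties broken arbitrarily)."""
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

rho_pv = rank_corr(Xp, V)   # peak vs volume: strongly positive
rho_vd = rank_corr(V, D)    # volume vs duration: positive by construction
```

The positive dependence appears purely as a numerical consequence of maxima and sums sharing the same random duration, echoing the paper's point that such associations need not reflect genuine multivariate stochastic structure.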
Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows
NASA Astrophysics Data System (ADS)
Srivastav, R. K.; Srinivasan, K.; Sudheer, K.
2009-05-01
Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: (i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR and PARMA), together with disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; (ii) nonparametric models, which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws (examples are bootstrap/kernel-based methods such as k-nearest neighbor (k-NN) and the matched block bootstrap (MABB), as well as nonparametric disaggregation); and (iii) hybrid models, which blend parametric and nonparametric models advantageously to model streamflows effectively. Despite the many developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and critical drought characteristics has posed a persistent challenge to the stochastic modeler. This is partly because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation), and the efficacy of the models is subsequently validated based on the accuracy of prediction of the estimates of the water-use characteristics, which requires a large number of trial simulations and inspection of many plots and tables. Even so, accurate prediction of the storage and critical drought characteristics may not be ensured.
In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (a blend of a simple parametric model, PAR(1), and the matched block bootstrap (MABB)) based on the explicit objective functions of minimizing the relative bias and relative root mean square error in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained based on a search over a multi-dimensional parameter space (involving simultaneous exploration of the parametric (PAR(1)) as well as the non-parametric (MABB) components). This is achieved using an efficient evolutionary search based optimization tool, namely the non-dominated sorting genetic algorithm II (NSGA-II). This approach helps in reducing the drudgery involved in the process of manual selection of the hybrid model, in addition to predicting the basic summary statistics, dependence structure, marginal distribution, and water-use characteristics accurately. The proposed optimization framework is used to model the multi-season streamflows of River Beaver and River Weber of the USA. For both rivers, the proposed GA-based hybrid model (where simultaneous exploration of both parametric and non-parametric components is done) yields a much better prediction of the storage capacity when compared with the MLE-based hybrid models (where the hybrid model selection is done in two stages, thus probably resulting in a sub-optimal model). This framework can be further extended to include different linear/non-linear hybrid stochastic models at other temporal and spatial scales as well.
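The matched block bootstrap (MABB) pairs resampled blocks by similarity; the sketch below illustrates only the simpler unmatched block-resampling idea that underlies it, applied to a toy seasonal series. Block length, the synthetic "streamflow" model, and the absence of matching are all simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def block_bootstrap(series, block_len=12, n_rep=1):
    """Resample a time series in contiguous blocks; unlike i.i.d. resampling,
    this preserves within-block (e.g. seasonal) dependence."""
    n = len(series)
    n_blocks = n // block_len + 1
    starts = rng.integers(0, n - block_len + 1, size=(n_rep, n_blocks))
    reps = []
    for row in starts:
        rep = np.concatenate([series[s:s + block_len] for s in row])[:n]
        reps.append(rep)
    return np.asarray(reps)

# Toy multi-season "streamflow": annual cycle plus noise (assumed, not real data).
t = np.arange(240)                                   # 20 years of monthly values
flow = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, size=240)

synthetic = block_bootstrap(flow, block_len=12, n_rep=100)
```

In a hybrid scheme like the one described above, such nonparametric replicates would be combined with a fitted PAR(1) component, with the blend tuned against storage-related objectives rather than likelihood alone.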
Nasr Esfahani, Bahram; Moghim, Sharareh; Ghasemian Safaei, Hajieh; Moghoofei, Mohsen; Sedighi, Mansour; Hadifar, Shima
2016-01-01
Background: Taxonomic and phylogenetic studies of Mycobacterium species have been based around the 16S rRNA gene for many years. However, due to the high sequence similarity between species in the Mycobacterium genus (94.3% - 100%), defining a valid phylogenetic tree is difficult; consequently, its use in estimating the boundaries between species is limited. The sequence of the rpoB gene makes it an appropriate gene for phylogenetic analysis, especially in bacteria with limited variation. Objectives: In the present study, a 360-bp sequence of rpoB was used for precise classification of Mycobacterium strains isolated in Isfahan, Iran. Materials and Methods: From February to October 2013, 57 clinical and environmental isolates were collected, subcultured, and identified by phenotypic methods. After DNA extraction, a 360-bp fragment was PCR-amplified and sequenced. The phylogenetic tree was constructed based on consensus sequence data, using MEGA5 software. Results: Slow- and fast-growing groups of the Mycobacterium strains were clearly differentiated based on the constructed tree of 56 common Mycobacterium isolates. Each species was identified with a unique title in the tree; in total, 13 nodes with a bootstrap value of over 50% were supported. Among the slow-growing group was Mycobacterium kansasii, with M. tuberculosis in a cluster with a bootstrap value of 98%, and M. gordonae in another cluster with a bootstrap value of 90%. In the fast-growing group, one cluster with a bootstrap value of 89% was defined, including all fast-growing members present in this study. Conclusions: The results suggest that the rpoB gene sequence alone is sufficient for taxonomic categorization and definition of a new Mycobacterium species, due to its high resolution power and proper variation in its sequence (85% - 100%); the resulting tree has high validity. PMID:27284397
NASA Astrophysics Data System (ADS)
Nagai, Yukie; Hosoda, Koh; Morita, Akio; Asada, Minoru
This study argues how human infants acquire the ability of joint attention through interactions with their caregivers, from the viewpoint of cognitive developmental robotics. In this paper, a mechanism by which a robot acquires sensorimotor coordination for joint attention through bootstrap learning is described. Bootstrap learning is a process by which a learner acquires higher capabilities through interactions with its environment, based on embedded lower capabilities, even if the learner receives no external evaluation and the environment is not controlled. The proposed mechanism for bootstrap learning of joint attention consists of two mechanisms embedded in the robot: visual attention and learning with self-evaluation. The former finds and attends to a salient object in the robot's field of view, and the latter evaluates the success of visual attention, not joint attention, and then learns the sensorimotor coordination. Since the object which the robot looks at based on visual attention does not always correspond to the object which the caregiver is looking at in an environment including multiple objects, the robot may encounter incorrect learning situations for joint attention as well as correct ones. However, the robot is expected to statistically discard the learning data from the incorrect situations as outliers, because of their weaker correlation between sensor input and motor output compared with the correct ones, and consequently to acquire appropriate sensorimotor coordination for joint attention even if the caregiver does not provide any task evaluation to the robot. The experimental results show the validity of the proposed mechanism. It is suggested that the proposed mechanism could explain the developmental mechanism of infants' joint attention, because the learning process of the robot's joint attention can be regarded as equivalent to the developmental process in infants.
Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; ...
2014-06-03
A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data, and the resulting resistivity tomograph was used as the prior information for nonlinear inversion of time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. Then the mean and standard deviation of CO₂ saturation were calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6% with a corresponding maximum saturation of 30% for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation, but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data, and on inversion constraints such as temporal roughness. Five hundred realizations requiring 3.5 h on a single 12-core node were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances, while the Markov Chain Monte Carlo (MCMC) stochastic inverse approach may expend days for a global search.
This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.
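The three-step parametric bootstrap above (estimate the noise, invert once, then repeatedly re-invert data resampled from the fitted model) can be illustrated on a toy one-dimensional nonlinear inversion. Everything here, the exponential forward model, the grid-search "inversion", and the noise level, is an assumption standing in for the actual ERT problem.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy nonlinear problem: recover a decay rate m from y = exp(-m x) + noise.
x = np.linspace(0, 3, 40)
m_true, sigma = 0.8, 0.02       # sigma plays the role of the estimated
y_obs = np.exp(-m_true * x) + rng.normal(0, sigma, x.size)  # observation error

def invert(y):
    """Deterministic nonlinear inverse solve (here: a 1-D grid search)."""
    m_grid = np.linspace(0.1, 2.0, 2000)
    misfit = ((np.exp(-np.outer(m_grid, x)) - y) ** 2).sum(axis=1)
    return m_grid[misfit.argmin()]

m_hat = invert(y_obs)           # baseline inversion

# Parametric bootstrap: resample data from the fitted model with the
# estimated noise, and re-run the deterministic inversion each time.
boot = np.array([invert(np.exp(-m_hat * x) + rng.normal(0, sigma, x.size))
                 for _ in range(200)])
m_mean, m_std = boot.mean(), boot.std()
```

As in the paper, the spread of the bootstrap solutions (m_std) quantifies how observation errors propagate through the nonlinear inversion, at the cost of many deterministic solves rather than a full MCMC search.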
Analyzing hospitalization data: potential limitations of Poisson regression.
Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R
2015-08-01
Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
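The two Poisson-assumption violations reported above, overdispersion and an excess of zeros, are simple to diagnose before choosing a count model. A sketch on simulated counts loosely mimicking the paper's setting (58% of patients never hospitalized); the distributions and parameters are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy hospitalization counts: structural zeros plus overdispersed counts.
n = 313
never = rng.random(n) < 0.58                         # "never hospitalized" group
days = np.where(never, 0, rng.negative_binomial(1.2, 0.25, size=n))

mean, var = days.mean(), days.var()
overdispersed = var > mean                    # violates Poisson's var == mean
zero_frac = (days == 0).mean()

# Quick excess-zeros check: zero fraction expected under a Poisson
# with the same mean vs. the observed zero fraction.
poisson_zero = np.exp(-mean)
excess_zeros = zero_frac > poisson_zero
```

When both diagnostics fire, as they do here by construction, negative binomial or zero-inflated negative binomial regression is the natural candidate, matching the paper's model-selection conclusion.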
A Bootstrap Approach to an Affordable Exploration Program
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has had a history of flight programs with high development and operational costs. Since Apollo, human exploration has had very constrained budgets, and budgets are expected to be constrained in the future. Due to their high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics cost. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of resources available. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon the initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows, becomes self-sustaining, and eventually begins producing the energy, products, and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials.
These technologies will exploit the space environment, minimize dependencies, and minimize the need for imported resources. They will provide the widest range of utility in a resource scarce environment and pave the way to an affordable exploration program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.
The nature of the multi-modal n=2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L- versus H-mode conditions. Varying the relative phase (ΔΦ_UL) between upper and lower in-vessel coils demonstrates that different n=2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same ΔΦ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the ΔΦ_UL dependence of both the global confinement and the n=2 magnetic response. As the edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same ΔΦ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and correlated global confinement impact is unchanged with plasma pressure, but is strongly reduced in high collisionality conditions in both H- and L-mode.
This experimentally suggests the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Furthermore, holding truncation fixed, most HFS experimental trends are not captured, thus demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.
NASA Astrophysics Data System (ADS)
Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.; Nazikian, R.; Strait, E. J.; Chen, X.; Ferraro, N. M.; King, J. D.; Lyons, B. C.; Park, J.-K.
2016-05-01
The nature of the multi-modal n = 2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L- versus H-mode conditions. Varying the relative phase (Δφ_UL) between upper and lower in-vessel coils demonstrates that different n = 2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same Δφ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the Δφ_UL dependence of both the global confinement and the n = 2 magnetic response. As the edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same Δφ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and correlated global confinement impact is unchanged with plasma pressure, but is strongly reduced in high collisionality conditions in both H- and L-mode.
This experimentally suggests the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Holding truncation fixed, most HFS experimental trends are not captured, thus demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.
Eckermann, Simon; Karnon, Jon; Willan, Andrew R
2010-01-01
Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. 
More complex VOI methods such as bootstrapping of the expected value of partial EVPI may have potential value in refining overall research design. However, Occam's razor must be seriously considered in application of these VOI methods, given their increased complexity and current limitations in informing decision making, with restriction to EVPI rather than EVSI and not allowing for important decision-making contexts. Initial use of CLT methods to focus these more complex partial VOI methods towards where they may be useful in refining optimal overall trial design is suggested. Integrating CLT methods with such partial VOI methods to allow estimation of partial EVSI is suggested in future research to add value to the current VOI toolkit.
Neoclassical transport in toroidal plasmas with nonaxisymmetric flux surfaces
Belli, Emily A.; Candy, Jefferey M.
2015-04-15
The capability to treat nonaxisymmetric flux surface geometry has been added to the drift-kinetic code NEO. Geometric quantities (i.e. metric elements) are supplied by a recently-developed local 3D equilibrium solver, allowing neoclassical transport coefficients to be systematically computed while varying the 3D plasma shape in a simple and intuitive manner. Code verification is accomplished via detailed comparison with 3D Pfirsch–Schlüter theory. A discussion of the various collisionality regimes associated with 3D transport is given, with an emphasis on non-ambipolar particle flux, neoclassical toroidal viscosity, energy flux and bootstrap current. As a result, we compute the transport in the presence of ripple-type perturbations in a DIII-D-like H-mode edge plasma.
Refining search terms for nanotechnology
NASA Astrophysics Data System (ADS)
Porter, Alan L.; Youtie, Jan; Shapira, Philip; Schoeneck, David J.
2008-05-01
The ability to delineate the boundaries of an emerging technology is central to obtaining an understanding of the technology's research paths and commercialization prospects. Nowhere is this more relevant than in the case of nanotechnology (hereafter identified as "nano") given its current rapid growth and multidisciplinary nature. (Under the rubric of nanotechnology, we also include nanoscience and nanoengineering.) Past efforts have utilized several strategies, including simple term search for the prefix nano, complex lexical and citation-based approaches, and bootstrapping techniques. This research introduces a modularized Boolean approach to defining nanotechnology which has been applied to several research and patenting databases. We explain our approach to downloading and cleaning data, and report initial results. Comparisons of this approach with other nanotechnology search formulations are presented. Implications for search strategy development and profiling of the nanotechnology field are discussed.
Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity
Beasley, T. Mark
2013-01-01
Increasing the correlation between the independent variable and the mediator (the a coefficient) increases the effect size (ab) for mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard errors of product tests increase. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, thus explaining the power decline with increases in a. Implications for statistical analysis and applied researchers are discussed. PMID:24954952
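The mechanism this abstract describes, i.e. the bootstrap distribution of the indirect effect ab widening as a grows, can be illustrated with a minimal numpy sketch. The data, coefficients, and sample size here are invented for illustration and are not from the paper; slopes are estimated by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)            # a path: X -> M
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # b path (M -> Y) and direct path c'

def indirect(x, m, y):
    # a: slope of M on X; b: slope of Y on M, controlling for X.
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

# Nonparametric bootstrap of ab: resample rows with replacement, re-estimate.
ab_boot = []
idx = np.arange(n)
for _ in range(1000):
    s = rng.choice(idx, size=n, replace=True)
    ab_boot.append(indirect(x[s], m[s], y[s]))

lo, hi = np.percentile(ab_boot, [2.5, 97.5])  # percentile confidence interval
```

The spread of `ab_boot` is the bootstrap analogue of the product-test standard error discussed above; re-running with a larger a path shows the interval widening even as ab itself grows.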
The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.
Rodgers, J L
1999-10-01
A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
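The taxonomy above separates the three schemes by sampling mode (with versus without replacement) and by how much of the sample is replaced. A minimal numpy sketch (with made-up data, not from the paper) makes the distinction concrete:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([4.1, 5.0, 3.8, 6.2, 5.5, 4.7])
n = len(x)

# Bootstrap: resample the WHOLE sample WITH replacement.
boot_means = [rng.choice(x, size=n, replace=True).mean() for _ in range(2000)]
boot_se = np.std(boot_means)  # empirical standard error of the mean

# Jackknife: a SUBSET (leave-one-out) WITHOUT replacement.
jack_means = [np.delete(x, i).mean() for i in range(n)]

# Randomization test: permute group labels (whole pooled sample, WITHOUT replacement).
y = np.array([5.9, 6.4, 7.1, 5.2, 6.8, 7.0])
observed = y.mean() - x.mean()
pooled = np.concatenate([x, y])
perm_diffs = []
for _ in range(2000):
    p = rng.permutation(pooled)
    perm_diffs.append(p[n:].mean() - p[:n].mean())
p_value = np.mean(np.abs(perm_diffs) >= abs(observed))
```

Each block builds an empirical sampling distribution, which is the shared goal the taxonomy identifies; only the resampling rule differs.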
NASA Astrophysics Data System (ADS)
Kukkonen, M.; Maltamo, M.; Packalen, P.
2017-08-01
Image matching is emerging as a compelling alternative to airborne laser scanning (ALS) as a data source for forest inventory and management. There is currently an open discussion in the forest inventory community about whether, and to what extent, the new method can be applied to practical inventory campaigns. This paper aims to contribute to this discussion by comparing two different image matching algorithms (Semi-Global Matching [SGM] and Next-Generation Automatic Terrain Extraction [NGATE]) and ALS in a typical managed boreal forest environment in southern Finland. Spectral features from unrectified aerial images were included in the modeling and the potential of image matching in areas without a high resolution digital terrain model (DTM) was also explored. Plot level predictions for total volume, stem number, basal area, height of basal area median tree and diameter of basal area median tree were modeled using an area-based approach. Plot level dominant tree species were predicted using a random forest algorithm, also using an area-based approach. The statistical difference between the error rates from different datasets was evaluated using a bootstrap method. Results showed that ALS outperformed image matching with every forest attribute, even when a high resolution DTM was used for height normalization and spectral information from images was included. Dominant tree species classification with image matching achieved accuracy levels similar to ALS regardless of the resolution of the DTM when spectral metrics were used. Neither of the image matching algorithms consistently outperformed the other, but there were noticeably different error rates depending on the parameter configuration, spectral band, resolution of DTM, or response variable. This study showed that image matching provides reasonable point cloud data for forest inventory purposes, especially when a high resolution DTM is available and information from the understory is redundant.
Robot Behavior Acquisition: Superposition and Compositing of Behaviors Learned through Teleoperation
NASA Technical Reports Server (NTRS)
Peters, Richard Alan, II
2004-01-01
Superposition of a small set of behaviors, learned via teleoperation, can lead to robust completion of a simple articulated reach-and-grasp task. Results support the hypothesis that a set of learned behaviors can be combined to generate new behaviors of a similar type. This supports the hypothesis that a robot can learn to interact purposefully with its environment through a developmental acquisition of sensory-motor coordination. Teleoperation bootstraps the process by enabling the robot to observe its own sensory responses to actions that lead to specific outcomes. A reach-and-grasp task, learned by an articulated robot through a small number of teleoperated trials, can be performed autonomously with success in the face of significant variations in the environment and perturbations of the goal. Superpositioning was performed using the Verbs and Adverbs algorithm that was developed originally for the graphical animation of articulated characters. Work was performed on Robonaut at NASA-JSC.
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lewicki, Scott; Morgan, Scott
2011-01-01
The measurement techniques for organizations which have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to effectively measure when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.
Gkorezis, Panagiotis; Panagiotou, Maria; Theodorou, Mamas
2016-10-01
The aim of this study was to examine the direct and indirect effect, through organizational identification, of workplace ostracism on nurses' silence towards patient safety. Employee silence in nursing has recently received attention in relation to its antecedents. Yet, very little is known about the role of workplace ostracism in generating nurses' silence. A cross-sectional survey was conducted in a public hospital in Cyprus. Data were collected from 157 nurses employed in a public hospital of Cyprus between November 2014 and January 2015. To examine the present hypotheses, bootstrapping analysis and the Sobel test were conducted. Results demonstrated that workplace ostracism has an effect on nurses' silence towards patient safety. Moreover, this effect was partially mediated through organizational identification. Workplace ostracism among nurses significantly affects both nurses' attitudes and behaviour, namely organizational identification and employee silence. © 2016 John Wiley & Sons Ltd.
Analysis of genetic diversity of Persea bombycina "Som" using RAPD-based molecular markers.
Bhau, Brijmohan Singh; Medhi, Kalyani; Das, Ambrish P; Saikia, Siddhartha P; Neog, Kartik; Choudhury, S N
2009-08-01
The utility of RAPD markers in assessing genetic diversity and phenetic relationships in Persea bombycina, a major tree species for golden silk (muga) production, was investigated using 48 genotypes from northeast India. Thirteen RAPD primer combinations generated 93 bands. On average, seven RAPD fragments were amplified per reaction. In a UPGMA phenetic dendrogram based on Jaccard's coefficient, the P. bombycina accessions showed a high level of genetic variation, as indicated by genetic similarity. The grouping in the phenogram was highly consistent, as indicated by high values of cophenetic correlation and high bootstrap values at the key nodes. The accessions were scattered on a plot derived from principal correspondence analysis. The study concluded that the high level of genetic diversity in the P. bombycina accessions may be attributed to the species' outcrossing nature. This study may be useful in identifying diverse genetic stocks of P. bombycina, which may then be conserved on a priority basis.
Non-Solenoidal Startup Research Directions on the Pegasus Toroidal Experiment
NASA Astrophysics Data System (ADS)
Fonck, R. J.; Bongard, M. W.; Lewicki, B. T.; Reusch, J. A.; Winz, G. R.
2017-10-01
The Pegasus research program has been focused on developing a physical understanding and predictive models for non-solenoidal tokamak plasma startup using Local Helicity Injection (LHI). LHI employs strong localized electron currents injected along magnetic field lines in the plasma edge that relax through magnetic turbulence to form a tokamak-like plasma. Pending approval, the Pegasus program will address a broader, more comprehensive examination of non-solenoidal tokamak startup techniques. New capabilities may include: increasing the toroidal field to 0.6 T to support critical scaling tests to near-NSTX-U field levels; deploying internal plasma diagnostics; installing a coaxial helicity injection (CHI) capability in the upper divertor region; and deploying a modest (200-400 kW) electron cyclotron RF capability. These efforts will address scaling of relevant physics to higher BT, separate and comparative studies of helicity injection techniques, efficiency of handoff to consequent current sustainment techniques, and the use of ECH to synergistically improve the target plasma for consequent bootstrap and neutral beam current drive sustainment. This has an ultimate goal of validating techniques to produce a 1 MA target plasma in NSTX-U and beyond. Work supported by US DOE Grant DE-FG02-96ER54375.
de Paiva, Alessandra Marques; Barberena, Felipe Fajardo Villela Antolin; Lopes, Rosana Conrado
2016-06-01
Brazil holds most of the Atlantic Forest Domain and is also one of the Rubiaceae diversity centers in the Neotropics. Despite the urban expansion in the state of Rio de Janeiro, large areas of continuous vegetation with a high degree of connectivity can still be found. Recently, new Rubiaceae species have been described in the Rio de Janeiro flora, which present small populations and very particular distributions. The current paper analyzed the similarity in the floristic composition of the Rubiaceae in eight Atlantic Forest remnants of Rio de Janeiro state protected by Conservation Units. We also surveyed and set guidelines for conservation of microendemic species. The similarity analyses were based on previously published studies in Área de Proteção Ambiental de Grumari, Área de Proteção Ambiental Palmares, Parque Estadual da Serra da Tiririca, Parque Nacional do Itatiaia, Parque Nacional de Jurubatiba, Reserva Biológica de Poço das Antas, Reserva Biológica do Tinguá and Reserva Ecológica de Macaé de Cima - using the PAST software ("Paleontological Statistics") with the Sørensen coefficient. The floristic similarity analysis revealed two groups with distinct physiographic characteristics and different vegetation types. Group A consisted of two Restinga areas, Área de Proteção Ambiental de Grumari and Parque Nacional de Jurubatiba, which showed strong bootstrap support (98 %). Group B included forest remnants with distinct phytophysiognomies or altitudes, but with moderate bootstrap support. Low similarity levels among the eight areas were found due to the habitats' heterogeneity. The current study pointed out 19 microendemic species from the Atlantic Forest; they present a single-site distribution or a distribution restricted to the Mountain and Metropolitan regions of Rio de Janeiro state. Concerning the conservation status of microendemic species, discrepancies between the Catalogue of Flora of Rio de Janeiro and the Red Book of Brazilian Flora (two of the main reference catalogs of Brazilian flora) have been identified. We have also highlighted the need for recollecting microendemic species from the Atlantic Forest, and for properly assessing, at an early stage, the degree of threat faced by these taxa.
Small area estimation of proportions with different levels of auxiliary data.
Chandra, Hukum; Kumar, Sushil; Aditya, Kaustav
2018-03-01
Binary data are of interest in many small area applications. The use of standard small area estimation methods based on linear mixed models becomes problematic for such data. An empirical plug-in predictor (EPP) under a unit-level generalized linear mixed model with logit link function is often used for the estimation of a small area proportion. However, this EPP requires the availability of unit-level population information for auxiliary data that may not always be accessible. As a consequence, in many practical situations, this EPP approach cannot be applied. Based on the level of auxiliary information available, different small area predictors for estimation of proportions are proposed. Analytic and bootstrap approaches to estimating the mean squared error of the proposed small area predictors are also developed. Monte Carlo simulations based on both simulated and real data show that the proposed small area predictors work well for generating the small area estimates of proportions and represent a practical alternative to the above approach. The developed predictor is applied to generate estimates of the proportions of indebted farm households at district-level using debt investment survey data from India. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Tests for informative cluster size using a novel balanced bootstrap scheme.
Nevalainen, Jaakko; Oja, Hannu; Datta, Somnath
2017-07-20
Clustered data are often encountered in biomedical studies, and to date, a number of approaches have been proposed to analyze such data. However, the phenomenon of informative cluster size (ICS) is a challenging problem, and its presence has an impact on the choice of a correct analysis methodology. For example, Dutta and Datta (2015, Biometrics) presented a number of marginal distributions that could be tested. Depending on the nature and degree of informativeness of the cluster size, these marginal distributions may differ, as do the choices of the appropriate test. In particular, they applied their new test to a periodontal data set where the plausibility of the informativeness was mentioned, but no formal test for the same was conducted. We propose bootstrap tests for testing the presence of ICS. A balanced bootstrap method is developed to successfully estimate the null distribution by merging the re-sampled observations with closely matching counterparts. Relying on the assumption of exchangeability within clusters, the proposed procedure performs well in simulations even with a small number of clusters, at different distributions and against different alternative hypotheses, thus making it an omnibus test. We also explain how to extend the ICS test to a regression setting, thereby enhancing its practical utility. The methodologies are illustrated using the periodontal data set mentioned earlier. Copyright © 2017 John Wiley & Sons, Ltd.
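The informativeness this abstract tests for means that cluster size carries information about the response. A naive resampling sketch of that idea (not the balanced bootstrap scheme the paper develops, and with entirely invented data) checks whether cluster size is associated with the cluster mean by permuting sizes against responses:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate 60 clusters whose mean response drifts upward with cluster size (ICS).
sizes = rng.integers(2, 15, size=60)
cluster_means = [np.mean(0.1 * m + rng.normal(size=m)) for m in sizes]

# Observed association between cluster size and cluster mean.
obs = np.corrcoef(sizes, cluster_means)[0, 1]

# Null resampling: if size is uninformative, shuffling sizes against the
# cluster means should produce correlations as large as the observed one.
null = []
for _ in range(2000):
    null.append(np.corrcoef(rng.permutation(sizes), cluster_means)[0, 1])
p_value = np.mean(np.abs(null) >= abs(obs))
```

A small p-value flags ICS in this toy setup; the paper's balanced scheme is needed for valid inference in realistic settings where within-cluster exchangeability must be exploited carefully.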
Chaudhuri, Shomesh E; Merfeld, Daniel M
2013-03-01
Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
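The fitting-and-resampling workflow described above can be sketched in numpy alone. This is not the bias-reduced fit or the adaptive-staircase data the paper analyzes: it is a plain grid-search maximum-likelihood fit of a cumulative Gaussian to fixed-level yes/no data, followed by a parametric bootstrap of the spread parameter; all stimulus levels, trial counts, and parameter values are invented for illustration.

```python
import numpy as np
from math import erf, sqrt

def phi(z):
    # Standard normal CDF (cumulative Gaussian psychometric function).
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(2)
levels = np.array([-3.0, -1.5, 0.0, 1.5, 3.0])   # stimulus levels
n_per = 50                                       # trials per level
mu0, sig0 = 0.0, 1.5                             # true threshold and spread
k = rng.binomial(n_per, [phi((s - mu0) / sig0) for s in levels])

# Precompute response probabilities over a (mu, sigma) grid.
mus = np.linspace(-1.0, 1.0, 41)
sigs = np.linspace(0.5, 3.0, 51)
Z = (levels[None, None, :] - mus[:, None, None]) / sigs[None, :, None]
P = np.clip(np.vectorize(phi)(Z), 1e-9, 1 - 1e-9)

def fit(k_obs):
    # Grid-search maximum likelihood for (mu, sigma) given binomial counts.
    ll = (k_obs * np.log(P) + (n_per - k_obs) * np.log(1 - P)).sum(axis=2)
    i, j = np.unravel_index(np.argmax(ll), ll.shape)
    return mus[i], sigs[j]

mu_hat, sig_hat = fit(k)

# Parametric bootstrap: simulate from the fitted curve, refit each replicate,
# and use the spread of refitted sigmas as its standard-error estimate.
p_hat = np.array([phi((s - mu_hat) / sig_hat) for s in levels])
sig_boot = [fit(rng.binomial(n_per, p_hat))[1] for _ in range(100)]
se_sigma = np.std(sig_boot)
```

The bootstrap loop is the expensive part, which is the trade-off the abstract quantifies against the near-instant observed-information estimate.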
Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.
Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G
2016-01-01
Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
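The bootstrap quantile idea described above, i.e. retaining predictors whose bootstrap confidence interval excludes zero, can be sketched with plain OLS in place of the penalized regression the paper pairs it with; the data and coefficients below are simulated, not from the neuroimaging application.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 120, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0])  # only two true predictors
y = X @ beta + rng.normal(size=n)

# Bootstrap the coefficient vector: resample rows, refit, collect estimates.
B = 1000
coefs = np.empty((B, p))
idx = np.arange(n)
for b in range(B):
    s = rng.choice(idx, size=n, replace=True)
    coefs[b] = np.linalg.lstsq(X[s], y[s], rcond=None)[0]

# Quantile rule: keep predictors whose 95% bootstrap interval excludes zero.
lo, hi = np.percentile(coefs, [2.5, 97.5], axis=0)
selected = (lo > 0) | (hi < 0)
```

With a lasso or elastic-net fit inside the loop (as in the paper), the same quantile rule yields sparser, more stable selections when predictors are highly correlated.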
More N = 4 superconformal bootstrap
NASA Astrophysics Data System (ADS)
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C.
2017-08-01
In this long overdue second installment, we continue to develop the conformal bootstrap program for N = 4 superconformal field theories (SCFTs) in four dimensions via an analysis of the correlation function of four stress-tensor supermultiplets. We review analytic results for this correlator and make contact with the SCFT/chiral algebra correspondence of Beem et al. [Commun. Math. Phys. 336, 1359 (2015), 10.1007/s00220-014-2272-x]. We demonstrate that the constraints of unitarity and crossing symmetry require the central charge c to be greater than or equal to 3/4 in any interacting N = 4 SCFT. We apply numerical bootstrap methods to derive upper bounds on scaling dimensions and operator product expansion coefficients for several low-lying, unprotected operators as a function of the central charge. We interpret our bounds in the context of N = 4 super Yang-Mills theories, formulating a series of conjectures regarding the embedding of the conformal manifold, parametrized by the complexified gauge coupling, into the space of scaling dimensions and operator product expansion coefficients. Our conjectures assign a distinguished role to points on the conformal manifold that are self-dual under a subgroup of the S-duality group. This paper contains a more detailed exposition of a number of results previously reported in Beem et al. [Phys. Rev. Lett. 111, 071601 (2013), 10.1103/PhysRevLett.111.071601] in addition to new results.
Roberts, Steven; Martin, Michael A
2010-01-01
Concerns have been raised about findings of associations between particulate matter (PM) air pollution and mortality that have been based on a single "best" model arising from a model selection procedure, because such a strategy may ignore the model uncertainty inherently involved in searching through a set of candidate models to find the best model. Model averaging has been proposed as a method of allowing for model uncertainty in this context. Our objective was to propose an extension (double BOOT) to a previously described bootstrap model-averaging procedure (BOOT) for use in time series studies of the association between PM and mortality. We compared double BOOT and BOOT with Bayesian model averaging (BMA) and a standard method of model selection [standard Akaike's information criterion (AIC)]. Actual time series data from the United States were used to conduct a simulation study to compare and contrast the performance of double BOOT, BOOT, BMA, and standard AIC. Double BOOT produced estimates of the effect of PM on mortality that had smaller root mean squared error than those produced by BOOT, BMA, and standard AIC. This performance boost resulted from estimates produced by double BOOT having smaller variance than those produced by BOOT and BMA. Double BOOT is a viable alternative to BOOT and BMA for producing estimates of the mortality effect of PM.
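The core of a bootstrap model-averaging procedure like BOOT is to re-run the model selection step inside each bootstrap replicate, so the final estimate averages over the models that selection would have picked on different samples. A minimal numpy sketch of that idea (a generic polynomial-regression toy, not the paper's PM-mortality models or its double BOOT extension):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.uniform(-2, 2, n)
y = 1.0 + 0.8 * x + rng.normal(scale=0.5, size=n)  # truth is linear

def design(x, deg):
    # Polynomial design matrix [1, x, x^2, ...] up to the given degree.
    return np.vander(x, deg + 1, increasing=True)

def aic_fit(x, y, deg):
    # Least-squares fit plus AIC score for this candidate model.
    X = design(x, deg)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return len(y) * np.log(rss / len(y)) + 2 * (deg + 1), beta

x0 = 1.0   # point at which to predict
preds = []
for _ in range(500):
    s = rng.integers(0, n, n)          # bootstrap resample of the rows
    xb, yb = x[s], y[s]
    # Re-run AIC model selection on THIS resample, then predict at x0.
    fits = [aic_fit(xb, yb, d) for d in (0, 1, 2, 3)]
    best = int(np.argmin([f[0] for f in fits]))
    preds.append(design(np.array([x0]), best)[0] @ fits[best][1])

avg_pred = np.mean(preds)  # model-averaged prediction
```

Because selection is repeated per replicate, the spread of `preds` reflects model uncertainty as well as sampling variability, which a single post-selection fit would understate.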
Bootstrapping of Life through Holonomy and Self-modification
NASA Astrophysics Data System (ADS)
Kazansky, Alexander B.
2010-11-01
Life on Earth demonstrates not only adaptive, cognitive, and in particular anticipatory properties, but also an active, transformative influence on its local and global environment. As V. Vernadsky stated, life is a powerful geological force. Charles Darwin realized this too: in his last work [1] he showed that earthworms, through their vital activity on a geological time scale, are able to form and support the contemporary structure of soil across the whole planet. Locally, through the so-called process of niche construction [2], organisms modify abiotic and biotic factors of natural selection and thereby insert a feedback loop into the evolutionary process. Stigmergy [3] is another form of indirect interaction of organisms via the environment, mediated by signs left in the local environment or simply by working activity performed in swarms, leading to self-organization and coordination of actions in the construction of refuges. In the organization of life we can distinguish active, rigid, organism-like (autopoietic-like) systems from less rigid, sympoietic systems of a socio-biological type [4]. Nevertheless, all forms of living systems demonstrate so-called bootstrapping, a spontaneous process of self-organizing emergence. This process is feasible thanks to self-modification and to holonomy, or total reflexivity, in their organization. The analysis of the role of indirect interactions in bootstrapping made in this paper is aimed at revealing relationships between these concepts and at taking a step toward a new systemic model of the organization and evolution of a special dual pair: biota and biosphere.
A neurocomputational theory of how explicit learning bootstraps early procedural learning.
Paul, Erick J; Ashby, F Gregory
2013-01-01
It is widely accepted that human learning and memory are mediated by multiple memory systems that are each best suited to different requirements and demands. Within the domain of categorization, at least two systems are thought to facilitate learning: an explicit (declarative) system depending largely on the prefrontal cortex, and a procedural (non-declarative) system depending on the basal ganglia. Substantial evidence suggests that each system is optimally suited to learn particular categorization tasks. However, it remains unknown precisely how these systems interact to produce optimal learning and behavior. In order to investigate this issue, the present research evaluated the progression of learning through simulation of categorization tasks using COVIS, a well-known model of human category learning that includes both explicit and procedural learning systems. Specifically, the model's parameter space was thoroughly explored in procedurally learned categorization tasks across a variety of conditions and architectures to identify plausible interaction architectures. The simulation results support the hypothesis that one-way interaction between the systems occurs such that the explicit system "bootstraps" learning early on in the procedural system. Thus, the procedural system initially learns a suboptimal strategy employed by the explicit system and later refines its strategy. This bootstrapping could be from cortical-striatal projections that originate in premotor or motor regions of cortex, or possibly by the explicit system's control of motor responses through basal ganglia-mediated loops.
Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan
2013-01-01
In this article we use a latent class model (LCM) with prevalence modeled as a function of covariates to assess diagnostic test accuracy in situations where the true disease status is not observed, but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate parameters of interest; namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of estimated parameters, the missing information principle is applied to adjust information matrix estimates. We compare the adjusted information matrix based standard error estimates with the bootstrap standard error estimates, both obtained using the fast MCEM algorithm, through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach estimates the standard error similarly to the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test and a human papillomavirus (HPV) DNA test. PMID:24163493