Sample records for bootstrap current fractions

  1. Transport modeling of the DIII-D high $\beta_p$ scenario and extrapolations to ITER steady-state operation

    DOE PAGES

    McClenaghan, Joseph; Garofalo, Andrea M.; Meneghini, Orso; ...

    2017-08-03

    In this study, transport modeling of a proposed ITER steady-state scenario based on DIII-D high poloidal-beta ($\beta_p$) discharges finds that ITB formation can occur with either sufficient rotation or a negative central shear q-profile. The high $\beta_p$ scenario is characterized by a large bootstrap current fraction (80%) which reduces the demands on the external current drive, and a large radius internal transport barrier which is associated with excellent normalized confinement. Modeling predictions of the electron transport in the high $\beta_p$ scenario improve as $q_{95}$ approaches levels similar to typical existing models of ITER steady-state and the ion transport is turbulence dominated. Typical temperature and density profiles from the non-inductive high $\beta_p$ scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving a $Q=5$ steady-state fusion gain in ITER with 'day one' heating and current drive capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. A high bootstrap fraction, high $\beta_p$ scenario is found to be near an ITB formation threshold, and either strong negative central magnetic shear or rotation in a high bootstrap fraction plasma is found to successfully provide the turbulence suppression required to achieve $Q=5$.

  2. Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime

    NASA Astrophysics Data System (ADS)

    Ren, Q.

    2015-11-01

    Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTf). Large bootstrap fraction (fBS ~ 80%) has been obtained with high βp. ITBs have been shown to be compatible with steady state operation. Because of the unusually large ITB radius, normalized pressure is not limited to low βN values by internal ITB-driven modes. βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows first-principle NEO is in good agreement with experiments for the bootstrap current calculation and ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport. Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC02-09CH11466, and the NMCFP of China under 2015GB110000 and 2015GB102000.

  3. Joint DIII-D/EAST Experiments Toward Steady State AT Demonstration

    NASA Astrophysics Data System (ADS)

    Garofalo, A. M.; Meneghini, O.; Staebler, G. M.; van Zeeland, M. A.; Gong, X.; Ding, S.; Qian, J.; Ren, Q.; Xu, G.; Grierson, B. A.; Solomon, W. M.; Holcomb, C. T.

    2015-11-01

    Joint DIII-D/EAST experiments on fully noninductive operation at high poloidal beta have demonstrated several attractive features of this regime for a steady-state fusion reactor. Very large bootstrap fraction (>80%) is desirable because it reduces the demands on external noninductive current drive. High bootstrap fraction with an H-mode edge results in a broad current profile and internal transport barriers (ITBs) at large minor radius, leading to high normalized energy confinement and high MHD stability limits. The ITB radius expands with higher normalized beta, further improving both stability and confinement. Electron density ITB and large Shafranov shift lead to low AE activity in the plasma core and low anomalous fast ion losses. Both the ITB and the current profile show remarkable robustness against perturbations, without external control. Supported by US DOE under DE-FC02-04ER54698, DE-AC02-09CH11466 & DE-AC52-07NA27344 & by NMCFSP under contracts 2015GB102000 and 2015GB110001.

  4. Transport Barriers in Bootstrap Driven Tokamaks

    NASA Astrophysics Data System (ADS)

    Staebler, Gary

    2017-10-01

    Maximizing the bootstrap current in a tokamak, so that it drives a high fraction of the total current, reduces the external power required to drive current by other means. Improved energy confinement, relative to empirical scaling laws, enables a reactor to more fully take advantage of the bootstrap driven tokamak. Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis and quasilinear predictive modeling demonstrate that the observed transport barrier is due to the suppression of turbulence, primarily by the large Shafranov shift. ExB velocity shear does not play a significant role in the transport barrier due to the high safety factor. It will be shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift driven barrier formation. The ion energy transport is reduced to neoclassical, and electron energy and particle transport is reduced, but still turbulent, within the barrier. Deeper into the plasma, very large levels of electron transport are observed. The observed electron temperature profile is shown to be close to the threshold for the electron temperature gradient (ETG) mode. A large ETG driven energy transport is qualitatively consistent with recent multi-scale gyrokinetic simulations showing that reducing the ion scale turbulence can lead to a large increase in the electron scale transport. A new saturation model for the quasilinear TGLF transport code that fits these multi-scale gyrokinetic simulations can match the data if the impact of zonal flow mixing on the ETG modes is reduced at high safety factor. This work was supported by the U.S. Department of Energy under DE-FG02-95ER54309 and DE-FC02-04ER54698.

  5. Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime

    DOE PAGES

    Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; ...

    2016-06-20

    Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of β_p and β_N despite strong ITBs. Good confinement has been achieved with reduced toroidal rotation. These high β_p plasmas challenge the energy transport understanding, especially in the electron energy channel. A new turbulent transport model, named TGLF-SAT1, has been developed which improves the transport prediction. Experiments extending results to long pulse on EAST, based on the physics basis developed at DIII-D, have been conducted. Finally, more investigations will be carried out on EAST with additional auxiliary power to come online in the near term.

  6. Limitations of bootstrap current models

    DOE PAGES

    Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...

    2014-03-27

    We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans as well as core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling – an approximation inherent to both analytic models – is quantified. Moreover, the implications of the corrections from kinetic NEO simulations on MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.

  7. Empirical single sample quantification of bias and variance in Q-ball imaging.

    PubMed

    Hainline, Allison E; Nath, Vishwesh; Parvathaneni, Prasanna; Blaber, Justin A; Schilling, Kurt G; Anderson, Adam W; Kang, Hakmook; Landman, Bennett A

    2018-02-06

    The bias and variance of high angular resolution diffusion imaging methods have not been thoroughly explored in the literature and may benefit from the simulation extrapolation (SIMEX) and bootstrap techniques to estimate bias and variance of high angular resolution diffusion imaging metrics. The SIMEX approach is well established in the statistics literature and uses simulation of increasingly noisy data to extrapolate back to a hypothetical case with no noise. The bias of calculated metrics can then be computed by subtracting the SIMEX estimate from the original pointwise measurement. The SIMEX technique has been studied in the context of diffusion imaging to accurately capture the bias in fractional anisotropy measurements in DTI. Herein, we extend the application of SIMEX and bootstrap approaches to characterize bias and variance in metrics obtained from a Q-ball imaging reconstruction of high angular resolution diffusion imaging data. The results demonstrate that SIMEX and bootstrap approaches provide consistent estimates of the bias and variance of generalized fractional anisotropy, respectively. The RMSE for the generalized fractional anisotropy estimates shows a 7% decrease in white matter and an 8% decrease in gray matter when compared with the observed generalized fractional anisotropy estimates. On average, the bootstrap technique results in SD estimates that are approximately 97% of the true variation in white matter, and 86% in gray matter. Both SIMEX and bootstrap methods are flexible, estimate population characteristics based on single scans, and may be extended for bias and variance estimation on a variety of high angular resolution diffusion imaging metrics. © 2018 International Society for Magnetic Resonance in Medicine.
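    The SIMEX procedure summarized in the record above is simple enough to sketch: add synthetic noise at increasing levels, fit the trend of the metric against the noise level, and extrapolate back to the hypothetical zero-noise case; the bias estimate is the observed value minus the extrapolant. The snippet below is a minimal, generic illustration (not the authors' pipeline); the quadratic extrapolant, the noise levels, and the toy metric are all assumptions.

    ```python
    import numpy as np

    def simex_bias(metric_fn, signal, sigma0, lambdas=(0.5, 1.0, 1.5, 2.0), n_rep=100, seed=0):
        """Hypothetical SIMEX sketch: estimate the noise-induced bias of `metric_fn`.

        metric_fn : callable mapping a noisy signal to a scalar metric
        signal    : observed (already noisy) measurement vector
        sigma0    : estimate of the noise SD already present in `signal`
        lambdas   : extra-noise multipliers used in the simulation step
        """
        rng = np.random.default_rng(seed)
        observed = metric_fn(signal)

        # Simulation step: add extra noise with variance lambda * sigma0^2.
        means = []
        for lam in lambdas:
            vals = [metric_fn(signal + rng.normal(0.0, np.sqrt(lam) * sigma0, size=signal.shape))
                    for _ in range(n_rep)]
            means.append(np.mean(vals))

        # Extrapolation step: fit metric vs. total noise factor (1 + lambda),
        # then evaluate the fit at a total factor of 0 (i.e. lambda = -1, no noise).
        coeffs = np.polyfit([1.0 + lam for lam in lambdas], means, deg=2)
        extrapolated = np.polyval(coeffs, 0.0)

        return observed - extrapolated  # positive value -> metric biased upward by noise

    # Toy usage: bias of the max-abs statistic of a noisy constant signal.
    rng = np.random.default_rng(1)
    noisy = 1.0 + rng.normal(0.0, 0.1, size=64)
    print(simex_bias(lambda s: np.max(np.abs(s)), noisy, sigma0=0.1))
    ```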

  8. Effects of magnetic islands on bootstrap current in toroidal plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, G.; Lin, Z.

    The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change in the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.

  9. Effects of magnetic islands on bootstrap current in toroidal plasmas

    DOE PAGES

    Dong, G.; Lin, Z.

    2016-12-19

    The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change in the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.

  10. Transport barriers in bootstrap-driven tokamaks

    NASA Astrophysics Data System (ADS)

    Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.

    2018-05-01

    Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis and quasilinear predictive modeling demonstrate that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.

  11. Reduced ion bootstrap current drive on NTM instability

    NASA Astrophysics Data System (ADS)

    Qu, Hongpeng; Wang, Feng; Wang, Aike; Peng, Xiaodong; Li, Jiquan

    2018-05-01

    The loss of bootstrap current inside a magnetic island plays a dominant role in driving the neoclassical tearing mode (NTM) instability in tokamak plasmas. In this work, we investigate the finite-banana-width (FBW) effect on the profile of ion bootstrap current in the island vicinity via an analytical approach. The results show that even if the pressure gradient vanishes inside the island, the ion bootstrap current can partly survive due to the FBW effect. The efficiency of the FBW effect is higher when the island width becomes smaller. Nevertheless, even when the island width is comparable to the ion FBW, the unperturbed ion bootstrap current inside the island cannot be largely recovered by the FBW effect, and thus the current loss still exists. This suggests that the FBW effect alone cannot dramatically reduce the ion bootstrap current drive on NTMs.

  12. Coupling of PIES 3-D Equilibrium Code and NIFS Bootstrap Code with Applications to the Computation of Stellarator Equilibria

    NASA Astrophysics Data System (ADS)

    Monticello, D. A.; Reiman, A. H.; Watanabe, K. Y.; Nakajima, N.; Okamoto, M.

    1997-11-01

    The existence of bootstrap currents in both tokamaks and stellarators was confirmed, experimentally, more than ten years ago. Such currents can have significant effects on the equilibrium and stability of these MHD devices. In addition, stellarators, with the notable exception of W7-X, are predicted to have such large bootstrap currents that reliable equilibrium calculations require the self-consistent evaluation of bootstrap currents. Modeling of discharges which contain islands requires an algorithm that does not assume good surfaces. Only one of the two 3-D equilibrium codes that exist, PIES (Reiman, A. H. and Greenside, H. S., Comput. Phys. Commun. 43 (1986)), can easily be modified to handle bootstrap current. Here we report on the coupling of the PIES 3-D equilibrium code and the NIFS bootstrap code (Watanabe, K., et al., Nuclear Fusion 35 (1995) 335).

  13. Test of bootstrap current models using high-$\beta_\text{p}$ EAST-demonstration plasmas on DIII-D

    DOE PAGES

    Ren, Qilong; Lao, Lang L.; Garofalo, Andrea M.; ...

    2015-01-12

    Magnetic measurements together with kinetic profile and motional Stark effect measurements are used in full kinetic equilibrium reconstructions to test the Sauter and NEO bootstrap current models in a DIII-D high-$\beta_\text{p}$ EAST-demonstration experiment. This aims at developing on DIII-D a high bootstrap current scenario to be extended on EAST for a demonstration of true steady-state at high performance and uses EAST-similar operational conditions: plasma shape, plasma current, toroidal magnetic field, total heating power and current ramp-up rate. It is found that the large edge bootstrap current in these high-$\beta_\text{p}$ plasmas allows the use of magnetic measurements to clearly distinguish the two bootstrap current models. In these high collisionality and high-$\beta_\text{p}$ plasmas, the Sauter model overpredicts the peak of the edge current density by about 30%, while the first-principle kinetic NEO model is in close agreement with the edge current density of the reconstructed equilibrium. Furthermore, these results are consistent with recent work showing that the Sauter model largely overestimates the edge bootstrap current at high collisionality.

  14. Bootstrap and fast wave current drive for tokamak reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehst, D.A.

    1991-09-01

    Using the multi-species neoclassical treatment of Hirshman and Sigmar we study steady state bootstrap equilibria with seed currents provided by low frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (< 20 MW). However, for larger total currents considerable driving power is required (for ITER: I_0 = 18 MA needs P_FW = 15 MW, P_LH = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs.

  15. Bootstrap current in a tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kessel, C.E.

    1994-03-01

    The bootstrap current in a tokamak is examined by implementing the Hirshman-Sigmar model and comparing the predicted current profiles with those from two popular approximations. The dependences of the bootstrap current profile on the plasma properties are illustrated. The implications for steady state tokamaks are presented through two constraints: the pressure profile must be peaked, and β_p must be kept below a critical value.
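    To make the β_p constraint above concrete, a commonly quoted large-aspect-ratio scaling gives the bootstrap fraction as f_BS ≈ C √ε β_p, with C an order-unity coefficient that depends on the profile shapes. The sketch below evaluates that scaling with assumed, illustrative numbers (C = 0.7, ε = 0.36, β_p = 2); it is a back-of-the-envelope check, not the Hirshman-Sigmar calculation of the record.

    ```python
    import math

    def bootstrap_fraction(eps, beta_p, c_bs=0.7):
        """Toy estimate of the bootstrap current fraction from the large-aspect-ratio
        scaling f_BS ~ C * sqrt(eps) * beta_p. The coefficient C depends on profile
        peaking; 0.7 here is an assumed, illustrative value."""
        return c_bs * math.sqrt(eps) * beta_p

    # Example with assumed parameters: inverse aspect ratio a/R ~ 0.36, beta_p ~ 2.
    print(f"f_BS ~ {bootstrap_fraction(0.36, 2.0):.2f}")  # ~0.84, i.e. roughly an 80% bootstrap fraction
    ```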

  16. Control of bootstrap current in the pedestal region of tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaing, K. C.; Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796; Lai, A. L.

    2013-12-15

    The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_p,m that consists of the poloidal components of the E×B flow and of the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current that is important for the equilibrium and stability of the pedestal of H-mode plasmas is shown to have an expression different from that in the conventional theory. In the limit where ‖U_p,m‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled through manipulating U_p,m and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_p,m‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for viscous coefficients that join results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.

  17. Electron transport fluxes in potato plateau regime

    NASA Astrophysics Data System (ADS)

    Shaing, K. C.; Hazeltine, R. D.

    1997-12-01

    Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current.

  18. Electron transport fluxes in potato plateau regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaing, K.C.; Hazeltine, R.D.

    Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current. © 1997 American Institute of Physics.

  19. Advances in the high bootstrap fraction regime on DIII-D towards the Q = 5 mission of ITER steady state

    DOE PAGES

    Qian, Jinping P.; Garofalo, Andrea M.; Gong, Xianzu Z.; ...

    2017-03-20

    Recent EAST/DIII-D joint experiments on the high poloidal beta ($\beta_\text{P}$) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H_98y2 ~ 1.6) to higher plasma current, for lower q_95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results. Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high $\beta_\text{P}$ discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at $\beta_\text{N}$ ~ 2.9 and q_95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Furthermore, results reported in this paper suggest that the DIII-D high $\beta_\text{P}$ scenario could be a candidate for ITER steady state operation.

  20. The economics of bootstrapping space industries - Development of an analytic computer model

    NASA Technical Reports Server (NTRS)

    Goldberg, A. H.; Criswell, D. R.

    1982-01-01

    A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.

  1. Heating and current drive requirements for ideal MHD stability and ITB sustainment in ITER steady state scenarios

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2012-10-01

    Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities in a wide range of βN, reducing the no-wall limit. Scenarios are established as relaxed flattop states using time-dependent transport simulations with TSC [1]. Fully non-inductive configurations with current in the range of 7-10 MA and various heating mixes (NB, EC, IC and LH) have been studied against variations of the pressure profile peaking and of the Greenwald fraction. It is found that stable equilibria have q_min > 2 and moderate ITBs at 2/3 of the minor radius [2]. The ExB flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H&CD sources that maintain reverse or weak magnetic shear profiles throughout the discharge and ρ(q_min) ≥ 0.5 are the focus of this work. The ITER EC upper launcher, designed for NTM control, can provide enough current drive off-axis to sustain moderate ITBs at mid-radius and maintain a non-inductive current of 8-9 MA and H98 ≥ 1.5 with the day one heating mix. LH heating and current drive is effective in modifying the current profile off-axis, facilitating the formation of stronger ITBs in the ramp-up phase, their sustainment at larger radii and a larger bootstrap fraction. The implications for steady state operation and fusion performance are discussed. [1] Jardin S.C. et al, J. Comput. Phys. 66 (1986) 481. [2] Poli F.M. et al, Nucl. Fusion 52 (2012) 063027.

  2. Bootstrap current control studies in the Wendelstein 7-X stellarator using the free-plasma-boundary version of the SIESTA MHD equilibrium code

    NASA Astrophysics Data System (ADS)

    Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Sanchez, R.; Tribaldos, V.; Geiger, J.

    2018-02-01

    The recently developed free-plasma-boundary version of the SIESTA MHD equilibrium code (Hirshman et al 2011 Phys. Plasmas 18 062504; Peraza-Rodriguez et al 2017 Phys. Plasmas 24 082516) is used for the first time to study scenarios with considerable bootstrap currents for the Wendelstein 7-X (W7-X) stellarator. Bootstrap currents in the range of tens of kA can lead to the formation of unwanted magnetic island chains or stochastic regions within the plasma and alter the boundary rotational transform due to the small shear in W7-X. The latter issue is of relevance since the island divertor operation of W7-X relies on a proper positioning of magnetic island chains at the plasma edge to control the particle and energy exhaust towards the divertor plates. Two scenarios are examined with the new free-plasma-boundary capabilities of SIESTA: one with a freely evolving bootstrap current that illustrates the difficulties arising from the dislocation of the boundary islands, and a second one in which off-axis electron cyclotron current drive (ECCD) is applied to compensate the effects of the bootstrap current and keep the island divertor configuration intact. SIESTA finds that off-axis ECCD is indeed able to keep the location and phase of the edge magnetic island chain unchanged, but it may also lead to an undesired stochastization of parts of the confined plasma if the EC deposition radial profile becomes too narrow.

  3. Insight from uncertainty: bootstrap-derived diffusion metrics differentially predict memory function among older adults.

    PubMed

    Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M

    2016-01-01

    Diffusion tensor imaging suffers from an intrinsically low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship between the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
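    A hedged sketch of the bootstrap-derived cone of uncertainty discussed above: resample repeated diffusion measurements, refit a tensor for each resample, and take a high percentile of the angular deviation of the principal eigenvector. The synthetic single-voxel data, the simple log-linear least-squares fit, and the 95th-percentile definition are assumptions for illustration, not the authors' simulation setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic single-voxel acquisition (assumed, for illustration only).
    b = 1000.0                                   # s/mm^2
    g = rng.normal(size=(30, 3)); g /= np.linalg.norm(g, axis=1, keepdims=True)
    D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])   # prolate tensor, principal axis = x
    S0, sigma, n_rep = 1.0, 0.03, 4
    clean = S0 * np.exp(-b * np.einsum('ij,jk,ik->i', g, D_true, g))
    data = clean[:, None] + sigma * rng.normal(size=(len(g), n_rep))    # repeated scans

    def fit_principal_axis(signals):
        """Log-linear least-squares tensor fit; returns the principal eigenvector."""
        X = np.column_stack([np.ones(len(g)),
                             -b * g[:, 0]**2, -b * g[:, 1]**2, -b * g[:, 2]**2,
                             -2 * b * g[:, 0] * g[:, 1],
                             -2 * b * g[:, 0] * g[:, 2],
                             -2 * b * g[:, 1] * g[:, 2]])
        beta, *_ = np.linalg.lstsq(X, np.log(signals), rcond=None)
        Dxx, Dyy, Dzz, Dxy, Dxz, Dyz = beta[1:]
        D = np.array([[Dxx, Dxy, Dxz], [Dxy, Dyy, Dyz], [Dxz, Dyz, Dzz]])
        w, V = np.linalg.eigh(D)
        return V[:, -1]                           # eigenvector of the largest eigenvalue

    # Simplified repetition-bootstrap: draw one of the repeats per gradient direction.
    ref = fit_principal_axis(data.mean(axis=1))
    angles = []
    for _ in range(500):
        picks = rng.integers(0, n_rep, size=len(g))
        v = fit_principal_axis(data[np.arange(len(g)), picks])
        angles.append(np.degrees(np.arccos(np.clip(abs(v @ ref), 0, 1))))

    print(f"cone of uncertainty (95th percentile): {np.percentile(angles, 95):.1f} deg")
    ```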

  4. Evaluating the Invariance of Cognitive Profile Patterns Derived from Profile Analysis via Multidimensional Scaling (PAMS): A Bootstrapping Approach

    ERIC Educational Resources Information Center

    Kim, Se-Kang

    2010-01-01

    The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…

  5. Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.

    PubMed

    Chung, SungWon; Lu, Ying; Henry, Roland G

    2006-11-01

    Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method (repetition bootstrap) used for DTI analysis performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, wild bootstrap was proposed that can be applied without multiple acquisitions. In this paper, two new approaches are introduced, called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and, therefore, better estimates the standard errors. Like wild bootstrap, residual bootstrap is applicable to a single acquisition scheme, and both are based on regression residuals (called model-based resampling). Residual bootstrap is based on the assumption that the non-constant variance of measured diffusion-attenuated signals can be modeled, which is actually the assumption behind the widely used weighted least squares solution of the diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of the bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that residual bootstrap has smaller biases and overall errors, which enables estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help us to choose the optimal approach for estimating uncertainties that can benefit hypothesis testing based on DTI parameters, probabilistic fiber tracking, and optimizing DTI methods.
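    The distinction the record draws between repetition bootstrap and model-based resampling can be illustrated with a toy regression standing in for the log-linearized tensor fit: residual bootstrap resamples the fit residuals and rebuilds synthetic data around the fitted model, so it needs only a single acquisition. The sketch below is a generic illustration under those assumptions, not the weighted-least-squares DTI implementation compared in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed toy regression standing in for the log-linear diffusion-tensor fit:
    # a single-acquisition design, so repetition bootstrap is not applicable.
    x = np.linspace(0.0, 1.0, 30)
    X = np.column_stack([np.ones_like(x), x])
    y = 2.0 - 1.5 * x + rng.normal(0.0, 0.1, size=x.size)

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta_hat
    residuals = y - fitted

    # Residual bootstrap ("model-based resampling"): resample residuals, rebuild
    # synthetic data around the fitted model, and refit.
    slopes = []
    for _ in range(2000):
        y_star = fitted + rng.choice(residuals, size=residuals.size, replace=True)
        b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        slopes.append(b_star[1])

    print(f"slope = {beta_hat[1]:.3f} +/- {np.std(slopes, ddof=1):.3f} (residual-bootstrap SE)")
    ```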

  6. Advances in the high bootstrap fraction regime on DIII-D towards the Q = 5 mission of ITER steady state

    NASA Astrophysics Data System (ADS)

    Qian, J. P.; Garofalo, A. M.; Gong, X. Z.; Ren, Q. L.; Ding, S. Y.; Solomon, W. M.; Xu, G. S.; Grierson, B. A.; Guo, W. F.; Holcomb, C. T.; McClenaghan, J.; McKee, G. R.; Pan, C. K.; Huang, J.; Staebler, G. M.; Wan, B. N.

    2017-05-01

    Recent EAST/DIII-D joint experiments on the high poloidal beta β_P regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H_98y2 ~ 1.6) to higher plasma current, for lower q_95 ⩽ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results (Garofalo et al, IAEA 2014, Gong et al 2014 IAEA Int. Conf. on Fusion Energy). Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high β_P discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at β_N ~ 2.9 and q_95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Results reported in this paper suggest that the DIII-D high β_P scenario could be a candidate for ITER steady state operation.

  7. Prospects for steady-state scenarios on JET

    NASA Astrophysics Data System (ADS)

    Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; DeLa Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the

    2007-09-01

    In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (n_l ~ 4 × 10^19 m^-3), with ITER-relevant safety factor (q95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, a new pellet injector for edge localized mode control together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (n_l > 5 × 10^19 m^-3), with high βN (βN > 3.0) and a fraction of the bootstrap current within 60-70% at high toroidal field (~3.5 T).

  8. A neural network based reputation bootstrapping approach for service selection

    NASA Astrophysics Data System (ADS)

    Wu, Quanwang; Zhu, Qingsheng; Li, Peng

    2015-10-01

    With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One of the limitations of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviours exists. Most of the current bootstrapping approaches merely assign default reputation values to newcomers. However, with this kind of method, either newcomers or existing services will be favoured. In this paper, we present a novel reputation bootstrapping approach, where correlations between features and performance of existing services are learned through an artificial neural network (ANN) and are then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping, if available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Finally, empirical studies of the proposed approach are presented.
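    A minimal sketch of the bootstrapping idea described above, assuming scikit-learn is available: a small neural network regressor learns the feature-to-reputation mapping from services with a track record and then predicts a tentative reputation for a newcomer. The feature set, the synthetic reputations, and the network size are invented for illustration and are not from the paper.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical feature vectors for existing services: [price, advertised latency, provider rating].
    X_existing = rng.uniform(size=(200, 3))
    # Observed reputations of those services (synthetic ground truth plus noise).
    reputation = 0.5 * X_existing[:, 2] + 0.3 * (1 - X_existing[:, 1]) + 0.1 * rng.normal(size=200)

    # Learn feature -> reputation correlations from services that have a history.
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_existing, reputation)

    # Bootstrap the reputation of a newly published service from its features alone.
    newcomer = np.array([[0.4, 0.2, 0.9]])
    print(f"tentative reputation for newcomer: {model.predict(newcomer)[0]:.2f}")
    ```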

  9. Impact of bootstrap current and Landau-fluid closure on ELM crashes and transport

    NASA Astrophysics Data System (ADS)

    Chen, J. G.; Xu, X. Q.; Ma, C. H.; Lei, Y. A.

    2018-05-01

    Results presented here are from 6-field Landau-fluid simulations using shifted circular cross-section tokamak equilibria on the BOUT++ framework. Linear benchmark results imply that the collisional and collisionless Landau resonance closures make little difference to the linear growth rate spectra, which are quite close to the results with the flux-limited Spitzer-Härm parallel flux. Both linear and nonlinear simulations show that the plasma current profile plays dual roles for the peeling-ballooning modes: it can drive the low-n peeling modes and stabilize the high-n ballooning modes. For fixed total pressure and current, as the pedestal current decreases due to the bootstrap current, which becomes smaller when the density (collisionality) increases, the operational point is shifted downwards vertically in the Jped - α diagram, resulting in threshold changes of different modes. The bootstrap current can slightly increase the radial turbulence spreading range and enhance the energy and particle transports by increasing the perturbed amplitude and broadening the cross-phase frequency distribution.

  10. Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.

    PubMed

    Wade, M R; Murakami, M; Politzer, P A

    2004-06-11

    Analysis of the parallel electric field E_∥ evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E_∥ evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E_∥ evolution and that expected from neoclassical theory predictions of the bootstrap current.

  11. Multi-baseline bootstrapping at the Navy precision optical interferometer

    NASA Astrophysics Data System (ADS)

    Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.

    2014-07-01

    The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally-spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
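    The reason chained ("bootstrapped") baselines work for fringe tracking is that telescope-based phase or delay errors add along the chain: for stations A, B, C the error on the long baseline A-C equals the sum of the errors on A-B and B-C, so tracking the two short, high-visibility baselines keeps the long one coherent. The toy check below only verifies that additivity with invented numbers; it says nothing about the NPOI hardware or software.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Telescope-based piston phases (atmosphere + instrument), in radians; assumed values.
    psi = {"A": rng.uniform(-np.pi, np.pi),
           "B": rng.uniform(-np.pi, np.pi),
           "C": rng.uniform(-np.pi, np.pi)}

    def baseline_phase_error(i, j):
        """Telescope-based part of the fringe phase on baseline i-j."""
        return psi[i] - psi[j]

    # Bootstrapping identity: the error on the long baseline A-C is the sum of the
    # errors on the two short baselines A-B and B-C.
    long_direct = baseline_phase_error("A", "C")
    long_bootstrapped = baseline_phase_error("A", "B") + baseline_phase_error("B", "C")
    print(np.isclose(long_direct, long_bootstrapped))   # True
    ```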

  12. Bootstrap Current for the Edge Pedestal Plasma in a Diverted Tokamak Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, S.; Chang, C. S.; Ku, S.

    The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with ion contribution and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in plateau-collisional regime, there is significant deviation of numerical results from the existing analytic formulas, mainly due to large effective collisionality of the passing and the boundary layer trapped particles in edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high. On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, as a simple modification to the Sauter formula, is obtained to bring the analytic expression to a better agreement with the edge kinetic simulation results.

  13. Bootstrap current for the edge pedestal plasma in a diverted tokamak geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, S.; Choe, W.; Chang, C. S.

    The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with ion contribution and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in plateau-collisional regime, there is significant deviation of numerical results from the existing analytic formulas, mainly due to large effective collisionality of the passing and the boundary layer trapped particles in edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high. On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, as a simple modification to the Sauter formula, is obtained to bring the analytic expression to a better agreement with the edge kinetic simulation results.

  14. HIV-1 Transmission During Recent Infection and During Treatment Interruptions as Major Drivers of New Infections in the Swiss HIV Cohort Study.

    PubMed

    Marzel, Alex; Shilaih, Mohaned; Yang, Wan-Lin; Böni, Jürg; Yerly, Sabine; Klimkait, Thomas; Aubert, Vincent; Braun, Dominique L; Calmy, Alexandra; Furrer, Hansjakob; Cavassini, Matthias; Battegay, Manuel; Vernazza, Pietro L; Bernasconi, Enos; Günthard, Huldrych F; Kouyos, Roger D; Aubert, V; Battegay, M; Bernasconi, E; Böni, J; Bucher, H C; Burton-Jeangros, C; Calmy, A; Cavassini, M; Dollenmaier, G; Egger, M; Elzi, L; Fehr, J; Fellay, J; Furrer, H; Fux, C A; Gorgievski, M; Günthard, H F; Haerry, D; Hasse, B; Hirsch, H H; Hoffmann, M; Hösli, I; Kahlert, C; Kaiser, L; Keiser, O; Klimkait, T; Kouyos, R D; Kovari, H; Ledergerber, B; Martinetti, G; de Tejada, B Martinez; Metzner, K; Müller, N; Nadal, D; Nicca, D; Pantaleo, G; Rauch, A; Regenass, S; Rickenbach, M; Rudin, C; Schöni-Affolter, F; Schmid, P; Schüpbach, J; Speck, R; Tarr, P; Trkola, A; Vernazza, P L; Weber, R; Yerly, S

    2016-01-01

    Reducing the fraction of transmissions during recent human immunodeficiency virus (HIV) infection is essential for the population-level success of "treatment as prevention". A phylogenetic tree was constructed with 19 604 Swiss sequences and 90 994 non-Swiss background sequences. Swiss transmission pairs were identified using 104 combinations of genetic distance (1%-2.5%) and bootstrap (50%-100%) thresholds, to examine the effect of those criteria. Monophyletic pairs were classified as recent or chronic transmission based on the time interval between estimated seroconversion dates. Logistic regression with adjustment for clinical and demographic characteristics was used to identify risk factors associated with transmission during recent or chronic infection. Seroconversion dates were estimated for 4079 patients on the phylogeny, and comprised between 71 (distance, 1%; bootstrap, 100%) and 378 transmission pairs (distance, 2.5%; bootstrap, 50%). We found that 43.7% (range, 41%-56%) of the transmissions occurred during the first year of infection. Stricter phylogenetic definition of transmission pairs was associated with a higher recent-phase transmission fraction. Chronic-phase viral load area under the curve (adjusted odds ratio, 3; 95% confidence interval, 1.64-5.48) and time to antiretroviral therapy (ART) start (adjusted odds ratio 1.4/y; 1.11-1.77) were associated with chronic-phase transmission as opposed to recent transmission. Importantly, at least 14% of the chronic-phase transmission events occurred after the transmitter had interrupted ART. We demonstrate a high fraction of transmission during recent HIV infection but also chronic transmissions after interruption of ART in Switzerland. Both represent key issues for treatment as prevention and underline the importance of early diagnosis and of early and continuous treatment. © The Author 2015. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
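    A schematic sketch of the pair-selection and classification step described above: a candidate monophyletic pair is accepted when its genetic distance is at most a chosen threshold and its bootstrap support at least another, and accepted pairs are split into recent versus chronic transmission by the interval between estimated seroconversion dates. Field names, the example thresholds (one of the 104 combinations), and the toy pairs are assumptions for illustration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Pair:
        """One candidate monophyletic pair from the phylogeny (field names are assumed)."""
        genetic_distance: float               # substitutions/site between the two sequences
        bootstrap_support: float              # node support, 0-100
        years_between_seroconversions: float

    def classify(pairs, max_distance=0.015, min_bootstrap=80.0, recent_window_years=1.0):
        """Apply one distance/bootstrap threshold combination and split accepted
        pairs into recent- vs chronic-phase transmissions."""
        accepted = [p for p in pairs
                    if p.genetic_distance <= max_distance and p.bootstrap_support >= min_bootstrap]
        recent = [p for p in accepted if p.years_between_seroconversions <= recent_window_years]
        chronic = [p for p in accepted if p.years_between_seroconversions > recent_window_years]
        return recent, chronic

    # Toy usage with invented pairs.
    pairs = [Pair(0.010, 95.0, 0.4), Pair(0.012, 70.0, 2.5), Pair(0.022, 99.0, 0.8)]
    recent, chronic = classify(pairs)
    print(len(recent), len(chronic))   # 1 0  (only the first pair passes both thresholds)
    ```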

  15. More accurate, calibrated bootstrap confidence intervals for correlating two autocorrelated climate time series

    NASA Astrophysics Data System (ADS)

    Olafsdottir, Kristin B.; Mudelsee, Manfred

    2013-04-01

    Estimation of the Pearson's correlation coefficient between two time series to evaluate the influence of one time-dependent variable on another is one of the most often used statistical methods in climate sciences. Various methods are used to estimate confidence intervals to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), where the main intention was to get an accurate confidence interval for the correlation coefficient between two time series by taking the serial dependence of the process that generated the data into account. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique which basically performs a second bootstrap loop, or resamples from the bootstrap resamples. It offers, like the non-calibrated bootstrap confidence intervals, robustness against the data distribution. Pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard error based bootstrap Student's t confidence intervals. The performances of the calibrated confidence intervals are examined with Monte Carlo simulations, and compared with the performances of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20. One form of climate time series is output from numerical models which simulate the climate system. The method is applied to model data from the high resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables when there is a 10 year lag between them, which is more or less the time that it takes the Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
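    A minimal sketch of the pairwise moving block bootstrap with a standard-error-based interval, as described above but without the calibration loop that distinguishes PearsonT3: blocks of (x, y) pairs are resampled together so the serial dependence of both series is preserved. The AR(1) test data, block length, and the use of a normal 1.96 quantile instead of the Student's t quantile are assumptions of this illustration, not the PearsonT/PearsonT3 code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two synthetic, serially dependent (AR(1)) series sharing a common signal.
    n, phi = 200, 0.6
    def ar1(innov):
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + innov[t]
        return x
    common = rng.normal(size=n)
    x = ar1(common + 0.8 * rng.normal(size=n))
    y = ar1(common + 0.8 * rng.normal(size=n))

    def pearson(a, b):
        return np.corrcoef(a, b)[0, 1]

    # Pairwise moving block bootstrap: resample blocks of (x, y) pairs together so
    # that the serial correlation of both series is preserved inside each block.
    def block_bootstrap_r(x, y, block_len=10, n_boot=2000):
        starts = np.arange(n - block_len + 1)
        reps = np.empty(n_boot)
        for i in range(n_boot):
            picks = rng.choice(starts, size=int(np.ceil(n / block_len)), replace=True)
            idx = np.concatenate([np.arange(s, s + block_len) for s in picks])[:n]
            reps[i] = pearson(x[idx], y[idx])
        return reps

    r_hat = pearson(x, y)
    se = np.std(block_bootstrap_r(x, y), ddof=1)
    # Standard-error-based interval (normal 1.96 quantile; the second, calibrating
    # bootstrap loop of PearsonT3 is omitted here).
    print(f"r = {r_hat:.2f}, ~95% CI = [{r_hat - 1.96*se:.2f}, {r_hat + 1.96*se:.2f}]")
    ```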

  16. Reference limits for urinary fractional excretion of electrolytes in adult non-racing Greyhound dogs.

    PubMed

    Bennett, S L; Abraham, L A; Anderson, G A; Holloway, S A; Parry, B W

    2006-11-01

    To determine reference limits for urinary fractional excretion of electrolytes in Greyhound dogs. Urinary fractional excretion was calculated using a spot clearance method preceded by a 16 to 20 hour fast in 48 Greyhound dogs. Raw data were analysed using the bootstrap estimate to calculate the reference limits. The observed range for urinary fractional excretion in Greyhound dogs was 0.0 to 0.77% for sodium, 0.9 to 14.7% for potassium, 0 to 0.66% for chloride, 0.03 to 0.22% for calcium and 0.4 to 20.1% for phosphate. Expressed as percentages, the suggested reference limits for fractional excretion in Greyhound dogs are as follows: sodium ≤ 0.72, potassium ≤ 12.2, chloride ≤ 0.55, calcium ≤ 0.13 and phosphate ≤ 16.5. Veterinary practitioners may use these reference limits for urinary electrolyte fractional excretion when investigating renal tubular disease in Greyhound dogs.
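    The spot fractional excretion behind these limits is the standard ratio FE_x (%) = (U_x / P_x) × (P_Cr / U_Cr) × 100, where U and P are urinary and plasma concentrations and Cr is creatinine. The sketch below computes it and then derives a generic bootstrap percentile upper reference limit from invented sample data; it is not the exact bootstrap estimator used in the paper.

    ```python
    import numpy as np

    def fractional_excretion(u_x, p_x, u_cr, p_cr):
        """Spot urinary fractional excretion of electrolyte x, as a percentage:
        FE_x = (U_x / P_x) * (P_Cr / U_Cr) * 100."""
        return (u_x / p_x) * (p_cr / u_cr) * 100.0

    # Invented spot concentrations for sodium (mmol/L) and creatinine (arbitrary but consistent units).
    print(f"FE_Na = {fractional_excretion(u_x=20.0, p_x=145.0, u_cr=150.0, p_cr=1.0):.2f}%")

    # Generic bootstrap upper reference limit (percentile method) from a sample of
    # healthy dogs; the data below are invented, not the Greyhound measurements.
    rng = np.random.default_rng(0)
    fe_sodium = rng.gamma(shape=2.0, scale=0.15, size=48)          # toy FE_Na values (%)
    upper_limits = [np.percentile(rng.choice(fe_sodium, size=fe_sodium.size, replace=True), 97.5)
                    for _ in range(5000)]
    print(f"bootstrap upper reference limit for FE_Na: {np.mean(upper_limits):.2f}%")
    ```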

  17. Non-inductive current generation in fusion plasmas with turbulence

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Ethier, S.; Startsev, E.; Chen, J.; Hahm, T. S.; Yoo, M. G.

    2017-10-01

    It is found that plasma turbulence may strongly influence non-inductive current generation. This may have a radical impact on various aspects of tokamak physics. Our simulation study employs a global gyrokinetic model coupling self-consistent neoclassical and turbulent dynamics, with a focus on the electron current. Distinct phases in electron current generation are illustrated in the initial value simulation. In the early phase before turbulence develops, the electron bootstrap current is established in a time scale of a few electron collision times, which closely agrees with the neoclassical prediction. The second phase follows when turbulence begins to saturate, during which turbulent fluctuations are found to strongly affect the electron current. The profile structure, amplitude and phase space structure of the electron current density are all significantly modified relative to the neoclassical bootstrap current by the presence of turbulence. Both electron parallel acceleration and parallel residual stress drive are shown to play important roles in turbulence-induced current generation. The current density profile is modified in a way that correlates with the fluctuation intensity gradient through its effect on k//-symmetry breaking in the fluctuation spectrum. Turbulence is shown to reduce (enhance) the plasma self-generated current in the low (high) collisionality regime, and the reduction of the total electron current relative to the neoclassical bootstrap current increases as collisionality decreases. The implication of this result for fully non-inductive current operation in the steady state burning plasma regime should be investigated. Finally, significant non-inductive current is observed in the flat pressure region, which is a nonlocal effect and results from turbulence-spreading-induced current diffusion. Work supported by U.S. DOE Contract DE-AC02-09-CH11466.

  18. The Inverse Bagging Algorithm: Anomaly Detection by Inverse Bootstrap Aggregating

    NASA Astrophysics Data System (ADS)

    Vischia, Pietro; Dorigo, Tommaso

    2017-03-01

    For data sets populated by a very well modeled process and by another process of unknown probability density function (PDF), a desired feature when manipulating the fraction of the unknown process (either enhancing or suppressing it) is to avoid modifying the kinematic distributions of the well modeled one. A bootstrap technique is used to identify sub-samples rich in the well modeled process, and to classify each event according to the frequency with which it is part of such sub-samples. Comparisons with general MVA algorithms will be shown, as well as a study of the asymptotic properties of the method, making use of a public domain data set that models a typical search for new physics as performed at hadronic colliders such as the Large Hadron Collider (LHC).
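    A minimal sketch of the inverse-bagging idea summarized above: draw many bootstrap sub-samples, flag the ones whose test statistic looks most background-like, and score each event by how often the sub-samples containing it are flagged. The one-dimensional toy data, the sub-sample mean as the background-likeness statistic, and the quartile cut are assumptions for illustration, not the algorithm's actual test statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: a well-modelled background (standard normal in one kinematic
    # variable) contaminated by an unknown signal shifted to larger values.
    background = rng.normal(0.0, 1.0, size=900)
    signal = rng.normal(2.5, 0.5, size=100)
    data = np.concatenate([background, signal])
    labels = np.concatenate([np.zeros(900), np.ones(100)])   # kept only to check the ranking

    n_events, n_boot, frac = data.size, 2000, 0.25
    subsamples = [rng.choice(n_events, size=n_events // 4, replace=True) for _ in range(n_boot)]
    means = np.array([data[idx].mean() for idx in subsamples])
    # Sub-samples whose mean is closest to the known background mean (0 here) are
    # flagged as background-rich; the lowest quartile of means plays that role.
    bkg_like = means <= np.quantile(means, frac)

    counts_in = np.zeros(n_events)    # how often each event enters any sub-sample
    counts_bkg = np.zeros(n_events)   # ... a background-like sub-sample
    for idx, is_bkg in zip(subsamples, bkg_like):
        counts_in[np.unique(idx)] += 1
        if is_bkg:
            counts_bkg[np.unique(idx)] += 1

    score = np.divide(counts_bkg, counts_in, out=np.zeros(n_events), where=counts_in > 0)
    # Events that rarely enter background-like sub-samples are signal candidates.
    print("mean score, true background events:", round(score[labels == 0].mean(), 3))
    print("mean score, true signal events:    ", round(score[labels == 1].mean(), 3))
    ```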

  19. Exploration of high harmonic fast wave heating on the National Spherical Torus Experiment

    NASA Astrophysics Data System (ADS)

    Wilson, J. R.; Bell, R. E.; Bernabei, S.; Bitter, M.; Bonoli, P.; Gates, D.; Hosea, J.; LeBlanc, B.; Mau, T. K.; Medley, S.; Menard, J.; Mueller, D.; Ono, M.; Phillips, C. K.; Pinsker, R. I.; Raman, R.; Rosenberg, A.; Ryan, P.; Sabbagh, S.; Stutman, D.; Swain, D.; Takase, Y.; Wilgen, J.

    2003-05-01

    High harmonic fast wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [M. Ono, S. M. Kaye, S. Neumeyer et al., in Proceedings of the 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. An rf heating system has been installed on the NSTX to explore the physics of HHFW heating, current drive via rf waves and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode discharges with a large fraction of bootstrap current and to control the plasma current profile during the early stages of the discharge.

  20. Gyrokinetic neoclassical study of the bootstrap current in the tokamak edge pedestal with fully non-linear Coulomb collisions

    DOE PAGES

    Hager, Robert; Chang, C. S.

    2016-04-08

    As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator, which conserves mass, momentum, and energy, is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.

  1. Gyrokinetic neoclassical study of the bootstrap current in the tokamak edge pedestal with fully non-linear Coulomb collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hager, Robert; Chang, C. S.

    As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator, which conserves mass, momentum, and energy, is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.

  2. Gyrokinetic neoclassical study of the bootstrap current in the tokamak edge pedestal with fully non-linear Coulomb collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hager, Robert, E-mail: rhager@pppl.gov; Chang, C. S., E-mail: cschang@pppl.gov

    As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator, which conserves mass, momentum, and energy, is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. A new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.

  3. Exploration of spherical torus physics in the NSTX device

    NASA Astrophysics Data System (ADS)

    Ono, M.; Kaye, S. M.; Peng, Y.-K. M.; Barnes, G.; Blanchard, W.; Carter, M. D.; Chrzanowski, J.; Dudek, L.; Ewig, R.; Gates, D.; Hatcher, R. E.; Jarboe, T.; Jardin, S. C.; Johnson, D.; Kaita, R.; Kalish, M.; Kessel, C. E.; Kugel, H. W.; Maingi, R.; Majeski, R.; Manickam, J.; McCormack, B.; Menard, J.; Mueller, D.; Nelson, B. A.; Nelson, B. E.; Neumeyer, C.; Oliaro, G.; Paoletti, F.; Parsells, R.; Perry, E.; Pomphrey, N.; Ramakrishnan, S.; Raman, R.; Rewoldt, G.; Robinson, J.; Roquemore, A. L.; Ryan, P.; Sabbagh, S.; Swain, D.; Synakowski, E. J.; Viola, M.; Williams, M.; Wilson, J. R.; NSTX Team

    2000-03-01

    The National Spherical Torus Experiment (NSTX) is being built at Princeton Plasma Physics Laboratory to test the fusion physics principles for the spherical torus concept at the MA level. The NSTX nominal plasma parameters are R0 = 85 cm, a = 67 cm, R/a >= 1.26, Bt = 3 kG, Ip = 1 MA, q95 = 14, elongation κ <= 2.2, triangularity δ <= 0.5 and a plasma pulse length of up to 5 s. The plasma heating/current drive tools are high harmonic fast wave (6 MW, 5 s), neutral beam injection (5 MW, 80 keV, 5 s) and coaxial helicity injection. Theoretical calculations predict that NSTX should provide exciting possibilities for exploring a number of important new physics regimes, including very high plasma β, naturally high plasma elongation, high bootstrap current fraction, absolute magnetic well and high pressure driven sheared flow. In addition, the NSTX programme plans to explore fully non-inductive plasma startup as well as a dispersive scrape-off layer for heat and particle flux handling.

  4. Investigation of the n  =  1 resistive wall modes in the ITER high-mode confinement

    NASA Astrophysics Data System (ADS)

    Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.

    2017-06-01

    The n = 1 resistive wall mode (RWM) stability of the ITER high-confinement mode is investigated with the bootstrap current included in the equilibrium, together with rotation and diamagnetic drift effects for stability. Here, n is the toroidal mode number. We use the CORSICA code for computing the free boundary equilibrium and the AEGIS code for stability. We find that the inclusion of the bootstrap current in the equilibrium is critical. It can reduce the local magnetic shear in the pedestal, so that infernal mode branches can develop. Consequently, the n = 1 modes become unstable without a stabilizing wall at a considerably lower beta limit, driven by the steep pressure gradient in the pedestal. Typical values of the wall position stabilize the ideal mode, but give rise to the ‘pedestal’ resistive wall modes. We find that rotation can contribute a stabilizing effect on RWMs and that the diamagnetic drift effects can further improve the stability in the co-current rotation case. Generally speaking, however, the rotation stabilization effects are not as effective as in the case without the bootstrap current effects on the equilibrium. We also find that the diamagnetic drift effects are actually destabilizing when there is a counter-current rotation.

  5. Compatibility of internal transport barrier with steady-state operation in the high bootstrap fraction regime on DIII-D

    DOE PAGES

    Garofalo, Andrea M.; Gong, Xianzu; Grierson, Brian A.; ...

    2015-11-16

    Recent EAST/DIII-D joint experiments on the high poloidal beta tokamak regime in DIII-D have demonstrated fully noninductive operation with an internal transport barrier (ITB) at large minor radius, at normalized fusion performance increased by ≥30% relative to earlier work. The advancement was enabled by improved understanding of the “relaxation oscillations”, previously attributed to repetitive ITB collapses, and of the fast ion behavior in this regime. It was found that the “relaxation oscillations” are coupled core-edge modes amenable to wall-stabilization, and that fast ion losses, which previously dictated a large plasma-wall separation to avoid wall over-heating, can be reduced to classical levels with sufficient plasma density. By using optimized waveforms of the plasma-wall separation and plasma density, fully noninductive plasmas have been sustained for long durations with excellent energy confinement quality, bootstrap fraction ≥ 80%, βN ≤ 4, βP ≥ 3, and βT ≥ 2%. Finally, these results bolster the applicability of the high poloidal beta tokamak regime toward the realization of a steady-state fusion reactor.

  6. A condition for small bootstrap current in three-dimensional toroidal configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru; Nührenberg, J.; Zille, R.

    2016-11-15

    It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller compared to that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna−Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.

  7. Improvement of Current Drive Efficiency in Projected FNSF Discharges

    NASA Astrophysics Data System (ADS)

    Prater, R.; Chan, V.; Garofalo, A.

    2012-10-01

    The Fusion Nuclear Science Facility - Advanced Tokamak (FNSF-AT) is envisioned as a facility that uses the tokamak approach to address the development of the AT path to fusion and fusion's energy objectives. It uses copper coils for a compact device with high βN and moderate power gain. The major radius is 2.7 m and the central toroidal field is 5.44 T. Achieving the required confinement and stability at βN ~ 3.7 requires a current profile with negative central shear and qmin > 1. Off-axis electron cyclotron current drive (ECCD), in addition to a high bootstrap current fraction, can help support this current profile. Using the applied EC frequency and launch location as free parameters, a systematic study has been carried out to optimize the ECCD in the range ρ = 0.5-0.7. Using a top launch, making use of a large toroidal component of the launch direction, adjusting the vertical launch angle so that the rays propagate nearly parallel to the resonance, and adjusting the frequency for optimum total current give a high dimensionless efficiency of 0.44 for a broad ECCD profile peaked at ρ = 0.7, and the driven current is 17 kA/MW for n20 = 2.1 and Te = 10.3 keV locally.

  8. Lower hybrid current drive in experiments for transport barriers at high βN of JET (Joint European Torus)

    NASA Astrophysics Data System (ADS)

    Cesario, R. C.; Castaldo, C.; Fonseca, A.; De Angelis, R.; Parail, V.; Smeulders, P.; Beurskens, M.; Brix, M.; Calabrò, G.; De Vries, P.; Mailloux, J.; Pericoli, V.; Ravera, G.; Zagorski, R.

    2007-09-01

    LHCD has been used in JET experiments aimed at producing internal transport barriers (ITBs) in highly triangular plasmas (δ ≈ 0.4) at high βN (up to 3) for steady-state application. LHCD is a potentially valuable tool for (i) modifying the target q-profile, which can help avoid deleterious MHD modes and favour the formation of ITBs, and (ii) contributing to the non-inductive current drive required to prolong such plasma regimes. The q-profile evolution has been simulated during the current ramp-up phase for such a discharge (B0 = 2.3 T, IP = 1.5 MA) in which 2 MW of LHCD was coupled. The JETTO code was used, taking measured plasma profiles and the LHCD profile modelled by the LHstar code. The results are in agreement with MSE measurements and indicate the importance of the elevated electron temperature due to LHCD, as well as of the driven current. During main heating with 18 MW of NBI and 3 MW of ICRH, the bootstrap current density at the edge also becomes large, consistent with the observed reduction of the local turbulence and of the MHD activity. JETTO modelling suggests that the bootstrap current can reduce the magnetic shear (sh) at large radius, potentially affecting the MHD stability and turbulence behaviour in this region. Keywords: lower hybrid current drive (LHCD), bootstrap current, q (safety factor) and shear (sh) profile evolutions.

  9. Carving out the end of the world or (superconformal bootstrap in six dimensions)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Chi-Ming; Lin, Ying-Hsuan

    We bootstrap N=(1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E8 flavor group, we present universal bounds on the central charge CT and the flavor central charge CJ. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on CJ, and we numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS7 × S4/Z2.

  10. Carving out the end of the world or (superconformal bootstrap in six dimensions)

    DOE PAGES

    Chang, Chi-Ming; Lin, Ying-Hsuan

    2017-08-29

    We bootstrap N=(1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E8 flavor group, we present universal bounds on the central charge CT and the flavor central charge CJ. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on CJ, and we numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS7 × S4/Z2.

  11. Innovation cascades: artefacts, organization and attributions

    PubMed Central

    2016-01-01

    Innovation cascades inextricably link the introduction of new artefacts, transformations in social organization, and the emergence of new functionalities and new needs. This paper describes a positive feedback dynamic, exaptive bootstrapping, through which these cascades proceed, and the characteristics of the relationships in which the new attributions that drive this dynamic are generated. It concludes by arguing that the exaptive bootstrapping dynamic is the principal driver of our current Innovation Society. PMID:26926284

  12. AdS/CFT in Fractional Dimension and Higher-Spins at One Loop

    NASA Astrophysics Data System (ADS)

    Skvortsov, Evgeny; Tran, Tung

    2017-08-01

    Large-$N$, $\\epsilon$-expansion or the conformal bootstrap allow one to make sense of some conformal field theories in non-integer dimensions, which suggests that AdS/CFT may also extend to fractional dimensions. It was shown recently that the sphere free energy and the $a$-anomaly coefficient of the free scalar field can be reproduced as a one-loop effect in the dual higher-spin theory in a number of integer dimensions. We extend this result to all integer and also to fractional dimensions. Upon changing the boundary conditions in the higher-spin theory, the sphere free energy of the large-$N$ Wilson-Fisher CFT can also be reproduced from the higher-spin side.

  13. Metrics for More than Two Points at Once

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    The conventional definition of a topological metric over a space specifies properties that must be obeyed by any measure of "how separated" two points in that space are. Here it is shown how to extend that definition, and in particular the triangle inequality, to concern arbitrary numbers of points. Such a measure of how separated the points within a collection are can be bootstrapped, to measure "how separated" from each other are two (or more) collections. The measure presented here also allows fractional membership of an element in a collection. This means it directly concerns measures of "how spread out" a probability distribution over a space is. When such a measure is bootstrapped to compare two collections, it allows us to measure how separated two probability distributions are, or more generally, how separated a distribution of distributions is.

  14. High-beta, steady-state hybrid scenario on DIII-D

    DOE PAGES

    Petty, C. C.; Kinsey, J. E.; Holcomb, C. T.; ...

    2015-12-17

    Here, the potential of the hybrid scenario (first developed as an advanced inductive scenario for high fluence) as a regime for high-beta, steady-state plasmas is demonstrated on the DIII-D tokamak. These experiments show that the beneficial characteristics of hybrids, namely safety factor ≥1 with low central magnetic shear, high stability limits and excellent confinement, are maintained when strong central current drive (electron cyclotron and neutral beam) is applied to increase the calculated non-inductive fraction to ≈100% (≈50% bootstrap current). The best discharges achieve a normalized beta of 3.4, an IPB98(y,2) confinement factor of 1.4, a surface loop voltage of 0.01 V, and nearly equal electron and ion temperatures at low collisionality. A zero-dimensional physics model shows that steady-state hybrid operation with Qfus ~ 5 is feasible in FDF and ITER. The advantage of the hybrid scenario as an Advanced Tokamak regime is that the external current drive can be deposited near the plasma axis where the efficiency is high; additionally, good alignment between the current drive and plasma current profiles is not necessary as the poloidal magnetic flux pumping self-organizes the current density profile in hybrids with an m/n=3/2 tearing mode.

  15. Initial exploration of scenarios with Internal Transport Barrier in the first NBI-heated L-mode TCV plasmas

    NASA Astrophysics Data System (ADS)

    Piron, Chiara; Sauter, Olivier; Coda, Stefano; Merle, Antoine; Karpushov, Alexander; Pigatto, Leonardo; Bolzonella, Tommaso; Piovesan, Paolo; Vianello, Nicola; TCV Team; EUROfusion MST1 Team

    2016-10-01

    Fully non-inductive operation of high performance plasmas is one of the main objectives of contemporary tokamak research. From this perspective, plasmas with Internal Transport Barriers (ITBs) are an attractive scenario, since they can attain a high fraction of bootstrap current. In this work we start exploring ITB scenarios on the Tokamak à Configuration Variable (TCV) heated by a newly available 1 MW Neutral Beam Injector (NBI). Here we investigate for the first time in this device the impact of the additional NBI power on the performance and stability of L-mode plasmas with ITBs. Results of both experimental data analyses and ASTRA transport simulations are presented. The work also examines the magnetohydrodynamic (MHD) activity and stability of the explored plasmas. In particular, the role of plasma magnetic equilibrium parameters, such as plasma elongation and triangularity, in the sustainment of these NBI-heated ITB scenarios is discussed.

  16. Integrated Scenario Modeling of NSTX Advanced Plasma Configurations

    NASA Astrophysics Data System (ADS)

    Kessel, Charles; Synakowski, Edward

    2003-10-01

    The Spherical Torus will provide an attractive fusion energy source if it can demonstrate the following major features: high elongation and triangularity, 100% non-inductive current with a credible path to high bootstrap fractions, non-solenoidal startup and current rampup, high beta with stabilization of RWM instabilities, and sufficiently high energy confinement. NSTX has specific experimental milestones to examine these features, and integrated scenario modeling is helping to understand how these configurations might be produced and what tools are needed to access this operating space. Simulations with the Tokamak Simulation Code (TSC), CURRAY, and JSOLVER/BALMSC/PEST2 have identified fully non-inductively sustained, high beta plasmas that rely on strong plasma shaping accomplished with a PF coil modification, off-axis current drive from Electron Bernstein Waves (EBW), flexible on-axis heating and CD from High Harmonic Fast Wave (HHFW) and Neutral Beam Injection (NBI), and density control. Ideal MHD stability shows that with wall stabilization through plasma rotation and/or RWM feedback coils, a beta of 40% is achievable, with 100% non-inductive current sustained for 4 current diffusion times. Experimental data and theory are combined to produce a best extrapolation to these regimes, which is continuously improved as the discharges approach these parameters and theoretical/computational methods expand. Further investigations and development of integrated scenario modeling on NSTX are discussed.

  17. Comparison of fusion alpha performance in JET advanced scenario and H-mode plasmas

    NASA Astrophysics Data System (ADS)

    Asunta, O.; Kurki-Suonio, T.; Tala, T.; Sipilä, S.; Salomaa, R.; contributors, JET-EFDA

    2008-12-01

    Currently, plasmas with internal transport barriers (ITBs) appear to be the most likely candidates for steady-state scenarios for future fusion reactors. In such plasmas, the broad hot and dense region in the plasma core leads to high fusion gain, while the cool edge protects the integrity of the first wall. An economically desirable large bootstrap current fraction and low inductive current drive may, however, lead to degraded fast ion confinement. In this work the confinement and heating profile of fusion alphas were compared between H-mode and ITB plasmas in realistic JET geometry. The work was carried out using the Monte Carlo-based guiding-center-following code ASCOT. For the same plasma current, the ITB discharges were found to produce four to eight times more fusion power than a comparable ELMy H-mode discharge. Unfortunately, the alpha particle losses were also larger (~16%) than in the H-mode discharge (7%). In the H-mode discharges, alpha power was deposited to the plasma symmetrically around the magnetic axis, whereas in the current-hole discharge, the power was spread out to a larger volume in the plasma center. This was due to wider particle orbits and the magnetic structure allowing for a broader hot region in the centre.

  18. Exploration of High Harmonic Fast Wave Heating on the National Spherical Torus Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.R. Wilson; R.E. Bell; S. Bernabei

    2003-02-11

    High Harmonic Fast Wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high-beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [Ono, M., Kaye, S.M., Neumeyer, S., et al., Proceedings of the 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. A radio-frequency (rf) heating system has been installed on NSTX to explore the physics of HHFW heating and current drive via rf waves, and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition, HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode (high-confinement mode) discharges with a large fraction of bootstrap current, and to control the plasma current profile during the early stages of the discharge.

  19. Transport simulation of EAST long-pulse H-mode discharge with integrated modeling

    NASA Astrophysics Data System (ADS)

    Wu, M. Q.; Li, G. Q.; Chen, J. L.; Du, H. F.; Gao, X.; Ren, Q. L.; Li, K.; Chan, Vincent; Pan, C. K.; Ding, S. Y.; Jian, X.; Zhu, X.; Lian, H.; Qian, J. P.; Gong, X. Z.; Zang, Q.; Duan, Y. M.; Liu, H. Q.; Lyu, B.

    2018-04-01

    In the 2017 EAST experimental campaign, a steady-state long-pulse H-mode discharge lasting longer than 100 s was obtained using only radio frequency heating and current drive, with confinement quality slightly better than the standard H-mode, H98y2 ~ 1.1, and stationary peaked electron temperature profiles. Integrated modeling of one long-pulse H-mode discharge from the 2016 EAST experimental campaign has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO under the integrated modeling framework OMFIT. The plasma current is fully noninductively driven with a combination of ~2.2 MW LHW, ~0.3 MW ECH and ~1.1 MW ICRF. The time evolution of the predicted electron and ion temperature profiles agrees closely with the measurements. The plasma current (Ip ~ 0.45 MA) and electron density are kept constant. A steady state is achieved in the integrated modeling, with a bootstrap current fraction of ~28% and an RF-driven current fraction of ~72%. The predicted current density profile matches the experimental one well. Analysis shows that electron cyclotron heating (ECH) contributes strongly to plasma confinement when deposited in the core region, whereas deposition at large radius yields a smaller improvement; a more peaked LHW-driven current profile is also obtained with core deposition. Linear analysis shows that the high-k instability (electron temperature gradient driven modes) is suppressed in the core region, where weak electron internal transport barriers exist. The trapped electron modes dominate in the low-k range and are mainly responsible for driving the electron energy flux. It is found that the ECH heating effect is very localized and is not the main cause of the sustained good confinement; the peaked current density profile has the most important effect on the confinement improvement. Transport analysis of the long-pulse H-mode experiments on EAST will be helpful in designing future experiments.

  20. Extrapolation of the DIII-D high poloidal beta scenario to ITER steady-state using transport modeling

    NASA Astrophysics Data System (ADS)

    McClenaghan, J.; Garofalo, A. M.; Meneghini, O.; Smith, S. P.

    2016-10-01

    Transport modeling of a proposed ITER steady-state scenario based on DIII-D high βP discharges finds that the core confinement may be improved with either sufficient rotation or a negative central shear q-profile. The high poloidal beta scenario is characterized by a large bootstrap current fraction (~80%), which reduces the demands on the external current drive, and a large radius internal transport barrier which is associated with improved normalized confinement. Typical temperature and density profiles from the non-inductive high poloidal beta scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving Q=5 steady-state performance in ITER with ``day one'' H&CD capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. Either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to maintain the temperature and density profiles. This work was supported by the US Department of Energy under DE-FC02-04ER54698.

  1. Impurities in a non-axisymmetric plasma. Transport and effect on bootstrap current

    DOE PAGES

    Mollén, A.; Landreman, M.; Smith, H. M.; ...

    2015-11-20

    Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21 (2014) 042503], which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/nu scaling of the inter-species radial transport coefficient at low collisionality, which arises due to the field term in the inter-species collision operator and is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in the plasma effective charge Zeff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.

  2. The prospects for magnetohydrodynamic stability in advanced tokamak regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manickam, J.; Chance, M.S.; Jardin, S.C.

    1994-05-01

    Stability analysis of advanced regime tokamaks is presented. Here, advanced regimes are defined to include configurations where the ratio of the bootstrap current, I_BS, to the total plasma current, I_p

  3. Integrated modeling of plasma ramp-up in DIII-D ITER-like and high bootstrap current scenario discharges

    NASA Astrophysics Data System (ADS)

    Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; Cfetr Physics Team

    2018-04-01

    Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyro-kinetic simulation. This model has been demonstrated to accurately model low Ip discharges from the EAST tokamak. The time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from the equilibrium reconstruction with Motional Stark effect constraints. The predicted evolution of βN, li, and βP also agrees well with the experiments. For the ITER-like cases, the electron and ion temperature profiles predicted using TGLF_Sat0 agree closely with the experimentally measured profiles, and are demonstrably better than those from other proposed transport models. For the high bootstrap current case, the electron and ion temperature profiles are better predicted by the VX model. It is found that the SAT0 model works well at high IP (>0.76 MA) while the VX model covers a wider range of plasma current (IP > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.

  4. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods

    PubMed Central

    2014-01-01

    Background Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in an RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
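
    The central idea, re-weighting Bayesian-bootstrap replicates of the trial data by external evidence on the effect size, can be sketched as follows. The Dirichlet weighting, the Gaussian external prior and the sampling-importance-resampling step are assumptions of this illustration, not the authors' exact algorithm.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    def bayesian_bootstrap_cea(cost_t, eff_t, cost_c, eff_c,
                               external_prior=stats.norm(0.08, 0.03), n_rep=5000):
        """Bayesian-bootstrap replicates of incremental cost and effect,
        re-weighted by an external prior on the incremental effect. The Dirichlet
        weighting and the Gaussian external prior are illustrative assumptions."""
        def arm_means(cost, eff):
            w = rng.dirichlet(np.ones(len(cost)), size=n_rep)  # one weight vector per replicate
            return w @ cost, w @ eff
        ct, et = arm_means(cost_t, eff_t)
        cc, ec = arm_means(cost_c, eff_c)
        d_cost, d_eff = ct - cc, et - ec
        w = external_prior.pdf(d_eff)                          # external-evidence weight
        keep = rng.choice(n_rep, size=n_rep, replace=True, p=w / w.sum())  # SIR step
        return d_cost[keep], d_eff[keep]

    # Hypothetical trial data: 200 patients per arm.
    cost_t, cost_c = rng.gamma(2.0, 500.0, 200), rng.gamma(2.0, 450.0, 200)
    eff_t, eff_c = rng.normal(0.70, 0.20, 200), rng.normal(0.62, 0.20, 200)
    d_cost, d_eff = bayesian_bootstrap_cea(cost_t, eff_t, cost_c, eff_c)
    print("posterior mean incremental effect:", round(d_eff.mean(), 3))
    print("posterior mean ICER:", round(d_cost.mean() / d_eff.mean(), 1))
    ```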

  5. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods.

    PubMed

    Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling

    2014-06-03

    Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in an RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes.

  6. High internal inductance for steady-state operation in ITER and a reactor

    DOE PAGES

    Ferron, John R.; Holcomb, Christopher T.; Luce, Timothy C.; ...

    2015-06-26

    Increased confinement and ideal stability limits at relatively high values of the internal inductance ($${{\\ell}_{i}}$$) have enabled an attractive scenario for steady-state tokamak operation to be demonstrated in DIII-D. Normalized plasma pressure in the range appropriate for a reactor has been achieved in high elongation and triangularity double-null divertor discharges with $${{\\beta}_{\\text{N}}}\\approx 5$$ at $${{\\ell}_{i}}\\approx 1.3$$, near the ideal $n=1$ kink stability limit calculated without the effect of a stabilizing vacuum vessel wall, with the ideal-wall limit still higher at $${{\\beta}_{\\text{N}}}>5.5$$. Confinement is above the H-mode level with $${{H}_{98\\left(\\text{y},2\\right)}}\\approx 1.8$$. At $${{q}_{95}}\\approx 7.5$$, the current is overdriven, with bootstrap current fraction $${{f}_{\\text{BS}}}\\approx 0.8$$, noninductive current fraction $${{f}_{\\text{NI}}}>1$$ and negative surface voltage. For ITER (which has a single-null divertor shape), operation at $${{\\ell}_{i}}\\approx 1$$ is a promising option with $${{f}_{\\text{BS}}}\\approx 0.5$$ and the remaining current driven externally near the axis where the electron cyclotron current drive efficiency is high. This scenario has been tested in the ITER shape in DIII-D at $${{q}_{95}}=4.8$$, so far reaching $${{f}_{\\text{NI}}}=0.7$$ and $${{f}_{\\text{BS}}}=0.4$$ at $${{\\beta}_{\\text{N}}}\\approx 3.5$$ with performance appropriate for the ITER Q=5 mission, $${{H}_{89}}{{\\beta}_{\\text{N}}}/q_{95}^{2}\\approx 0.3$$. Modeling studies explored how increased current drive power for DIII-D could be applied to maintain a stationary, fully noninductive high $${{\\ell}_{i}}$$ discharge. Lastly, stable solutions in the double-null shape are found without the vacuum vessel wall at $${{\\beta}_{\\text{N}}}=4$$, $${{\\ell}_{i}}=1.07$$ and $${{f}_{\\text{BS}}}=0.5$$, and at $${{\\beta}_{\\text{N}}}=5$$ with the vacuum vessel wall.

  7. Signal detection theory and vestibular perception: III. Estimating unbiased fit parameters for psychometric functions.

    PubMed

    Chaudhuri, Shomesh E; Merfeld, Daniel M

    2013-03-01

    Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
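
    A stripped-down version of the fitting and variance-estimation comparison can be sketched with a cumulative-Gaussian model. The simulated (non-adaptive) data, the log-sigma parameterization and the finite-difference Hessian are assumptions of this sketch; it does not implement the paper's bias-reduced estimator, only a Nelder-Mead fit and the two variance estimates being compared.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    def neg_log_lik(params, x, resp):
        """Bernoulli negative log-likelihood of a cumulative-Gaussian psychometric
        function; the spread is fit as log(sigma) to keep it positive."""
        mu, log_sigma = params
        p = np.clip(norm.cdf(x, loc=mu, scale=np.exp(log_sigma)), 1e-9, 1 - 1e-9)
        return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

    def fit(x, resp):
        return minimize(neg_log_lik, x0=[0.0, 0.0], args=(x, resp),
                        method="Nelder-Mead").x

    def hessian(f, p, h=1e-3):
        """Finite-difference Hessian, used here as the observed information matrix."""
        p = np.asarray(p, dtype=float)
        k = len(p)
        H = np.zeros((k, k))
        for i in range(k):
            for j in range(k):
                pp, pm, mp, mm = (p.copy() for _ in range(4))
                pp[i] += h; pp[j] += h
                pm[i] += h; pm[j] -= h
                mp[i] -= h; mp[j] += h
                mm[i] -= h; mm[j] -= h
                H[i, j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4 * h * h)
        return H

    # Simulated non-adaptive data; true mu = 1.0, sigma = 0.5.
    x = rng.uniform(-1.0, 3.0, 400)
    resp = (rng.random(400) < norm.cdf(x, 1.0, 0.5)).astype(float)
    est = fit(x, resp)

    # Observed-information variance: diagonal of the inverse Hessian at the MLE.
    var_obs = np.diag(np.linalg.inv(hessian(lambda p: neg_log_lik(p, x, resp), est)))

    # Iterative (nonparametric) bootstrap variance for comparison.
    boot = []
    for _ in range(200):
        idx = rng.choice(len(x), size=len(x), replace=True)
        boot.append(fit(x[idx], resp[idx]))
    var_boot = np.array(boot).var(axis=0)

    print("MLE (mu, log sigma):      ", np.round(est, 3))
    print("observed-information var: ", np.round(var_obs, 5))
    print("bootstrap var:            ", np.round(var_boot, 5))
    ```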

  8. Causality constraints in conformal field theory

    DOE PAGES

    Hartman, Thomas; Jain, Sachin; Kundu, Sandipan

    2016-05-17

    Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well known sign constraint on the (∂φ)^4 coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.

  9. Conformal Bootstrap in Mellin Space

    NASA Astrophysics Data System (ADS)

    Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda

    2017-02-01

    We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.

  10. On heat loading, novel divertors, and fusion reactors

    NASA Astrophysics Data System (ADS)

    Kotschenreuther, M.; Valanju, P. M.; Mahajan, S. M.; Wiley, J. C.

    2007-07-01

    The limited thermal power handling capacity of the standard divertors (used in current as well as projected tokamaks) is likely to force extremely high (~90%) radiation fractions frad in tokamak fusion reactors that have heating powers considerably larger than ITER [D. J. Campbell, Phys. Plasmas 8, 2041 (2001)]. Such enormous values of the necessary frad could have serious and debilitating consequences for the core confinement, stability, and dependability of a fusion power reactor, especially in reactors with internal transport barriers. A new class of divertors, called X-divertors (XD), which considerably enhance the divertor thermal capacity through a flaring of the field lines only near the divertor plates, may be necessary and sufficient to overcome these problems and lead to a dependable fusion power reactor with acceptable economics. X-divertors will lower the bar on the necessary confinement, bringing it into the range of present experimental results. The ability to reduce the radiative burden gives the X-divertor a key advantage: lower radiation demands allow sharply peaked density profiles that enhance the bootstrap fraction, creating the possibility of a greatly increased beta at the same normalized beta. The X-divertor thus emerges as a beta-enhancer capable of raising beta by up to roughly a factor of 2.

  11. Modeling of Steady-state Scenarios for the Fusion Nuclear Science Facility, Advanced Tokamak Approach

    NASA Astrophysics Data System (ADS)

    Garofalo, A. M.; Chan, V. S.; Prater, R.; Smith, S. P.; St. John, H. E.; Meneghini, O.

    2013-10-01

    A Fusion Nuclear Science Facility (FNSF) would complement ITER in addressing the community-identified science and technology gaps to a commercially attractive DEMO, including breeding tritium and completing the fuel cycle, qualifying nuclear materials for high fluence, developing suitable materials for the plasma-boundary interface, and demonstrating power extraction. Steady-state plasma operation is highly desirable to address the requirements for fusion nuclear technology testing [1]. The Advanced Tokamak (AT) is a strong candidate for an FNSF as a consequence of its mature physics base, its capability to address the key issues with a more compact device, and its direct relevance to an attractive target power plant. Key features of the AT are fully noninductive current drive, strong plasma cross-section shaping, internal profiles consistent with a high bootstrap fraction, and operation at high beta, typically above the free boundary limit, βN > 3. Work supported by GA IR&D funding, DE-FC02-04ER54698, and DE-FG02-95ER43309.

  12. Solenoid-free plasma start-up in spherical tokamaks

    NASA Astrophysics Data System (ADS)

    Raman, R.; Shevchenko, V. F.

    2014-10-01

    The central solenoid is an intrinsic part of all present-day tokamaks and most spherical tokamaks. The spherical torus (ST) confinement concept is projected to operate at high toroidal beta and at a high fraction of the non-inductive bootstrap current as required for an efficient reactor system. The use of a conventional solenoid in an ST-based fusion nuclear facility is generally believed not to be possible. Solenoid-free plasma start-up is therefore an area of extensive worldwide research activity. Solenoid-free plasma start-up is also relevant to steady-state tokamak operation, as the central transformer coil of a conventional aspect ratio tokamak reactor would be located in a high radiation environment but would be needed only during the initial discharge initiation and current ramp-up phases. Solenoid-free operation also provides greater flexibility in the selection of the aspect ratio and simplifies the reactor design. Plasma start-up methods based on induction from external poloidal field coils, helicity injection and radio frequency current drive have all made substantial progress towards meeting this important need for the ST. Some of these systems will now undergo the final stages of testing in a new generation of large STs, which are scheduled to begin operations during the next two years. This paper reviews research to date on methods for inducing the initial start-up current in STs without reliance on the conventional central solenoid.

  13. Recent results from the electron cyclotron heated plasmas in Tokamak à Configuration Variable (TCV)

    NASA Astrophysics Data System (ADS)

    Henderson, M. A.; Alberti, S.; Angioni, C.; Arnoux, G.; Behn, R.; Blanchard, P.; Bosshard, P.; Camenen, Y.; Coda, S.; Condrea, I.; Goodman, T. P.; Hofmann, F.; Hogge, J.-Ph.; Karpushov, A.; Manini, A.; Martynov, An.; Moret, J.-M.; Nikkola, P.; Nelson-Melby, E.; Pochelon, A.; Porte, L.; Sauter, O.; Ahmed, S. M.; Andrèbe, Y.; Appert, K.; Chavan, R.; Degeling, A.; Duval, B. P.; Etienne, P.; Fasel, D.; Fasoli, A.; Favez, J.-Y.; Furno, I.; Horacek, J.; Isoz, P.; Joye, B.; Klimanov, I.; Lavanchy, P.; Lister, J. B.; Llobet, X.; Magnin, J.-C.; Marlétaz, B.; Marmillod, P.; Martin, Y.; Mayor, J.-M.; Mylnar, J.; Paris, P. J.; Perez, A.; Peysson, Y.; Pitts, R. A.; Raju, D.; Reimerdes, H.; Scarabosio, A.; Scavino, E.; Seo, S. H.; Siravo, U.; Sushkov, A.; Tonetti, G.; Tran, M. Q.; Weisen, H.; Wischmeier, M.; Zabolotsky, A.; Yhuang, G.

    2003-05-01

    In noninductively driven discharges, 0.9 MW of second harmonic (X2) off-axis co-electron cyclotron current drive deposition is combined with 0.45 MW of X2 central heating to create an electron internal transport barrier (eITB) in steady plasma conditions, resulting in a 1.6-fold increase of the confinement time (τEe) over the ITER-98L-mode scaling. The eITB is associated with a reversed shear current profile enhanced by a large bootstrap current fraction (up to 80%) and is sustained for up to 10 current redistribution times. A linear dependence of the confinement improvement on the product of the global shear reversal factor (q0/qmin) and the reversed shear volume (ρ_qmin^2) is shown. In other discharges heated with X2, the sawteeth are destabilized (respectively stabilized) when heating just inside (respectively outside) the q=1 surface. Control of the sawteeth may allow the avoidance of neoclassical tearing modes that can be seeded by the sawtooth instability. Results on H-mode and highly elongated plasmas using the newly completed third harmonic (X3) system and achieving up to 100% absorption are also discussed, along with a comparison of experimental results with the TORAY-GA ray tracing code [K. Matsuda, IEEE Trans. Plasma Sci. PS-17, 6 (1989); R. H. Cohen, Phys. Fluids 30, 2442 (1987)].

  14. Predicting pain relief: Use of pre-surgical trigeminal nerve diffusion metrics in trigeminal neuralgia.

    PubMed

    Hung, Peter S-P; Chen, David Q; Davis, Karen D; Zhong, Jidan; Hodaie, Mojgan

    2017-01-01

    Trigeminal neuralgia (TN) is a chronic neuropathic facial pain disorder that commonly responds to surgery. A proportion of patients, however, do not benefit and suffer ongoing pain. There are currently no imaging tools that permit the prediction of treatment response. To address this paucity, we used diffusion tensor imaging (DTI) to determine whether pre-surgical trigeminal nerve microstructural diffusivities can prognosticate the response to TN treatment. In 31 TN patients and 16 healthy controls, multi-tensor tractography was used to extract DTI-derived metrics (axial diffusivity (AD), radial diffusivity (RD), mean diffusivity (MD), and fractional anisotropy (FA)) from the cisternal segment, root entry zone and pontine segment of the trigeminal nerves for false discovery rate-corrected Student's t-tests. Ipsilateral diffusivities were bootstrap resampled to visualize group-level diffusivity thresholds of long-term response. To obtain an individual-level statistical classifier of surgical response, we conducted discriminant function analysis (DFA) with the type of surgery chosen alongside ipsilateral measurements and ipsilateral/contralateral ratios of AD and RD from all regions of interest as prediction variables. Abnormal diffusivity in the trigeminal pontine fibers, demonstrated by increased AD, highlighted non-responders (n = 14) compared to controls. Bootstrap resampling revealed three ipsilateral diffusivity thresholds of response (pontine AD, MD, cisternal FA) separating 85% of non-responders from responders. DFA produced an 83.9% (71.0% using leave-one-out cross-validation) accurate prognosticator of response that successfully identified 12/14 non-responders. Our study demonstrates that pre-surgical DTI metrics can serve as a highly predictive, individualized tool to prognosticate surgical response. We further highlight abnormal pontine segment diffusivities as key features of treatment non-response and confirm the axiom that central pain does not commonly benefit from peripheral treatments.
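
    The bootstrap thresholding and the discriminant-function classification step can be sketched generically. The synthetic features and the use of scikit-learn's linear discriminant analysis as a stand-in for DFA are assumptions of this sketch, not the study's pipeline.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(4)

    # Hypothetical per-patient features: ipsilateral diffusivity measurements and
    # ipsilateral/contralateral ratios (columns); responders vs non-responders (y).
    X = np.vstack([rng.normal(1.00, 0.1, (17, 6)),    # responders
                   rng.normal(1.15, 0.1, (14, 6))])   # non-responders
    y = np.array([0] * 17 + [1] * 14)

    # Group-level threshold from bootstrap-resampled responder diffusivities:
    # e.g. the 95th percentile of one ipsilateral metric, with its bootstrap spread.
    vals = X[y == 0, 0]
    boot = np.array([np.percentile(rng.choice(vals, vals.size, replace=True), 95)
                     for _ in range(5000)])
    print("threshold ~", boot.mean().round(3), "+/-", boot.std().round(3))

    # Individual-level classifier: LDA as a stand-in for discriminant function
    # analysis, scored with leave-one-out cross-validation as in the abstract.
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
    print("LOOCV accuracy:", acc.mean().round(3))
    ```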

  15. Longitudinal Average Attributable Fraction as a Method for Studying Time-Varying Conditions and Treatments on Recurrent Self-rated Health: The Case of Medications in Older Adults with Multiple Chronic Conditions

    PubMed Central

    Allore, Heather G.; Zhan, Yilei; Tinetti, Mary; Trentalange, Mark; McAvay, Gail

    2015-01-01

    Purpose: The objective is to modify the longitudinal extension of the average attributable fraction (LE-AAF) for recurrent outcomes with time-varying exposures and control for covariates. Methods: We included Medicare Current Beneficiary Survey participants with two or more chronic conditions enrolled from 2005 to 2009 with follow-up through 2011. Nine time-varying medications indicated for nine time-varying common chronic conditions and 14 of 18 forward-selected participant characteristics were used as control variables in the generalized estimating equations step of the LE-AAF to estimate associations with the recurrent universal health outcome, self-rated health (SRH). Modifications of the LE-AAF were made to accommodate these indicated medication-condition interactions and covariates. Variability was estimated empirically by bias-corrected and accelerated bootstrapping. Results: In the adjusted LE-AAF, thiazide, warfarin and clopidogrel had significant contributions of 1.2%, 0.4% and 0.2%, respectively, to low (poor or fair) SRH, while the other medications made no significant contributions to SRH. Hyperlipidemia significantly contributed 4.6% to high SRH. All the other conditions except atrial fibrillation contributed significantly to low SRH. Conclusions: Our modifications to the LE-AAF method apply to a recurrent binary outcome with time-varying factors, accounting for covariates. PMID:26033374
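
    The bias-corrected and accelerated (BCa) bootstrap step can be illustrated on a much simpler statistic than the LE-AAF. The crude attributable fraction and the synthetic exposure/outcome data below are assumptions of this sketch; the full method's GEE adjustment and averaging over orderings are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import bootstrap

    rng = np.random.default_rng(5)

    # Hypothetical binary outcome (low self-rated health) and a binary exposure.
    exposed = rng.binomial(1, 0.30, 400)                 # e.g. taking a given medication
    outcome = rng.binomial(1, 0.20 + 0.08 * exposed)

    def attributable_fraction(exposed, outcome):
        """Crude population attributable fraction, AF = (P(D) - P(D|unexposed)) / P(D).
        A simplification of the LE-AAF used only to illustrate the BCa interval."""
        p_d = outcome.mean()
        p_d0 = outcome[exposed == 0].mean()
        return (p_d - p_d0) / p_d

    res = bootstrap((exposed, outcome), attributable_fraction, paired=True,
                    vectorized=False, n_resamples=2000, method="BCa",
                    random_state=rng)
    print("AF point estimate:", round(attributable_fraction(exposed, outcome), 3))
    print("AF 95% BCa CI:", res.confidence_interval)
    ```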

  16. Integrated modelling of steady-state scenarios and heating and current drive mixes for ITER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murakami, Masanori; Park, Jin Myung; Giruzzi, G.

    2011-01-01

    Recent progress on ITER steady-state (SS) scenario modelling by the ITPA-IOS group is reviewed. Code-to-code benchmarks as the IOS group's common activities for the two SS scenarios (weak shear scenario and internal transport barrier scenario) are discussed in terms of transport, kinetic profiles, and heating and current drive (CD) sources using various transport codes. Weak magnetic shear scenarios integrate the plasma core and edge by combining a theory-based transport model (GLF23) with scaled experimental boundary profiles. The edge profiles (at normalized radius rho = 0.8-1.0) are adopted from an edge-localized mode-averaged analysis of a DIII-D ITER demonstration discharge. A fullymore » noninductive SS scenario is achieved with fusion gain Q = 4.3, noninductive fraction f(NI) = 100%, bootstrap current fraction f(BS) = 63% and normalized beta beta(N) = 2.7 at plasma current I(p) = 8MA and toroidal field B(T) = 5.3 T using ITER day-1 heating and CD capability. Substantial uncertainties come from outside the radius of setting the boundary conditions (rho = 0.8). The present simulation assumed that beta(N)(rho) at the top of the pedestal (rho = 0.91) is about 25% above the peeling-ballooning threshold. ITER will have a challenge to achieve the boundary, considering different operating conditions (T(e)/T(i) approximate to 1 and density peaking). Overall, the experimentally scaled edge is an optimistic side of the prediction. A number of SS scenarios with different heating and CD mixes in a wide range of conditions were explored by exploiting the weak-shear steady-state solution procedure with the GLF23 transport model and the scaled experimental edge. The results are also presented in the operation space for DT neutron power versus stationary burn pulse duration with assumed poloidal flux availability at the beginning of stationary burn, indicating that the long pulse operation goal (3000s) at I(p) = 9 MA is possible. Source calculations in these simulations have been revised for electron cyclotron current drive including parallel momentum conservation effects and for neutral beam current drive with finite orbit and magnetic pitch effects.« less

  17. A comparison of bootstrap methods and an adjusted bootstrap approach for estimating the prediction error in microarray classification.

    PubMed

    Jiang, Wenyu; Simon, Richard

    2007-12-20

    This paper first provides a critical review of some existing methods for estimating the prediction error in classifying microarray data, where the number of genes greatly exceeds the number of specimens. Special attention is given to the bootstrap-related methods. When the sample size n is small, we find that all the reviewed methods suffer from either substantial bias or variability. We introduce a repeated leave-one-out bootstrap (RLOOB) method that predicts for each specimen in the sample using bootstrap learning sets of size ln. We then propose an adjusted bootstrap (ABS) method that fits a learning curve to the RLOOB estimates calculated with different bootstrap learning set sizes. The ABS method is robust across the situations we investigate and provides a slightly conservative estimate of the prediction error. Even with small samples, it does not suffer from the large upward bias of the leave-one-out bootstrap and the 0.632+ bootstrap, nor from the large variability of leave-one-out cross-validation in microarray applications. Copyright (c) 2007 John Wiley & Sons, Ltd.
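
    The record above describes the repeated leave-one-out bootstrap only in words. Below is a minimal sketch of a leave-one-out bootstrap error estimate with an adjustable learning-set size; it is not the authors' RLOOB/ABS implementation, the `loob_error` helper and the synthetic data are illustrative only, and scikit-learn's LogisticRegression stands in for whatever classifier is being evaluated.

```python
# Minimal sketch (not the authors' code): leave-one-out bootstrap error with an
# adjustable learning-set size L, averaged over the specimens that each
# bootstrap learning set leaves out.
import numpy as np
from sklearn.linear_model import LogisticRegression

def loob_error(X, y, learning_size, n_boot=200, seed=0):
    """Average misclassification rate for specimens not drawn into each
    bootstrap learning set of size `learning_size`."""
    rng = np.random.default_rng(seed)
    n = len(y)
    err_sum = np.zeros(n)
    err_cnt = np.zeros(n)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=learning_size)   # bootstrap learning set
        out = np.setdiff1d(np.arange(n), idx)          # specimens left out
        if out.size == 0 or len(np.unique(y[idx])) < 2:
            continue
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        err_sum[out] += clf.predict(X[out]) != y[out]
        err_cnt[out] += 1
    return np.nanmean(err_sum / np.where(err_cnt > 0, err_cnt, np.nan))

# Synthetic "microarray-like" data: few specimens, many genes.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 500))
y = (X[:, 0] + 0.5 * rng.normal(size=30) > 0).astype(int)
for L in (20, 30, 45):       # several learning-set sizes, as in the RLOOB idea
    print(L, round(loob_error(X, y, L), 3))
```

    Fitting a learning curve through such estimates at several learning-set sizes, as the ABS method does, is deliberately left out of this sketch.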

  18. Associations between dietary and lifestyle risk factors and colorectal cancer in the Scottish population.

    PubMed

    Theodoratou, Evropi; Farrington, Susan M; Tenesa, Albert; McNeill, Geraldine; Cetnarskyj, Roseanne; Korakakis, Emmanouil; Din, Farhat V N; Porteous, Mary E; Dunlop, Malcolm G; Campbell, Harry

    2014-01-01

    Colorectal cancer (CRC) accounts for 9.7% of all cancer cases and for 8% of all cancer-related deaths. Established risk factors include personal or family history of CRC as well as lifestyle and dietary factors. We investigated the relationship between CRC and demographic, lifestyle, food and nutrient risk factors through a case-control study that included 2062 patients and 2776 controls from Scotland. Forward and backward stepwise regression was applied and the stability of the models was assessed in 1000 bootstrap samples. The variables that were automatically selected to be included by the forward or backward stepwise regression and whose selection was verified by bootstrap sampling in the current study were family history, dietary energy, 'high-energy snack foods', eggs, juice, sugar-sweetened beverages and white fish (associated with an increased CRC risk) and NSAIDs, coffee and magnesium (associated with a decreased CRC risk). Application of forward and backward stepwise regression in this CRC study identified some already established as well as some novel potential risk factors. Bootstrap findings suggest that examination of the stability of regression models by bootstrap sampling is useful in the interpretation of study findings. 'High-energy snack foods' and high-energy drinks (including sugar-sweetened beverages and fruit juices) as risk factors for CRC have not been reported previously and merit further investigation as such snacks and beverages are important contributors in European and North American diets.
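
    As a rough illustration of the bootstrap stability check described above, the sketch below resamples a synthetic case-control data set and counts how often forward selection picks each variable. Scikit-learn's SequentialFeatureSelector with a logistic-regression base model is a stand-in, not the software used in the study, and all data, sizes, and settings are placeholders.

```python
# Minimal sketch: stability of forward selection across bootstrap samples.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
n, p = 400, 10
X = rng.normal(size=(n, p))
logit = X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))          # synthetic case/control status

n_boot, k = 50, 3          # the paper used 1000 bootstrap samples; 50 here for speed
counts = np.zeros(p)
for _ in range(n_boot):
    idx = rng.integers(0, n, size=n)                   # bootstrap resample of subjects
    sfs = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                    n_features_to_select=k,
                                    direction="forward", cv=3)
    sfs.fit(X[idx], y[idx])
    counts += sfs.get_support()                        # which variables were selected

print("selection frequency per variable:", counts / n_boot)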

  19. Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice

    DTIC Science & Technology

    2003-06-01

    …however, and the current "second generation" of microkernel implementations has resulted in significantly better performance. Of note is the L4 microkernel. [Remainder of this excerpt is table-of-contents residue: c. GEMSOS Kernel; d. L4 Microkernel; VI. Conclusions.]

  20. Fast, Exact Bootstrap Principal Component Analysis for p > 1 million

    PubMed Central

    Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim

    2015-01-01

    Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
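
    The subspace trick described in this record can be sketched in a few lines of NumPy. The code below is a simplified illustration, not the authors' released implementation: it centers the data once on the original sample and skips the per-resample re-centering and the richer uncertainty summaries handled in the paper.

```python
# Minimal sketch: all bootstrap PCs lie in the n-dimensional column space of the
# original (p x n) data matrix Y, so each bootstrap SVD is only n x n.
import numpy as np

rng = np.random.default_rng(0)
p, n, K, B = 5000, 50, 3, 200
Y = rng.normal(size=(p, n))                 # columns = subjects
Y -= Y.mean(axis=1, keepdims=True)          # center once on the original sample

U, d, Vt = np.linalg.svd(Y, full_matrices=False)   # Y = U @ diag(d) @ Vt, U is p x n
A = np.diag(d) @ Vt                                # n x n coordinates of Y in basis U

coords = np.empty((B, n, K))                # low-dimensional bootstrap components
for b in range(B):
    idx = rng.integers(0, n, size=n)        # resample subjects (columns): Y[:, idx] = U @ A[:, idx]
    Ua, _, _ = np.linalg.svd(A[:, idx], full_matrices=False)
    signs = np.sign(np.diag(Ua)[:K])        # align with original PCs (identity coords in basis U)
    coords[b] = Ua[:, :K] * signs

# Uncertainty metrics can be computed from these n x K coordinates alone;
# the full p-dimensional bootstrap components would be U @ coords[b] if needed.
se_lowdim = coords.std(axis=0)
print(se_lowdim.shape)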

  1. A bootstrap algorithm for temporal signal reconstruction in the presence of noise from its fractional Fourier transformed intensity spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Cheng-Yang (Fermilab)

    2011-02-01

    A bootstrap algorithm for reconstructing the temporal signal from four of its fractional Fourier intensity spectra in the presence of noise is described. An optical arrangement is proposed which realises the bootstrap method for the measurement of ultrashort laser pulses. The measurement of short laser pulses of less than 1 ps is an ongoing challenge in optical physics. One reason is that no oscilloscope exists today which can directly measure the time structure of these pulses, so it becomes necessary to invent other techniques which indirectly provide the information needed for temporal pulse reconstruction. One method, called FROG (frequency resolved optical gating), has been in use since 1991 and is one of the popular methods for recovering these types of short pulses. The idea behind FROG is the use of multiple time-correlated pulse measurements in the frequency domain for the reconstruction. Multiple data sets are required because only intensity information is recorded and not phase, and by collecting multiple data sets there are enough redundant measurements to yield the original time structure, although not necessarily uniquely (or even up to an arbitrary constant phase offset). The objective of this paper is to describe another method which is simpler than FROG. Instead of collecting many auto-correlated data sets, only two spectral intensity measurements of the temporal signal are needed in the absence of noise. The first can come from the intensity components of its usual Fourier transform and the second from its FrFT (fractional Fourier transform). In the presence of noise, a minimum of four measurements are required, with the same FrFT order but with two different apertures. Armed with these two or four measurements, a unique solution up to a constant phase offset can be constructed.

  2. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications

    PubMed Central

    Chaibub Neto, Elias

    2015-01-01

    In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson’s sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling. PMID:26125965
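
    The multinomial-weighting formulation described above is easy to sketch for Pearson's correlation: draw a B x n matrix of multinomial counts and compute every replication with a few matrix operations. The snippet below is a minimal NumPy illustration with made-up data, not the authors' R implementation.

```python
# Minimal sketch: bootstrap replications of Pearson's correlation from a weight
# matrix, with no explicit resampling of the data.
import numpy as np

rng = np.random.default_rng(0)
n, B = 100, 10000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

# Each row of W holds multinomial counts: how many times each observation
# appears in that bootstrap resample.
W = rng.multinomial(n, np.full(n, 1.0 / n), size=B)        # B x n

mx = W @ x / n                                             # bootstrap means
my = W @ y / n
cx = x - mx[:, None]                                       # B x n centered copies
cy = y - my[:, None]
cov = np.sum(W * cx * cy, axis=1) / n
r = cov / np.sqrt((np.sum(W * cx**2, axis=1) / n) * (np.sum(W * cy**2, axis=1) / n))

print("bootstrap SE of r:", r.std(ddof=1))
print("95% percentile interval:", np.percentile(r, [2.5, 97.5]))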

  3. Physics Basis for the Advanced Tokamak Fusion Power Plant ARIES-AT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.C. Jardin; C.E. Kessel; T.K. Mau

    2003-10-07

    The advanced tokamak is considered as the basis for a fusion power plant. The ARIES-AT design has an aspect ratio of A ≡ R/a = 4.0, an elongation and triangularity of κ = 2.20 and δ = 0.90 (evaluated at the separatrix surface), and a toroidal beta of β = 9.1% (normalized to the vacuum toroidal field at the plasma center), which corresponds to a normalized beta of β_N ≡ 100 × β/(I_P(MA)/[a(m) B_T(T)]) = 5.4. These beta values are chosen to be 10% below the ideal-MHD stability limit. The bootstrap-current fraction is f_BS ≡ I_BS/I_P = 0.91. This leads to a design with total plasma current I_P = 12.8 MA, and toroidal fields of 11.1 T (at the coil edge) and 5.8 T (at the plasma center). The major and minor radii are 5.2 and 1.3 m, respectively. The effects of H-mode edge gradients and the stability of this configuration against non-ideal modes are analyzed. The current-drive system consists of ICRF/FW for on-axis current drive and a lower-hybrid system for off-axis current drive. Transport projections are presented using the drift-wave-based GLF23 model. The approach to power and particle exhaust using both plasma core and scrape-off-layer radiation is presented.
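
    As a quick arithmetic check, the quoted normalized beta follows from the other numbers in the abstract via the standard definition β_N = β(%)·a(m)·B_T(T)/I_P(MA); the short script below simply carries out that division (values taken from the abstract).

```python
# Consistency check of the quoted ARIES-AT parameters, assuming the standard
# definition beta_N = beta(%) * a(m) * B_T(T) / I_P(MA).
beta_pct, a, B_T, I_P = 9.1, 1.3, 5.8, 12.8            # from the abstract
beta_N = beta_pct * a * B_T / I_P
print(round(beta_N, 2))                                # ~5.36, matching the quoted 5.4

f_BS = 0.91                                            # bootstrap-current fraction
print(round(f_BS * I_P, 1), "MA of bootstrap current") # ~11.6 MA of the 12.8 MA total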

  4. Bootstrapping the O(N) archipelago

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kos, Filip; Poland, David; Simmons-Duffin, David

    2015-11-17

    We study 3d CFTs with an O(N) global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension O(N) vector Φ_i and the lowest dimension O(N) singlet s, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions (Δ_Φ, Δ_s) to lie inside small islands. Here, we also make rigorous determinations of current two-point functions in the O(2) and O(3) models, with applications to transport in condensed matter systems.

  5. Not seeing the forest for the trees: size of the minimum spanning trees (MSTs) forest and branch significance in MST-based phylogenetic analysis.

    PubMed

    Teixeira, Andreia Sofia; Monteiro, Pedro T; Carriço, João A; Ramirez, Mário; Francisco, Alexandre P

    2015-01-01

    Trees, including minimum spanning trees (MSTs), are commonly used in phylogenetic studies. But, for the research community, it may be unclear that the presented tree is just a hypothesis, chosen from among many possible alternatives. In this scenario, it is important to quantify our confidence in both the trees and the branches/edges included in such trees. In this paper, we address this problem for MSTs by introducing a new edge betweenness metric for undirected and weighted graphs. This spanning edge betweenness metric is defined as the fraction of equivalent MSTs where a given edge is present. The metric provides a per edge statistic that is similar to that of the bootstrap approach frequently used in phylogenetics to support the grouping of taxa. We provide methods for the exact computation of this metric based on the well known Kirchhoff's matrix tree theorem. Moreover, we implement and make available a module for the PHYLOViZ software and evaluate the proposed metric concerning both effectiveness and computational performance. Analysis of trees generated using multilocus sequence typing data (MLST) and the goeBURST algorithm revealed that the space of possible MSTs in real data sets is extremely large. Selection of the edge to be represented using bootstrap could lead to unreliable results since alternative edges are present in the same fraction of equivalent MSTs. The choice of the MST to be presented, results from criteria implemented in the algorithm that must be based in biologically plausible models.

  6. Copula based prediction models: an application to an aortic regurgitation study

    PubMed Central

    Kumar, Pranesh; Shoukri, Mohamed M

    2007-01-01

    Background: An important issue in prediction modeling of multivariate data is the measure of dependence structure. The use of Pearson's correlation as a dependence measure has several pitfalls, and hence regression prediction models based on this correlation may not be an appropriate methodology. As an alternative, a copula-based methodology for prediction modeling and an algorithm to simulate data are proposed. Methods: The method consists of introducing copulas as an alternative to the correlation coefficient commonly used as a measure of dependence. An algorithm based on the marginal distributions of random variables is applied to construct the Archimedean copulas. Monte Carlo simulations are carried out to replicate datasets, estimate prediction model parameters and validate them using Lin's concordance measure. Results: We have carried out a correlation-based regression analysis on data from 20 patients aged 17–82 years on pre-operative and post-operative ejection fractions after surgery and estimated the prediction model: Post-operative ejection fraction = -0.0658 + 0.8403 × (Pre-operative ejection fraction); p = 0.0008; 95% confidence interval of the slope coefficient (0.3998, 1.2808). From the exploratory data analysis, it is noted that both the pre-operative and post-operative ejection fraction measurements have slight departures from symmetry and are skewed to the left. It is also noted that the measurements tend to be widely spread and have shorter tails compared to the normal distribution. Therefore predictions made from the correlation-based model corresponding to pre-operative ejection fraction measurements in the lower range may not be accurate. Further, it is found that the best approximated marginal distributions of pre-operative and post-operative ejection fractions (using q-q plots) are gamma distributions. The copula-based prediction model is estimated as: Post-operative ejection fraction = -0.0933 + 0.8907 × (Pre-operative ejection fraction); p = 0.00008; 95% confidence interval for the slope coefficient (0.4810, 1.3003). The two models' predicted post-operative ejection fractions differ considerably in the lower range of pre-operative ejection fraction measurements, and the prediction errors of the copula model are smaller. To validate the copula methodology we have re-sampled with replacement fifty independent bootstrap samples and have estimated concordance statistics of 0.7722 (p = 0.0224) for the copula model and 0.7237 (p = 0.0604) for the correlation model. The predicted and observed measurements are concordant for both models. The estimates of the accuracy components are 0.9233 and 0.8654 for the copula and correlation models, respectively. Conclusion: Copula-based prediction modeling is demonstrated to be an appropriate alternative to conventional correlation-based prediction modeling, since correlation-based prediction models are not appropriate for modeling the dependence in populations with asymmetrical tails. The proposed copula-based prediction model has been validated using the independent bootstrap samples. PMID:17573974

  7. Parameter exploration for a Compact Advanced Tokamak DEMO

    NASA Astrophysics Data System (ADS)

    Weisberg, D. B.; Buttery, R. J.; Ferron, J. R.; Garofalo, A. M.; Snyder, P. B.; Turnbull, A. D.; Holcomb, C. T.; McClenaghan, J.; Canik, J.; Park, J.-M.

    2017-10-01

    A new parameter study has explored a range of design points to assess the physics feasibility for a compact 200MWe advanced tokamak DEMO that combines high beta (βN < 4) and high toroidal field (BT = 6 - 7 T). A unique aspect of this study is the use of a FASTRAN modeling suite that combines integrated transport, pedestal, stability, and heating & current drive calculations to predict steady-state solutions with neutral beam and helicon powered current drive. This study has identified a range of design solutions in a compact (R0 = 4 m), high-field (BT = 6 - 7 T), strongly-shaped (κ = 2 , δ = 0.6) device. Unlike previous proposals, C-AT DEMO takes advantage of high-beta operation as well as emerging advances in magnet technology to demonstrate net electric production in a moderately sized machine. We present results showing that the large bootstrap fraction and low recirculating power enabled by high normalized beta can achieve tolerable heat and neutron load with good H-mode access. The prediction of operating points with simultaneously achieved high-confinement (H98 < 1.3), high-density (fGW < 1.3), and high-beta warrants additional assessment of this approach towards a cost-attractive DEMO device. Work supported by the US DOE under DE-FC02-04ER54698.

  8. Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions

    ERIC Educational Resources Information Center

    Padilla, Miguel A.; Divers, Jasmin

    2013-01-01

    The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…

  9. Tests of Independence for Ordinal Data Using Bootstrap.

    ERIC Educational Resources Information Center

    Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai

    1998-01-01

    Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)

  10. Does Bootstrap Procedure Provide Biased Estimates? An Empirical Examination for a Case of Multiple Regression.

    ERIC Educational Resources Information Center

    Fan, Xitao

    This paper empirically and systematically assessed the performance of bootstrap resampling procedure as it was applied to a regression model. Parameter estimates from Monte Carlo experiments (repeated sampling from population) and bootstrap experiments (repeated resampling from one original bootstrap sample) were generated and compared. Sample…

  11. Using the Descriptive Bootstrap to Evaluate Result Replicability (Because Statistical Significance Doesn't)

    ERIC Educational Resources Information Center

    Spinella, Sarah

    2011-01-01

    As result replicability is essential to science and difficult to achieve through external replicability, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…

  12. Performance of Bootstrapping Approaches To Model Test Statistics and Parameter Standard Error Estimation in Structural Equation Modeling.

    ERIC Educational Resources Information Center

    Nevitt, Jonathan; Hancock, Gregory R.

    2001-01-01

    Evaluated the bootstrap method under varying conditions of nonnormality, sample size, model specification, and number of bootstrap samples drawn from the resampling space. Results for the bootstrap suggest the resampling-based method may be conservative in its control over model rejections, thus having an impact on the statistical power associated…

  13. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    PubMed

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  14. Resilient Diffusive Clouds

    DTIC Science & Technology

    2017-02-01

    scale blade servers (Dell PowerEdge) [20]. It must be recognized however, that the findings are distributed over this collection of architectures not...current operating system designs run into millions of lines of code. Moreover, they compound the opportunity for compromise by granting device drivers...properties (e.g. IP & MAC address) so as to invalidate an adversary’s surveillance data. The current running and bootstrapping instances of the micro

  15. Lung cancer signature biomarkers: tissue specific semantic similarity based clustering of digital differential display (DDD) data.

    PubMed

    Srivastava, Mousami; Khurana, Pankaj; Sugadev, Ragumani

    2012-11-02

    The tissue-specific Unigene sets derived from more than one million expressed sequence tags (ESTs) in the NCBI GenBank database offer a platform for identifying significantly and differentially expressed tissue-specific genes by in-silico methods. Digital differential display (DDD) rapidly creates transcription profiles based on EST comparisons and numerically calculates, as a fraction of the pool of ESTs, the relative sequence abundance of known and novel genes. However, the process of identifying the most likely tissue for a specific disease in which to search for candidate genes from the pool of differentially expressed genes remains difficult. Therefore, we have used the Gene Ontology semantic similarity score to measure the GO similarity between gene products of lung tissue-specific candidate genes from control (normal) and disease (cancer) sets. This semantic similarity score matrix, subjected to hierarchical clustering, is represented in the form of a dendrogram. The stability of the dendrogram clusters was assessed by multiple bootstrapping, which also computes a p-value for each cluster and corrects the bias of the bootstrap probability. Subsequent hierarchical clustering by the multiple bootstrapping method (α = 0.95) identified seven clusters. The comparative, as well as subtractive, approach revealed a set of 38 biomarkers comprising four distinct lung cancer signature biomarker clusters (panels 1-4). Further gene enrichment analysis of the four panels revealed that each panel represents a set of lung cancer-linked biomarkers: metastasis diagnostic biomarkers (panel 1), chemotherapy/drug resistance biomarkers (panel 2), hypoxia-regulated biomarkers (panel 3) and lung extracellular matrix biomarkers (panel 4). Expression analysis reveals that the hypoxia-induced lung cancer-related biomarkers (panel 3), HIF and its modulating proteins (TGM2, CSNK1A1, CTNNA1, NAMPT/Visfatin, TNFRSF1A, ETS1, SRC-1, FN1, APLP2, DMBT1/SAG, AIB1 and AZIN1), are significantly down regulated. All down regulated genes in this panel were highly up regulated in most other types of cancers. These panels of proteins may represent signature biomarkers for lung cancer and will aid in lung cancer diagnosis and disease monitoring as well as in the prediction of responses to therapeutics.

  16. Quasi-Axially Symmetric Stellarators with 3 Field Periods

    NASA Astrophysics Data System (ADS)

    Garabedian, Paul; Ku, Long-Poe

    1998-11-01

    Compact hybrid configurations with 2 field periods have been studied recently as candidates for a proof of principle experiment at PPPL, cf. A. Reiman et al., Physics design of a high beta quasi-axially symmetric stellarator, J. Plas. Fus. Res. SERIES 1, 429(1998). This enterprise has led us to the discovery of a family of quasi-axially symmetric stellarators with 3 field periods that seem to have significant advantages, although their aspect ratios are a little larger. They have reversed shear and perform better in a local analysis of ballooning modes. Nonlinear equilibrium and stability calculations predict that the average beta limit may be as high as 6% if the bootstrap current turns out to be as big as that expected in comparable tokamaks. The concept relies on a combination of helical fields and bootstrap current to achieve adequate rotational transform at low aspect ratio. A detailed manuscript describing some of this work will be published soon, cf. P.R. Garabedian, Quasi-axially symmetric stellarators, Proc. Natl. Acad. Sci. USA 95 (1998).

  17. The Success of Linear Bootstrapping Models: Decision Domain-, Expertise-, and Criterion-Specific Meta-Analysis

    PubMed Central

    Kaufmann, Esther; Wittmann, Werner W.

    2016-01-01

    The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085

  18. Efficient bootstrap estimates for tail statistics

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Aarnes, Ole Johan

    2017-03-01

    Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
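
    The paper's central observation, that bootstrapping a tail statistic only requires resampling the largest entries, can be illustrated with a toy experiment. The sketch below is my own construction, not the authors' code: it compares the bootstrap distribution of the m-th largest value obtained from full-sample resampling with one obtained by resampling only the top k entries, where the number of draws falling in the top k is binomial.

```python
# Toy illustration: draws from outside the top k cannot enter the top m of a
# resample (provided at least m of the n draws land in the top k, essentially
# guaranteed for k >> m), so resampling the top-k subset suffices.
import numpy as np

rng = np.random.default_rng(0)
n, k, m, B = 50_000, 500, 10, 1000        # statistic: the m-th largest value
x = rng.gumbel(size=n)                    # sample with a heavy-ish upper tail
top = np.sort(x)[-k:]                     # keep only the k largest entries

def mth_largest(a, m):
    return np.partition(a, -m)[-m]

# (a) classical bootstrap: resample all n observations
full = np.array([mth_largest(x[rng.integers(0, n, n)], m) for _ in range(B)])

# (b) subset bootstrap: only the number of draws hitting the top k is random
hits = rng.binomial(n, k / n, size=B)
sub = np.array([mth_largest(top[rng.integers(0, k, h)], m) for h in hits])

print("full-sample bootstrap 95% interval:", np.percentile(full, [2.5, 97.5]))
print("top-k subset bootstrap 95% interval:", np.percentile(sub, [2.5, 97.5]))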

  19. How does public opinion become extreme?

    NASA Astrophysics Data System (ADS)

    Ramos, Marlon; Shao, Jia; Reis, Saulo D. S.; Anteneodo, Celia; Andrade, José S.; Havlin, Shlomo; Makse, Hernán A.

    2015-05-01

    We investigate the emergence of extreme opinion trends in society by employing statistical physics modeling and analysis on polls that inquire about a wide range of issues such as religion, economics, politics, abortion, extramarital sex, books, movies, and electoral vote. The surveys lay out a clear indicator of the rise of extreme views. The precursor is a nonlinear relation between the fraction of individuals holding a certain extreme view and the fraction of individuals that includes also moderates, e.g., in politics, those who are “very conservative” versus “moderate to very conservative” ones. We propose an activation model of opinion dynamics with interaction rules based on the existence of individual “stubbornness” that mimics empirical observations. According to our modeling, the onset of nonlinearity can be associated to an abrupt bootstrap-percolation transition with cascades of extreme views through society. Therefore, it represents an early-warning signal to forecast the transition from moderate to extreme views. Moreover, by means of a phase diagram we can classify societies according to the percolative regime they belong to, in terms of critical fractions of extremists and people’s ties.

  20. How does public opinion become extreme?

    PubMed Central

    Ramos, Marlon; Shao, Jia; Reis, Saulo D. S.; Anteneodo, Celia; Andrade, José S.; Havlin, Shlomo; Makse, Hernán A.

    2015-01-01

    We investigate the emergence of extreme opinion trends in society by employing statistical physics modeling and analysis on polls that inquire about a wide range of issues such as religion, economics, politics, abortion, extramarital sex, books, movies, and electoral vote. The surveys lay out a clear indicator of the rise of extreme views. The precursor is a nonlinear relation between the fraction of individuals holding a certain extreme view and the fraction of individuals that includes also moderates, e.g., in politics, those who are “very conservative” versus “moderate to very conservative” ones. We propose an activation model of opinion dynamics with interaction rules based on the existence of individual “stubbornness” that mimics empirical observations. According to our modeling, the onset of nonlinearity can be associated to an abrupt bootstrap-percolation transition with cascades of extreme views through society. Therefore, it represents an early-warning signal to forecast the transition from moderate to extreme views. Moreover, by means of a phase diagram we can classify societies according to the percolative regime they belong to, in terms of critical fractions of extremists and people’s ties. PMID:25989484

  1. How does public opinion become extreme?

    PubMed

    Ramos, Marlon; Shao, Jia; Reis, Saulo D S; Anteneodo, Celia; Andrade, José S; Havlin, Shlomo; Makse, Hernán A

    2015-05-19

    We investigate the emergence of extreme opinion trends in society by employing statistical physics modeling and analysis on polls that inquire about a wide range of issues such as religion, economics, politics, abortion, extramarital sex, books, movies, and electoral vote. The surveys lay out a clear indicator of the rise of extreme views. The precursor is a nonlinear relation between the fraction of individuals holding a certain extreme view and the fraction of individuals that includes also moderates, e.g., in politics, those who are "very conservative" versus "moderate to very conservative" ones. We propose an activation model of opinion dynamics with interaction rules based on the existence of individual "stubbornness" that mimics empirical observations. According to our modeling, the onset of nonlinearity can be associated to an abrupt bootstrap-percolation transition with cascades of extreme views through society. Therefore, it represents an early-warning signal to forecast the transition from moderate to extreme views. Moreover, by means of a phase diagram we can classify societies according to the percolative regime they belong to, in terms of critical fractions of extremists and people's ties.

  2. What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum

    PubMed Central

    Hesterberg, Tim C.

    2015-01-01

    Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
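
    One of the article's pedagogical points, that the percentile bootstrap interval can be less accurate than a t-interval for small samples, can be checked with a short simulation. The sketch below uses a skewed (exponential) population and made-up sizes; it illustrates the point and is not material from the article's supplement.

```python
# Small-sample coverage comparison for the mean of a skewed population:
# classical t-interval vs nonparametric percentile bootstrap interval.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, B, reps, true_mean = 10, 2000, 2000, 1.0

cover_t = cover_pct = 0
for _ in range(reps):
    x = rng.exponential(scale=1.0, size=n)
    # classical t-interval
    lo, hi = stats.t.interval(0.95, df=n - 1, loc=x.mean(), scale=stats.sem(x))
    cover_t += lo <= true_mean <= hi
    # nonparametric percentile bootstrap interval
    boots = rng.choice(x, size=(B, n), replace=True).mean(axis=1)
    plo, phi = np.percentile(boots, [2.5, 97.5])
    cover_pct += plo <= true_mean <= phi

print("t-interval coverage:          ", cover_t / reps)
print("percentile bootstrap coverage:", cover_pct / reps)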

  3. The Physics of Basis For A Conservative Physics And Conservative Technology Tokamak Power Plant, ARIES-ACT2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kessel, C. E.; Poli, F. M.

    2014-03-04

    The conservative physics and conservative technology tokamak power plant ARIES-ACT2 has a major radius of 9.75 m at an aspect ratio of 4.0, with strong shaping: elongation of 2.2 and triangularity of 0.63. The no-wall βN reaches ~2.4, limited by the n=1 external kink mode, and can be extended to 3.2 with a stabilizing shell behind the ring structure shield. The bootstrap current fraction is 77% with a q95 of 8.0, requiring about 4.0 MA of external current drive. This current is supplied with 30 MW of ICRF/FW and 80 MW of negative-ion NB. Up to 1.0 MA can be driven with LH with no wall, and 1.5 MA or more can be driven with a stabilizing shell. EC was examined and is most effective for safety factor control over ρ ~ 0.2-0.6 with 20 MW. The pedestal density is ~0.65×10^20 m^-3 and the pedestal temperature is ~9.0 keV. The H98 factor is 1.25, n/n_Gr = 1.3, and the ratio of net power to the L-H threshold power is 1.3-1.4 in the flattop. Due to the high toroidal field and high central temperature, the cyclotron radiation loss was found to be high, depending on the first wall reflectivity.

  4. The PIT-trap-A "model-free" bootstrap procedure for inference about regression models with discrete, multivariate responses.

    PubMed

    Warton, David I; Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.

  5. Bootstrap Percolation on Homogeneous Trees Has 2 Phase Transitions

    NASA Astrophysics Data System (ADS)

    Fontes, L. R. G.; Schonmann, R. H.

    2008-09-01

    We study the threshold-θ bootstrap percolation model on the homogeneous tree with degree b+1, 2 ≤ θ ≤ b, and initial density p. It is known that there exists a nontrivial critical value for p, which we call p_f, such that (a) for p > p_f, the final bootstrapped configuration is fully occupied for almost every initial configuration, and (b) if p < p_f, then for almost every initial configuration, the final bootstrapped configuration has density of occupied vertices less than 1. In this paper, we establish the existence of a distinct critical value for p, p_c, such that 0 < p_c < p_f, with the following properties: (1) if p ≤ p_c, then for almost every initial configuration there is no infinite cluster of occupied vertices in the final bootstrapped configuration; (2) if p > p_c, then for almost every initial configuration there are infinite clusters of occupied vertices in the final bootstrapped configuration. Moreover, we show that (3) for p < p_c, the distribution of the occupied cluster size in the final bootstrapped configuration has an exponential tail; (4) at p = p_c, the expected occupied cluster size in the final bootstrapped configuration is infinite; (5) the probability of percolation of occupied vertices in the final bootstrapped configuration is continuous on [0, p_f] and analytic on (p_c, p_f), admitting an analytic continuation from the right at p_c and, only in the case θ = b, also from the left at p_f.
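
    A finite-tree simulation makes the threshold-θ dynamics concrete, although the theorems above concern the infinite homogeneous tree and a finite rooted b-ary tree only approximates it (the root and the leaves have lower degree). The sketch below, with arbitrary parameter choices, simply iterates the occupation rule to its fixed point and reports the final occupied density.

```python
# Toy simulation of threshold-theta bootstrap percolation on a rooted b-ary tree.
import numpy as np

def bootstrap_percolation(b=3, theta=2, depth=8, p=0.55, seed=0):
    rng = np.random.default_rng(seed)
    n = (b ** (depth + 1) - 1) // (b - 1)              # number of vertices
    parent = [-1] + [(i - 1) // b for i in range(1, n)]
    children = [[] for _ in range(n)]
    for i in range(1, n):
        children[parent[i]].append(i)

    occupied = rng.random(n) < p                       # initial configuration
    changed = True
    while changed:                                     # iterate the rule to its fixed point
        changed = False
        for v in range(n):
            if not occupied[v]:
                nbrs = children[v] + ([parent[v]] if parent[v] >= 0 else [])
                if sum(occupied[u] for u in nbrs) >= theta:
                    occupied[v] = True
                    changed = True
    return occupied.mean()

for p in (0.3, 0.5, 0.6, 0.7):
    print(p, round(bootstrap_percolation(p=p), 3))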

  6. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, valid type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
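
    A minimal sketch of the cluster-bootstrap idea (resample whole clusters with replacement, refit, and take the standard deviation of the refitted coefficient) is given below. It assumes the Python lifelines package for the Cox fit, uses synthetic clustered data with a latent frailty, and shows only the cluster-level scheme, not the two-step variant.

```python
# Minimal sketch of the cluster bootstrap for a Cox regression SE (lifelines assumed).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n_clusters, m = 50, 10                       # 50 clusters of 10 subjects
frailty = rng.normal(0, 0.5, n_clusters)     # latent cluster effect, ignored by the Cox model
rows = []
for c in range(n_clusters):
    x = rng.normal(size=m)                   # individual-level covariate
    t = rng.exponential(1.0 / np.exp(0.5 * x + frailty[c]))
    rows.append(pd.DataFrame({"cluster": c, "x": x,
                              "T": np.minimum(t, 3.0), "E": t <= 3.0}))
df = pd.concat(rows, ignore_index=True)

def fit_beta(data):
    cph = CoxPHFitter().fit(data[["x", "T", "E"]], duration_col="T", event_col="E")
    return cph.params_["x"]

# Cluster bootstrap: resample whole clusters with replacement, refit, take the SD.
betas = []
for _ in range(200):
    picks = rng.integers(0, n_clusters, n_clusters)
    boot = pd.concat([df[df["cluster"] == c] for c in picks], ignore_index=True)
    betas.append(fit_beta(boot))

naive = CoxPHFitter().fit(df[["x", "T", "E"]], "T", "E").standard_errors_["x"]
print("naive Cox SE:", round(naive, 4))
print("cluster-bootstrap SE:", round(np.std(betas, ddof=1), 4))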

  7. Visuospatial bootstrapping: implicit binding of verbal working memory to visuospatial representations in children and adults.

    PubMed

    Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J

    2014-03-01

    When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding-a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerge independent of the development of basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. A bootstrap based space-time surveillance model with an application to crime occurrences

    NASA Astrophysics Data System (ADS)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significant tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry database, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.

  9. Augmenting Literacy: The Role of Expertise in Digital Writing

    ERIC Educational Resources Information Center

    Van Ittersum, Derek

    2011-01-01

    This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…

  10. Variance Estimation Using Replication Methods in Structural Equation Modeling with Complex Sample Data

    ERIC Educational Resources Information Center

    Stapleton, Laura M.

    2008-01-01

    This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…

  11. What You See Is What You Get!

    ERIC Educational Resources Information Center

    Harrison, David

    1979-01-01

    The issue of observability and the relative roles of the senses and reason in understanding the world is reviewed. Eastern "mystical" philosophy serves as a focus in which interpretations of quantum mechanics, as well as the current bootstrap-quark controversy are seen in some slightly different contexts. (Author/GA)

  12. Aspect ratio effects on neoclassical tearing modes from comparison between DIII-D and National Spherical Torus Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    La Haye, R. J.; Buttery, R. J.; Gerhardt, S. P.

    Neoclassical tearing mode islands are sustained by helically perturbed bootstrap currents arising at finite beta from toroidal effects that trap a fraction of the particles in non-circulating orbits. DIII-D and NSTX are here operated with similar shape and cross-sectional area but almost a factor of two difference in inverse aspect ratio a/R. In these experiments, destabilized n=1 tearing modes were self-stabilized (reached the 'marginal point') by reducing neutral-beam power and thus beta. The measure of the marginal island gives information on the small-island stabilizing physics that in part (with seeding) governs onset. The marginal island width on NSTX is found to be about three times the ion banana width and agrees with that measured in DIII-D, except for DIII-D modes closer to the magnetic axis, which are about two times the ion banana width. There is a balance of the helically perturbed bootstrap term with small island effects with the sum of the classical and curvature terms in the modified Rutherford equation for tearing-mode stability at the experimental marginal point. Empirical evaluation of this sum indicates that while the stabilizing effect of the curvature term is negligible in DIII-D, it is important in NSTX. The mode temporal behavior from the start of neutral-beam injection reduction also suggests that NSTX operates closer to marginal classical tearing stability; this explains why there is little hysteresis in beta between mode onset, saturation, and self-stabilization (while DIII-D has large hysteresis in beta). NIMROD code module component calculations based on DIII-D and NSTX reconstructed experimental equilibria are used to diagnose and confirm the relative importance of the stabilizing curvature effect, an advantage for low aspect ratio; the relatively greater curvature effect makes for less susceptibility to NTM onset even if the classical tearing stability index is near marginal.

  13. An SAS Macro for Implementing the Modified Bollen-Stine Bootstrap for Missing Data: Implementing the Bootstrap Using Existing Structural Equation Modeling Software

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2005-01-01

    The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…

  14. Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.

    PubMed

    Wang, Zuozhen

    2018-01-01

    The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial based on a relatively small sample. In this paper, sample size estimation for comparing two parallel-design arms with continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculations by mathematical formulas (under the normal distribution assumption) for the identical data are also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods based on data that violate the normal distribution assumption. To accommodate the features of the data, the nonparametric Wilcoxon test was applied to compare the two groups during the bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by the bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation at the outset, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
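
    The bootstrap power/sample-size procedure described in this abstract can be illustrated as follows: resample candidate-sized arms from historical (pilot) data, apply the same test planned for the analysis (here the Wilcoxon rank-sum / Mann-Whitney test for skewed data), and take the rejection rate as the power. The pilot samples and sizes below are synthetic placeholders.

```python
# Minimal sketch: bootstrap-based power estimation for a two-arm comparison
# using the Wilcoxon rank-sum (Mann-Whitney) test on skewed data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
pilot_a = rng.lognormal(mean=0.0, sigma=0.8, size=60)    # stand-in historical control arm
pilot_b = rng.lognormal(mean=0.35, sigma=0.8, size=60)   # stand-in historical treatment arm

def bootstrap_power(n_per_arm, n_boot=2000, alpha=0.05):
    hits = 0
    for _ in range(n_boot):
        a = rng.choice(pilot_a, size=n_per_arm, replace=True)
        b = rng.choice(pilot_b, size=n_per_arm, replace=True)
        if mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha:
            hits += 1
    return hits / n_boot

for n in (30, 50, 80, 120):
    print(n, round(bootstrap_power(n), 3))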

  15. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)

  16. Fast Ion Transport Studies in DIII-D High βN Steady-State Scenarios

    NASA Astrophysics Data System (ADS)

    Holcomb, C. T.

    2014-10-01

    DIII-D research is identifying paths to optimize energetic particle (EP) transport in high βN steady-state tokamak scenarios. Operation with qmin > 2 is predicted to achieve high βN, confinement, and bootstrap fraction. However DIII-D experiments have shown that Alfvén eigenmodes (AE) and correlated EP transport can limit the performance of some qmin > 2 plasmas. Enhanced EP transport occurs in plasmas with qmin = 2-2.5, q95 = 5-7, and relatively long slowing down time. Strong AEs are present, the confinement factor H89 = 1.6-1.8 and βN is limited to ~3 by the available power. These observations are consistent with EP transport models having a critical gradient in βf. However, adjusting the parameters can recover classical EP confinement or improve thermal confinement so that H89 > 2. One example is a scenario with βP and βN ~ 3.2, qmin > 3 and q95 ~ 11 developed to test control of long pulse, high heat flux operation on devices like EAST. This has an internal transport barrier at ρ ~ 0.7, bootstrap fraction >75%, density limit fraction ~1, and H89 >= 2. In these cases AE activity and EP transport is very dynamic - it varies between classical and anomalous from shot to shot and within shots. Thus these plasmas are close to a threshold for enhanced EP transport. This may be governed by a combination of a relatively low ∇βfast due to good thermal confinement and lower beam power, short slowing down time, and possibly changes to the q-profile. Another example is scenarios with qmin ~ 1.1. These typically have classical EP confinement and good thermal confinement. Thus by using its flexible parameters and profile control tools, DIII-D is comparing a wide range of steady-state scenarios to identify the key physics setting EP transport. Work supported by the US Department of Energy under DE-AC52-07NA27344, SC-G903402, DE-FC02-04ER54698, and DE-AC02-09CH11466.

  17. External heating and current drive source requirements towards steady-state operation in ITER

    NASA Astrophysics Data System (ADS)

    Poli, F. M.; Kessel, C. E.; Bonoli, P. T.; Batchelor, D. B.; Harvey, R. W.; Snyder, P. B.

    2014-07-01

    Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with internal transport barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of heating and current drive (H/CD) sources that sustain reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that a combination of electron cyclotron (EC) and lower hybrid (LH) waves is a promising route towards steady state operation in ITER. The LH forms and sustains expanded barriers, and the EC deposition at mid-radius freezes the bootstrap current profile, stabilizing the barrier and leading to confinement levels 50% higher than typical H-mode energy confinement times. Using LH spectra centred on a parallel refractive index of 1.75-1.85, the performance of these plasma scenarios is close to the ITER target of 9 MA non-inductive current, global confinement gain H98 = 1.6 and fusion gain Q = 5.

  18. Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.

    PubMed

    Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng

    2015-01-01

    Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
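
    For concreteness, the percentile-bootstrap test of the indirect effect that the article evaluates can be sketched as below: resample cases, refit the two regressions M ~ X and Y ~ X + M, and form a percentile interval for a·b. This is my own minimal construction (not the authors' supplemental R syntax), with a deliberately small n in the range the authors caution about.

```python
# Minimal sketch: percentile-bootstrap CI for the indirect effect a*b in a
# simple mediation model, using plain least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 40
X = rng.normal(size=n)
M = 0.4 * X + rng.normal(size=n)
Y = 0.4 * M + rng.normal(size=n)

def indirect(X, M, Y):
    # a: slope of X in M ~ 1 + X; b: slope of M in Y ~ 1 + X + M
    a = np.linalg.lstsq(np.column_stack([np.ones_like(X), X]), M, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(X), X, M]), Y, rcond=None)[0][2]
    return a * b

boots = []
for _ in range(5000):
    idx = rng.integers(0, n, n)                       # resample cases with replacement
    boots.append(indirect(X[idx], M[idx], Y[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print("indirect effect:", round(indirect(X, M, Y), 3), "95% CI:", (round(lo, 3), round(hi, 3)))
# A CI excluding zero is the usual criterion; the paper's simulations suggest this
# test can be underpowered and somewhat liberal at sample sizes in this range.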

  19. Active core profile and transport modification by application of ion Bernstein wave power in the Princeton Beta Experiment-Modification

    NASA Astrophysics Data System (ADS)

    LeBlanc, B.; Batha, S.; Bell, R.; Bernabei, S.; Blush, L.; de la Luna, E.; Doerner, R.; Dunlap, J.; England, A.; Garcia, I.; Ignat, D.; Isler, R.; Jones, S.; Kaita, R.; Kaye, S.; Kugel, H.; Levinton, F.; Luckhardt, S.; Mutoh, T.; Okabayashi, M.; Ono, M.; Paoletti, F.; Paul, S.; Petravich, G.; Post-Zwicker, A.; Sauthoff, N.; Schmitz, L.; Sesnic, S.; Takahashi, H.; Talvard, M.; Tighe, W.; Tynan, G.; von Goeler, S.; Woskov, P.; Zolfaghari, A.

    1995-03-01

    Application of Ion Bernstein Wave Heating (IBWH) into the Princeton Beta Experiment-Modification (PBX-M) [Phys. Fluids B 2, 1271 (1990)] tokamak stabilizes sawtooth oscillations and generates peaked density profiles. A transport barrier, spatially correlated with the IBWH power deposition profile, is observed in the core of IBWH-assisted neutral beam injection (NBI) discharges. A precursor to the fully developed barrier is seen in the soft x-ray data during edge localized mode (ELM) activity. Sustained IBWH operation is conducive to a regime where the barrier supports large ∇ne, ∇Te, ∇vφ, and ∇Ti, delimiting the confinement zone. This regime is reminiscent of the H(high) mode, but with a confinement zone moved inward. The core region has better than H-mode confinement while the peripheral region is L(low)-mode-like. The peaked profile enhances NBI core deposition and increases nuclear reactivity. An increase in central Ti results from χi reduction (compared to the H mode) and better beam penetration. Bootstrap current fractions of up to 0.32-0.35 locally and 0.28 overall were obtained when an additional NBI burst is applied to this plasma.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Jinping P.; Garofalo, Andrea M.; Gong, Xianzu Z.

    Recent EAST/DIII-D joint experiments on the high poloidal beta ($${{\\beta}_{\\text{P}}}$$) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H98y2 ~ 1.6) to higher plasma current, for lower q95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results. Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high $${{\\beta}_{\\text{P}}}$$ discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at $${{\\beta}_{\\text{N}}}$$ ~ 2.9 and q95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Furthermore, results reported in this paper suggest that the DIII-D high $${{\\beta}_{\\text{P}}}$$ scenario could be a candidate for ITER steady state operation.

  1. Computational Study of Anomalous Transport in High Beta DIII-D Discharges with ITBs

    NASA Astrophysics Data System (ADS)

    Pankin, Alexei; Garofalo, Andrea; Grierson, Brian; Kritz, Arnold; Rafiq, Tariq

    2015-11-01

    The advanced tokamak scenarios require a large bootstrap current fraction and high β. These large values are often outside the range that occurs in ``conventional'' tokamak discharges. The GLF23, TGLF, and MMM transport models have been previously validated for discharges with parameters associated with ``conventional'' tokamak discharges. It has been demonstrated that the TGLF model under-predicts anomalous transport in high β DIII-D discharges [A.M. Garofalo et al. 2015 TTF Workshop]. In this research, the validity of MMM7.1 model [T. Rafiq et al. Phys. Plasmas 20 032506 (2013)] is tested for high β DIII-D discharges with low and high torque. In addition, the sensitivity of the anomalous transport to β is examined. It is shown that the MMM7.1 model over-predicts the anomalous transport in the DIII-D discharge 154406. In particular, a significant level of anomalous transport is found just outside the internal transport barrier. Differences in the anomalous transport predicted using TGLF and MMM7.1 are reviewed. Mechanisms for quenching of anomalous transport in the ITB regions of high-beta discharges are investigated. This research is supported by US Department of Energy.

  2. Bootstrap confidence levels for phylogenetic trees.

    PubMed

    Efron, B; Halloran, E; Holmes, S

    1996-07-09

    Evolutionary trees are often estimated from DNA or RNA sequence data. How much confidence should we have in the estimated trees? In 1985, Felsenstein [Felsenstein, J. (1985) Evolution 39, 783-791] suggested the use of the bootstrap to answer this question. Felsenstein's method, which in concept is a straightforward application of the bootstrap, is widely used, but has been criticized as biased in the genetics literature. This paper concerns the use of the bootstrap in the tree problem. We show that Felsenstein's method is not biased, but that it can be corrected to better agree with standard ideas of confidence levels and hypothesis testing. These corrections can be made by using the more elaborate bootstrap method presented here, at the expense of considerably more computation.
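
    As a rough illustration of Felsenstein's procedure (resample alignment columns with replacement and re-estimate the tree each time), the sketch below computes bootstrap support for one clade using a toy 0/1 alignment and a simple distance-based clustering in place of a real tree-building program; the data and the ab_monophyletic helper are made up for illustration.

    ```python
    # Toy illustration of Felsenstein's bootstrap: resample alignment columns with
    # replacement, rebuild a (here, crude distance-based) tree, and count how often
    # a clade of interest reappears. Alignment and clustering are stand-ins only.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    taxa = ["A", "B", "C", "D"]
    aln = rng.integers(0, 2, size=(4, 200))          # rows = taxa, columns = 0/1 characters
    aln[1] = aln[0] ^ (rng.random(200) < 0.1)        # make taxon B similar to taxon A

    def ab_monophyletic(matrix):
        """True if A and B end up in their own cluster when cutting into two groups."""
        dist = pdist(matrix, metric="hamming")
        labels = fcluster(linkage(dist, "average"), 2, "maxclust")
        return labels[0] == labels[1] and labels[0] not in labels[2:]

    n_sites, n_reps = aln.shape[1], 500
    support = np.mean([
        ab_monophyletic(aln[:, rng.integers(0, n_sites, n_sites)]) for _ in range(n_reps)
    ])
    print(f"bootstrap support for clade ({taxa[0]},{taxa[1]}): {support:.2f}")
    ```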

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batchelor, D.B.; Carreras, B.A.; Hirshman, S.P.

    Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are 3 and 4 field period low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low-aspect-ratio, and low bootstrap current.

  4. Working Memory Deficits and Social Problems in Children with ADHD

    ERIC Educational Resources Information Center

    Kofler, Michael J.; Rapport, Mark D.; Bolden, Jennifer; Sarver, Dustin E.; Raiker, Joseph S.; Alderson, R. Matt

    2011-01-01

    Social problems are a prevalent feature of ADHD and reflect a major source of functional impairment for these children. The current study examined the impact of working memory deficits on parent- and teacher-reported social problems in a sample of children with ADHD and typically developing boys (N = 39). Bootstrapped, bias-corrected mediation…

  5. The Development of Spontaneous Sound-Shape Matching in Monolingual and Bilingual Infants during the First Year

    ERIC Educational Resources Information Center

    Pejovic, Jovana; Molnar, Monika

    2017-01-01

    Recently it has been proposed that sensitivity to nonarbitrary relationships between speech sounds and objects potentially bootstraps lexical acquisition. However, it is currently unclear whether preverbal infants (e.g., before 6 months of age) with different linguistic profiles are sensitive to such nonarbitrary relationships. Here, the authors…

  6. An algebraic approach to the analytic bootstrap

    DOE PAGES

    Alday, Luis F.; Zhiboedov, Alexander

    2017-04-27

    We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. Here, we analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.

  7. Coefficient Alpha Bootstrap Confidence Interval under Nonnormality

    ERIC Educational Resources Information Center

    Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew

    2012-01-01

    Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…

  8. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    PubMed

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

    The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adds exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructing the bin counts from the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
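
    A small sketch of the bootstrap-MLE idea described above, using an exponential model: the expected bin counts are computed from the MLE of a bootstrap sample, while the observed counts come from the original data. The model, bin choice, and degrees of freedom here are illustrative assumptions, not the authors' exact procedure.

    ```python
    # Sketch of the bootstrap-MLE Pearson statistic for an exponential model:
    # expected bin counts use the MLE from a bootstrap sample, observed counts
    # come from the original data. Bins and df are illustrative choices.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    x = rng.exponential(scale=2.0, size=200)              # observed data
    edges = np.quantile(x, np.linspace(0, 1, 9))           # 8 bins
    edges[0], edges[-1] = 0.0, np.inf
    obs, _ = np.histogram(x, bins=edges)

    xb = x[rng.integers(0, x.size, x.size)]                # bootstrap sample
    scale_boot = xb.mean()                                 # exponential MLE of the scale

    expected = x.size * np.diff(stats.expon(scale=scale_boot).cdf(edges))
    pearson = np.sum((obs - expected) ** 2 / expected)

    # The claim above is that the bootstrap-sample MLE restores the plain
    # chi-squared reference; here we use bins - 1 degrees of freedom.
    pval = stats.chi2.sf(pearson, df=len(obs) - 1)
    print(f"bootstrap Pearson statistic = {pearson:.2f}, p = {pval:.3f}")
    ```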

  9. The PIT-trap—A “model-free” bootstrap procedure for inference about regression models with discrete, multivariate responses

    PubMed Central

    Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071
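
    The mechanics of a PIT-residual resample can be sketched for a single Poisson regression as below (assuming numpy, scipy and statsmodels are available). This is only a toy, single-response illustration of resampling randomized PIT residuals and mapping them back through the fitted marginals; it is not the authors' multivariate implementation.

    ```python
    # Toy single-response sketch of a PIT-residual resample for Poisson regression
    # (assumes numpy, scipy and statsmodels are installed): compute randomized PIT
    # residuals, resample them across observations, and invert through the fitted
    # marginals to obtain a bootstrap response.
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 100
    x = rng.normal(size=n)
    y = rng.poisson(np.exp(0.3 + 0.5 * x))

    X = sm.add_constant(x)
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mu = fit.fittedvalues

    # Randomized PIT residuals: uniform on [F(y-1), F(y)] under the fitted model.
    u = stats.poisson.cdf(y - 1, mu) + rng.uniform(size=n) * stats.poisson.pmf(y, mu)

    # One resample: shuffle the (approximately pivotal) residuals, then invert.
    u_star = u[rng.integers(0, n, n)]
    y_star = stats.poisson.ppf(u_star, mu).astype(int)     # bootstrap response
    print(y_star[:10])
    ```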

  10. Bootstrap Estimates of Standard Errors in Generalizability Theory

    ERIC Educational Resources Information Center

    Tong, Ye; Brennan, Robert L.

    2007-01-01

    Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…

  11. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  12. Investigation of geomagnetic induced current at high latitude during the storm-time variation

    NASA Astrophysics Data System (ADS)

    Falayi, E. O.; Ogunmodimu, O.; Bolaji, O. S.; Ayanda, J. D.; Ojoniyi, O. S.

    2017-06-01

    During geomagnetic disturbances, geomagnetically induced currents (GICs) are driven by the geoelectric field in the conductive Earth. In this paper, we studied the variability of GICs, the time derivatives of the geomagnetic field (dB/dt), geomagnetic indices (the symmetric disturbance field in H, SYM-H, and the AU eastward- and AL westward-electrojet indices), and interplanetary parameters such as the solar wind speed (v) and the interplanetary magnetic field (Bz) during the geomagnetic storms of 31 March 2001, 21 October 2001, 6 November 2001, 29 October 2003, 31 October 2003 and 9 November 2004, all associated with high solar wind speed due to coronal mass ejections. A wavelet-spectrum approach was employed to analyze the GIC time series over time scales of one to twenty-four hours. Power was concentrated between 14-24 h on 31 March 2001, 17-24 h on 21 October 2001 and 1-7 h on 6 November 2001; two peaks were observed, between 5-8 h and 21-24 h, on 29 October 2003; and power was concentrated between 1-3 h on 31 October 2003 and 18-22 h on 9 November 2004. The bootstrap method was used to obtain regression correlations between the time derivative of the geomagnetic field (dB/dt) and the observed values of the geomagnetically induced current for the same six storms, which show clusters of correlation coefficients at around r = -0.567, -0.717, -0.477, -0.419, -0.210 and -0.488, respectively. We observed that high-energy wavelet coefficients correlated well with the bootstrap correlations, while low-energy wavelet coefficients gave low bootstrap correlations. The geomagnetic storms thus have a clear influence on the GIC and on the geomagnetic field derivatives (dB/dt), which might be ascribed to coronal mass ejections and the associated solar wind arising from particle acceleration processes in the solar atmosphere.
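
    The bootstrap correlation step can be illustrated with a short sketch: resample paired (dB/dt, GIC) observations with replacement and collect the correlation coefficient. The time series below are synthetic placeholders, not the storm data analysed in the paper.

    ```python
    # Sketch of a bootstrap correlation between dB/dt and GIC: resample paired
    # observations with replacement and collect the correlation coefficient.
    # The two series are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 24 * 60                                            # e.g. one day of 1-min samples
    dbdt = rng.normal(size=n)
    gic = -0.5 * dbdt + rng.normal(scale=0.8, size=n)      # GIC anti-correlated with dB/dt

    def boot_corr(x, y, n_boot=2000):
        idx = rng.integers(0, x.size, size=(n_boot, x.size))
        return np.array([np.corrcoef(x[i], y[i])[0, 1] for i in idx])

    r_boot = boot_corr(dbdt, gic)
    print(f"r = {np.corrcoef(dbdt, gic)[0, 1]:.3f}, "
          f"95% bootstrap CI = ({np.percentile(r_boot, 2.5):.3f}, {np.percentile(r_boot, 97.5):.3f})")
    ```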

  13. Transport in the plateau regime in a tokamak pedestal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seol, J.; Shaing, K. C.

    In a tokamak H-mode, a strong E × B flow shear is generated during the L-H transition. Turbulence in a pedestal is suppressed significantly by this E × B flow shear. In this case, neoclassical transport may become important. The neoclassical fluxes are calculated in the plateau regime with the parallel plasma flow using their kinetic definitions. In an axisymmetric tokamak, the neoclassical particle fluxes can be decomposed into the banana-plateau flux and the Pfirsch-Schlüter flux. The banana-plateau particle flux is driven by the parallel viscous force and the Pfirsch-Schlüter flux by the poloidal variation of the friction force. The combined quantity of the radial electric field and the parallel flow is determined by the flux surface averaged parallel momentum balance equation rather than requiring the ambipolarity of the total particle fluxes. In this process, the Pfirsch-Schlüter flux does not appear in the flux surface averaged parallel momentum equation. Only the banana-plateau flux is used to determine the parallel flow in the form of the flux surface averaged parallel viscosity. The heat flux, obtained using the solution of the parallel momentum balance equation, decreases exponentially in the presence of sonic M_p without any enhancement over that in the standard neoclassical theory. Here, M_p is a combination of the poloidal E × B flow and the parallel mass flow. The neoclassical bootstrap current in the plateau regime is presented. It indicates that the neoclassical bootstrap current also is related only to the banana-plateau fluxes. Finally, transport fluxes are calculated when M_p is large enough to make the parallel electron viscosity comparable with the parallel ion viscosity. It is found that the bootstrap current has a finite value regardless of the magnitude of M_p.

  14. From current-driven to neoclassically driven tearing modes.

    PubMed

    Reimerdes, H; Sauter, O; Goodman, T; Pochelon, A

    2002-03-11

    In the TCV tokamak, the m/n = 2/1 island is observed in low-density discharges with central electron-cyclotron current drive. The evolution of its width has two distinct growth phases, one of which can be linked to a "conventional" tearing mode driven unstable by the current profile and the other to a neoclassical tearing mode driven by a perturbation of the bootstrap current. The TCV results provide the first clear observation of such a destabilization mechanism and reconcile the theory of conventional and neoclassical tearing modes, which differ only in the dominant driving term.

  15. Three-dimensional magnetohydrodynamic equilibrium of quiescent H-modes in tokamak systems

    NASA Astrophysics Data System (ADS)

    Cooper, W. A.; Graves, J. P.; Duval, B. P.; Sauter, O.; Faustin, J. M.; Kleiner, A.; Lanthaler, S.; Patten, H.; Raghunathan, M.; Tran, T.-M.; Chapman, I. T.; Ham, C. J.

    2016-06-01

    Three-dimensional free boundary magnetohydrodynamic equilibria that recover saturated ideal kink/peeling structures are obtained numerically. Simulations that model the JET tokamak at fixed ⟨β⟩ = 1.7% with a large edge bootstrap current that flattens the q-profile near the plasma boundary demonstrate that a radial parallel current density ribbon with a dominant m/n = 5/1 Fourier component at I_t = 2.2 MA develops into a broadband spectrum when the toroidal current I_t is increased to 2.5 MA.

  16. Impact of Sampling Density on the Extent of HIV Clustering

    PubMed Central

    Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor

    2014-01-01

    Identifying and monitoring HIV clusters could be useful in tracking the leading edge of HIV transmission in epidemics. Currently, greater specificity in the definition of HIV clusters is needed to reduce confusion in the interpretation of HIV clustering results. We address sampling density as one of the key aspects of HIV cluster analysis. The proportion of viral sequences in clusters was estimated at sampling densities from 1.0% to 70%. A set of 1,248 HIV-1C env gp120 V1C5 sequences from a single community in Botswana was utilized in simulation studies. Matching numbers of HIV-1C V1C5 sequences from the LANL HIV Database were used as comparators. HIV clusters were identified by phylogenetic inference under bootstrapped maximum likelihood and pairwise distance cut-offs. Sampling density below 10% was associated with stochastic HIV clustering with broad confidence intervals. HIV clustering increased linearly at sampling density >10%, and was accompanied by narrowing confidence intervals. Patterns of HIV clustering were similar at bootstrap thresholds 0.7 to 1.0, but the extent of HIV clustering decreased with higher bootstrap thresholds. The origin of sampling (local concentrated vs. scattered global) had a substantial impact on HIV clustering at sampling densities ≥10%. Pairwise distances at 10% were estimated as a threshold for cluster analysis of HIV-1 V1C5 sequences. The node bootstrap support distribution provided additional evidence for 10% sampling density as the threshold for HIV cluster analysis. The detectability of HIV clusters is substantially affected by sampling density. A minimal genotyping density of 10% and sampling density of 50–70% are suggested for HIV-1 V1C5 cluster analysis. PMID:25275430

  17. Visceral sensitivity, anxiety, and smoking among treatment-seeking smokers.

    PubMed

    Zvolensky, Michael J; Bakhshaie, Jafar; Norton, Peter J; Smits, Jasper A J; Buckner, Julia D; Garey, Lorra; Manning, Kara

    2017-12-01

    It is widely recognized that smoking is related to abdominal pain and discomfort, as well as gastrointestinal disorders. Research has shown that visceral sensitivity, experiencing anxiety around gastrointestinal sensations, is associated with poorer gastrointestinal health and related health outcomes. Visceral sensitivity also increases anxiety symptoms and mediates the relation with other risk factors, including gastrointestinal distress. No work to date, however, has evaluated visceral sensitivity in the context of smoking despite the strong association between smoking and poor physical and mental health. The current study sought to examine visceral sensitivity as a unique predictor of cigarette dependence, threat-related smoking abstinence expectancies (somatic symptoms and harmful consequences), and perceived barriers for cessation via anxiety symptoms. Eighty-four treatment-seeking adult daily smokers (M age = 45.1 years [SD = 10.4]; 71.6% male) participated in this study. There was a statistically significant indirect effect of visceral sensitivity via general anxiety symptoms on cigarette dependence (b=0.02, SE=0.01, Bootstrapped 95% CI [0.006, 0.05]), smoking abstinence somatic expectancies (b=0.10, SE=0.03, Bootstrapped 95% CI [0.03, 0.19]), smoking abstinence harmful experiences (b=0.13, SE=0.05, Bootstrapped 95% CI [0.03, 0.25]), and barriers to cessation (b=0.05, SE=0.06, Bootstrapped 95% CI [0.01, 0.13]). Overall, the present study serves as an initial investigation into the nature of the associations between visceral sensitivity, anxiety symptoms, and clinically significant smoking processes among treatment-seeking smokers. Future work is needed to explore the extent to which anxiety accounts for relations between visceral sensitivity and other smoking processes (e.g., withdrawal, cessation outcome). Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Unbiased Estimates of Variance Components with Bootstrap Procedures

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2007-01-01

    This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…

  19. Explorations in Statistics: the Bootstrap

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…

  20. Bootstrapping Confidence Intervals for Robust Measures of Association.

    ERIC Educational Resources Information Center

    King, Jason E.

    A Monte Carlo simulation study was conducted to determine the bootstrap correction formula yielding the most accurate confidence intervals for robust measures of association. Confidence intervals were generated via the percentile, adjusted, BC, and BC(a) bootstrap procedures and applied to the Winsorized, percentage bend, and Pearson correlation…

  1. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
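
    A hedged sketch of the general idea (bootstrap a nonparametric regressor, build a percentile prediction interval, and flag observations falling outside it) is given below, using a k-nearest-neighbour smoother and synthetic data; it is not the paper's exact procedure or its asymptotic construction.

    ```python
    # Sketch of bootstrap prediction intervals around a k-nearest-neighbour
    # regression, used to flag an anomalous observation. Synthetic data; the
    # interval construction is a simple pairs-bootstrap-plus-noise illustration.
    import numpy as np

    rng = np.random.default_rng(5)
    n, k = 300, 15
    x = rng.uniform(-3, 3, n)
    y = np.sin(x) + rng.normal(scale=0.2, size=n)

    def knn_predict(x_train, y_train, x0):
        nearest = np.argsort(np.abs(x_train - x0))[:k]
        return y_train[nearest].mean()

    def prediction_interval(x0, n_boot=1000, alpha=0.05):
        preds = np.empty(n_boot)
        for b in range(n_boot):
            idx = rng.integers(0, n, n)                    # pairs bootstrap
            preds[b] = knn_predict(x[idx], y[idx], x0)
        resid_sd = np.std(y - np.array([knn_predict(x, y, xi) for xi in x]))
        noise = rng.normal(scale=resid_sd, size=n_boot)    # add back observation noise
        return np.percentile(preds + noise, [100 * alpha / 2, 100 * (1 - alpha / 2)])

    x0, y_obs = 1.0, 2.0
    lo, hi = prediction_interval(x0)
    is_anomaly = not (lo <= y_obs <= hi)
    print(f"PI at x = {x0}: ({lo:.2f}, {hi:.2f}); observation anomalous: {is_anomaly}")
    ```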

  2. High performance advanced tokamak regimes in DIII-D for next-step experiments

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.; Murakami, M.; Ferron, J. R.; Wade, M. R.; Luce, T. C.; Petty, C. C.; Menard, J. E.; Petrie, T. W.; Allen, S. L.; Burrell, K. H.; Casper, T. A.; DeBoo, J. C.; Doyle, E. J.; Garofalo, A. M.; Gorelov, I. A.; Groebner, R. J.; Hobirk, J.; Hyatt, A. W.; Jayakumar, R. J.; Kessel, C. E.; La Haye, R. J.; Jackson, G. L.; Lohr, J.; Makowski, M. A.; Pinsker, R. I.; Politzer, P. A.; Prater, R.; Strait, E. J.; Taylor, T. S.; West, W. P.; DIII-D Team

    2004-05-01

    Advanced Tokamak (AT) research in DIII-D [K. H. Burrell for the DIII-D Team, in Proceedings of the 19th Fusion Energy Conference, Lyon, France, 2002 (International Atomic Energy Agency, Vienna, 2002) published on CD-ROM] seeks to provide a scientific basis for steady-state high performance operation in future devices. These regimes require high toroidal beta to maximize fusion output and poloidal beta to maximize the self-driven bootstrap current. Achieving these conditions requires integrated, simultaneous control of the current and pressure profiles, and active magnetohydrodynamic stability control. The building blocks for AT operation are in hand. Resistive wall mode stabilization via plasma rotation and active feedback with nonaxisymmetric coils allows routine operation above the no-wall beta limit. Neoclassical tearing modes are stabilized by active feedback control of localized electron cyclotron current drive (ECCD). Plasma shaping and profile control provide further improvements. Under these conditions, bootstrap supplies most of the current. Steady-state operation requires replacing the remaining Ohmic current, mostly located near the half radius, with noninductive external sources. In DIII-D this current is provided by ECCD, and nearly stationary AT discharges have been sustained with little remaining Ohmic current. Fast wave current drive is being developed to control the central magnetic shear. Density control, with divertor cryopumps, of AT discharges with edge localized moding H-mode edges facilitates high current drive efficiency at reactor relevant collisionalities. A sophisticated plasma control system allows integrated control of these elements. Close coupling between modeling and experiment is key to understanding the separate elements, their complex nonlinear interactions, and their integration into self-consistent high performance scenarios. Progress on this development, and its implications for next-step devices, will be illustrated by results of recent experiment and simulation efforts.

  3. Survival analysis of heart failure patients: A case study.

    PubMed

    Ahmad, Tanvir; Munir, Assia; Bhatti, Sajjad Haider; Aftab, Muhammad; Raza, Muhammad Ali

    2017-01-01

    This study was focused on survival analysis of heart failure patients who were admitted to Institute of Cardiology and Allied hospital Faisalabad-Pakistan during April-December (2015). All the patients were aged 40 years or above, having left ventricular systolic dysfunction, belonging to NYHA class III and IV. Cox regression was used to model mortality considering age, ejection fraction, serum creatinine, serum sodium, anemia, platelets, creatinine phosphokinase, blood pressure, gender, diabetes and smoking status as potentially contributing for mortality. Kaplan Meier plot was used to study the general pattern of survival which showed high intensity of mortality in the initial days and then a gradual increase up to the end of study. Martingale residuals were used to assess functional form of variables. Results were validated computing calibration slope and discrimination ability of model via bootstrapping. For graphical prediction of survival probability, a nomogram was constructed. Age, renal dysfunction, blood pressure, ejection fraction and anemia were found as significant risk factors for mortality among heart failure patients.

  4. HEXT, a software supporting tree-based screens for hybrid taxa in multilocus data sets, and an evaluation of the homoplasy excess test.

    PubMed

    Schneider, Kevin; Koblmüller, Stephan; Sefc, Kristina M

    2015-11-11

    The homoplasy excess test (HET) is a tree-based screen for hybrid taxa in multilocus nuclear phylogenies. Homoplasy between a hybrid taxon and the clades containing the parental taxa reduces bootstrap support in the tree. The HET is based on the expectation that excluding the hybrid taxon from the data set increases the bootstrap support for the parental clades, whereas excluding non-hybrid taxa has little effect on statistical node support. To carry out a HET, bootstrap trees are calculated with taxon-jackknife data sets, that is, excluding one taxon (species, population) at a time. Excess increase in bootstrap support for certain nodes upon exclusion of a particular taxon indicates the hybrid (the excluded taxon) and its parents (the clades with increased support). We introduce a new software program, hext, which generates the taxon-jackknife data sets, runs the bootstrap tree calculations, and identifies excess bootstrap increases as outlier values in boxplot graphs. hext is written in the R language and accepts binary data (0/1; e.g. AFLP) as well as co-dominant SNP and genotype data. We demonstrate the usefulness of hext in large SNP data sets containing putative hybrids and their parents. For instance, using published data of the genus Vitis (~6,000 SNP loci), hext output supports V. × champinii as a hybrid between V. rupestris and V. mustangensis. With simulated SNP and AFLP data sets, excess increases in bootstrap support were not always connected with the hybrid taxon (false positives), whereas the expected bootstrap signal failed to appear on several occasions (false negatives). Potential causes for both types of spurious results are discussed. With both empirical and simulated data sets, the taxon-jackknife output generated by hext provided additional signatures of hybrid taxa, including changes in tree topology across trees, consistent effects of exclusions of the hybrid and the parent taxa, and moderate (rather than excessive) increases in bootstrap support. hext significantly facilitates the taxon-jackknife approach to hybrid taxon detection, even though the simple test for excess bootstrap increase may not reliably identify hybrid taxa in all applications.

  5. Physics basis for an advanced physics and advanced technology tokamak power plant configuration: ARIES-ACT1

    DOE PAGES

    Kessel, C. E.; Poli, F. M.; Ghantous, K.; ...

    2015-01-01

    Here, the advanced physics and advanced technology tokamak power plant ARIES-ACT1 has a major radius of 6.25 m at an aspect ratio of 4.0, toroidal field of 6.0 T, strong shaping with elongation of 2.2, and triangularity of 0.63. The broadest pressure cases reached wall-stabilized βN ~ 5.75, limited by n = 3 external kink mode requiring a conducting shell at b/a = 0.3, requiring plasma rotation, feedback, and/or kinetic stabilization. The medium pressure peaking case reaches βN = 5.28 with BT = 6.75, while the peaked pressure case reaches βN < 5.15. Fast particle magnetohydrodynamic stability shows that the alpha particles are unstable, but this leads to redistribution to larger minor radius rather than loss from the plasma. Edge and divertor plasma modeling shows that 75% of the power to the divertor can be radiated with an ITER-like divertor geometry, while >95% can be radiated in a stable detached mode with an orthogonal target and wide slot geometry. The bootstrap current fraction is 91% with a q95 of 4.5, requiring ~1.1 MA of external current drive. This current is supplied with 5 MW of ion cyclotron radio frequency/fast wave and 40 MW of lower hybrid current drive. Electron cyclotron is most effective for safety factor control over ρ ~ 0.2 to 0.6 with 20 MW. The pedestal density is ~0.9×10^20 /m^3, and the temperature is ~4.4 keV. The H98 factor is 1.65, n/nGr = 1.0, and the ratio of net power to threshold power is 2.8 to 3.0 in the flattop.

  6. Core transport properties in JT-60U and JET identity plasmas

    NASA Astrophysics Data System (ADS)

    Litaudon, X.; Sakamoto, Y.; de Vries, P. C.; Salmi, A.; Tala, T.; Angioni, C.; Benkadda, S.; Beurskens, M. N. A.; Bourdelle, C.; Brix, M.; Crombé, K.; Fujita, T.; Futatani, S.; Garbet, X.; Giroud, C.; Hawkes, N. C.; Hayashi, N.; Hoang, G. T.; Hogeweij, G. M. D.; Matsunaga, G.; Nakano, T.; Oyama, N.; Parail, V.; Shinohara, K.; Suzuki, T.; Takechi, M.; Takenaga, H.; Takizuka, T.; Urano, H.; Voitsekhovitch, I.; Yoshida, M.; ITPA Transport Group; JT-60 Team; EFDA contributors, JET

    2011-07-01

    The paper compares the transport properties of a set of dimensionless identity experiments performed between JET and JT-60U in the advanced tokamak regime with internal transport barrier, ITB. These International Tokamak Physics Activity, ITPA, joint experiments were carried out with the same plasma shape, toroidal magnetic field ripple and dimensionless profiles as close as possible during the ITB triggering phase in terms of safety factor, normalized Larmor radius, normalized collision frequency, thermal beta, ratio of ion to electron temperatures. Similarities in the ITB triggering mechanisms and sustainment were observed when a good match was achieved of the most relevant normalized profiles except the toroidal Mach number. Similar thermal ion transport levels in the two devices have been measured in either monotonic or non-monotonic q-profiles. In contrast, differences between JET and JT-60U were observed on the electron thermal and particle confinement in reversed magnetic shear configurations. It was found that the larger shear reversal in the very centre (inside normalized radius of 0.2) of JT-60U plasmas allowed the sustainment of stronger electron density ITBs compared with JET. As a consequence of peaked density profile, the core bootstrap current density is more than five times higher in JT-60U compared with JET. Thanks to the bootstrap effect and the slightly broader neutral beam deposition, reversed magnetic shear configurations are self-sustained in JT-60U scenarios. Analyses of similarities and differences between the two devices address key questions on the validity of the usual assumptions made in ITER steady scenario modelling, e.g. a flat density profile in the core with thermal transport barrier? Such assumptions have consequences on the prediction of fusion performance, bootstrap current and on the sustainment of the scenario.

  7. Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Fan, Xitao

    This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…

  8. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  9. Long multiplet bootstrap

    NASA Astrophysics Data System (ADS)

    Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker

    2017-10-01

    Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.

  10. Unconventional Expressions: Productive Syntax in the L2 Acquisition of Formulaic Language

    ERIC Educational Resources Information Center

    Bardovi-Harlig, Kathleen; Stringer, David

    2017-01-01

    This article presents a generative analysis of the acquisition of formulaic language as an alternative to current usage-based proposals. One influential view of the role of formulaic expressions in second language (L2) development is that they are a bootstrapping mechanism into the L2 grammar; an initial repertoire of constructions allows for…

  11. Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data

    USGS Publications Warehouse

    Bakun, W.H.; Gomez, Capera A.; Stucchi, M.

    2011-01-01

    Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
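
    The bootstrap summary used here (preferred value from the median of bootstrap estimates, uncertainty from the 16th-84th percentile bounds) can be sketched as follows. The magnitude_from_intensities() relation and the intensity data are made-up placeholders, not any of the three published techniques.

    ```python
    # Sketch of the bootstrap summary: resample the intensity observations,
    # recompute a magnitude for each replicate, and report the median with 68%
    # bounds. magnitude_from_intensities() is a made-up placeholder relation.
    import numpy as np

    rng = np.random.default_rng(6)
    intensities = rng.normal(loc=6.0, scale=1.0, size=60)      # site intensity values
    distances_km = rng.uniform(5, 80, size=60)

    def magnitude_from_intensities(I, d_km):
        # Hypothetical intensity-attenuation inversion, for illustration only.
        return np.mean(0.6 * I + 1.2 * np.log10(d_km) - 0.5)

    boot_m = np.array([
        magnitude_from_intensities(intensities[i], distances_km[i])
        for i in (rng.integers(0, 60, 60) for _ in range(3000))
    ])
    m_pref = np.median(boot_m)
    lo, hi = np.percentile(boot_m, [16, 84])                   # 68% bounds
    print(f"preferred M = {m_pref:.2f}  (68% range {lo:.2f}-{hi:.2f})")
    ```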

  12. Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap

    ERIC Educational Resources Information Center

    Calzada, Maria E.; Gardner, Holly

    2011-01-01

    The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…

  13. The Beginner's Guide to the Bootstrap Method of Resampling.

    ERIC Educational Resources Information Center

    Lane, Ginny G.

    The bootstrap method of resampling can be useful in estimating the replicability of study results. The bootstrap procedure creates a mock population from a given sample of data from which multiple samples are then drawn. The method extends the usefulness of the jackknife procedure as it allows for computation of a given statistic across a maximal…

  14. Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap

    ERIC Educational Resources Information Center

    Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao

    2016-01-01

    Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…

  15. A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research

    ERIC Educational Resources Information Center

    Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.

    2014-01-01

    Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…

  16. Forecasting drought risks for a water supply storage system using bootstrap position analysis

    USGS Publications Warehouse

    Tasker, Gary; Dunne, Paul

    1997-01-01

    Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
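
    The trace-generation step can be sketched as follows: fit a simple lag-1 model to the (log) monthly flows and generate many 12-month traces by resampling the model's own innovations rather than drawing normal errors. The flow record, threshold, and AR(1) form are illustrative assumptions, not the authors' operational model.

    ```python
    # Toy trace generator for bootstrap position analysis: a lag-1 model for the
    # log flows with innovations resampled from the fitted residuals instead of a
    # normal distribution. Record, threshold and model form are placeholders.
    import numpy as np

    rng = np.random.default_rng(7)
    flows = rng.lognormal(mean=3.0, sigma=0.4, size=240)       # placeholder monthly record

    x = np.log(flows)
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]                     # lag-1 autocorrelation
    resid = x[1:] - (x.mean() + phi * (x[:-1] - x.mean()))     # model innovations

    def generate_trace(x_current, months=12):
        trace = np.empty(months)
        xi = x_current
        for t in range(months):
            xi = x.mean() + phi * (xi - x.mean()) + rng.choice(resid)   # resampled innovation
            trace[t] = np.exp(xi)
        return trace

    traces = np.array([generate_trace(x[-1]) for _ in range(1000)])
    # e.g. chance that flow drops below a threshold in any of the next 12 months
    print(np.mean((traces < 15.0).any(axis=1)))
    ```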

  17. Bootstrap position analysis for forecasting low flow frequency

    USGS Publications Warehouse

    Tasker, Gary D.; Dunne, P.

    1997-01-01

    A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff to be used in a position analysis model for a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and accounting for parameter uncertainty is easily done. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules.

  18. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    PubMed

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has small sample size limitation. We used a pooled method in nonparametric bootstrap test that may overcome the problem related with small samples in hypothesis testing. The present study compared nonparametric bootstrap test with pooled resampling method corresponding to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining type I error probability for any conditions except for Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. Nonparametric bootstrap paired t-test also provided better performance than other alternatives. Nonparametric bootstrap test provided benefit over exact Kruskal-Wallis test. We suggest using nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating the one way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
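
    A minimal sketch of the pooled-resampling idea for two means is shown below: both bootstrap groups are drawn from the pooled sample (imposing the null), and the observed t statistic is compared with the resulting null distribution. The data and the choice of the Welch statistic are illustrative assumptions.

    ```python
    # Minimal pooled-resampling bootstrap test for two means: draw both bootstrap
    # groups from the pooled sample (the null) and compare the observed Welch t
    # statistic with the bootstrap null distribution. Data are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    g1 = rng.exponential(1.0, size=8)            # two small, non-normal samples
    g2 = rng.exponential(1.6, size=9)

    t_obs = stats.ttest_ind(g1, g2, equal_var=False).statistic
    pooled = np.concatenate([g1, g2])

    n_boot = 10000
    t_null = np.empty(n_boot)
    for b in range(n_boot):
        b1 = rng.choice(pooled, size=g1.size, replace=True)
        b2 = rng.choice(pooled, size=g2.size, replace=True)
        t_null[b] = stats.ttest_ind(b1, b2, equal_var=False).statistic

    p_value = np.mean(np.abs(t_null) >= np.abs(t_obs))       # two-sided
    print(f"t = {t_obs:.2f}, pooled-bootstrap p = {p_value:.3f}")
    ```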

  19. Comparison of Parametric and Nonparametric Bootstrap Methods for Estimating Random Error in Equipercentile Equating

    ERIC Educational Resources Information Center

    Cui, Zhongmin; Kolen, Michael J.

    2008-01-01

    This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…

  20. Non-inductive current drive and transport in high βN plasmas in JET

    NASA Astrophysics Data System (ADS)

    Voitsekhovitch, I.; Alper, B.; Brix, M.; Budny, R. V.; Buratti, P.; Challis, C. D.; Ferron, J.; Giroud, C.; Joffrin, E.; Laborde, L.; Luce, T. C.; McCune, D.; Menard, J.; Murakami, M.; Park, J. M.; JET-EFDA contributors

    2009-05-01

    A route to stationary MHD stable operation at high βN has been explored at the Joint European Torus (JET) by optimizing the current ramp-up, heating start time and the waveform of neutral beam injection (NBI) power. In these scenarios the current ramp-up has been accompanied by plasma pre-heat (or the NBI has been started before the current flat-top) and NBI power up to 22 MW has been applied during the current flat-top. In the discharges considered transient total βN ≈ 3.3 and stationary (during high power phase) βN ≈ 3 have been achieved by applying the feedback control of βN with the NBI power in configurations with monotonic or flat core safety factor profile and without an internal transport barrier (ITB). The transport and current drive in this scenario is analysed here by using the TRANSP and ASTRA codes. The interpretative analysis performed with TRANSP shows that 50-70% of current is driven non-inductively; half of this current is due to the bootstrap current which has a broad profile since an ITB was deliberately avoided. The GLF23 transport model predicts the temperature profiles within a ±22% discrepancy with the measurements over the explored parameter space. Predictive simulations with this model show that the E × B rotational shear plays an important role for thermal ion transport in this scenario, producing up to a 40% increase of the ion temperature. By applying transport and current drive models validated in self-consistent simulations of given reference scenarios in a wider parameter space, the requirements for fully non-inductive stationary operation at JET are estimated. It is shown that the strong stiffness of the temperature profiles predicted by the GLF23 model restricts the bootstrap current at larger heating power. In this situation full non-inductive operation without an ITB can be rather expensive strongly relying on the external non-inductive current drive sources.

  1. swot: Super W Of Theta

    NASA Astrophysics Data System (ADS)

    Coupon, Jean; Leauthaud, Alexie; Kilbinger, Martin; Medezinski, Elinor

    2017-07-01

    SWOT (Super W Of Theta) computes two-point statistics for very large data sets, based on “divide and conquer” algorithms: mainly, but not limited to, data storage in binary trees, approximation at large scale, parallelization (Open MPI), and bootstrap and jackknife resampling methods “on the fly”. It currently supports projected and 3D galaxy auto- and cross-correlations, galaxy-galaxy lensing, and weighted histograms.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Brooks; A.H. Reiman; G.H. Neilson

    High-beta, low-aspect-ratio (compact) stellarators are promising solutions to the problem of developing a magnetic plasma configuration for magnetic fusion power plants that can be sustained in steady-state without disrupting. These concepts combine features of stellarators and advanced tokamaks and have aspect ratios similar to those of tokamaks (2-4). They are based on computed plasma configurations that are shaped in three dimensions to provide desired stability and transport properties. Experiments are planned as part of a program to develop this concept. A beta = 4% quasi-axisymmetric plasma configuration has been evaluated for the National Compact Stellarator Experiment (NCSX). It has a substantial bootstrap current and is shaped to stabilize ballooning, external kink, vertical, and neoclassical tearing modes without feedback or close-fitting conductors. Quasi-omnigeneous plasma configurations stable to ballooning modes at beta = 4% have been evaluated for the Quasi-Omnigeneous Stellarator (QOS) experiment. These equilibria have relatively low bootstrap currents and are insensitive to changes in beta. Coil configurations have been calculated that reconstruct these plasma configurations, preserving their important physics properties. Theory- and experiment-based confinement analyses are used to evaluate the technical capabilities needed to reach target plasma conditions. The physics basis for these complementary experiments is described.

  3. Relationship Between Total and Bioaccessible Lead on Children's Blood Lead Levels in Urban Residential Philadelphia Soils.

    PubMed

    Bradham, Karen D; Nelson, Clay M; Kelly, Jack; Pomales, Ana; Scruton, Karen; Dignam, Tim; Misenheimer, John C; Li, Kevin; Obenour, Daniel R; Thomas, David J

    2017-09-05

    Relationships between total soil or bioaccessible lead (Pb), measured using an in vitro bioaccessibility assay, and children's blood lead levels (BLL) were investigated in an urban neighborhood in Philadelphia, PA, with a history of soil Pb contamination. Soil samples from 38 homes were analyzed to determine whether accounting for the bioaccessible Pb fraction improves statistical relationships with children's BLLs. Total soil Pb concentration ranged from 58 to 2821 mg/kg; the bioaccessible Pb concentration ranged from 47 to 2567 mg/kg. Children's BLLs ranged from 0.3 to 9.8 μg/dL. Hierarchical models were used to compare relationships between total or bioaccessible Pb in soil and children's BLLs. Total soil Pb concentration as the predictor accounted for 23% of the variability in child BLL; bioaccessible soil Pb concentration as the predictor accounted for 26% of BLL variability. A bootstrapping analysis confirmed a significant increase in R2 for the model using bioaccessible soil Pb concentration as the predictor, with 99.0% of bootstraps showing a positive increase. Estimated increases of 1.3 μg/dL and 1.5 μg/dL in BLL per 1000 mg/kg Pb in soil were observed for this study area using total and bioaccessible Pb concentrations, respectively. Children's age did not contribute significantly to the prediction of BLLs.
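
    The bootstrap model-comparison step can be sketched as follows: for each resample, fit both single-predictor regressions and record how often the bioaccessible-Pb model gives the larger R2. The soil and blood-lead values below are synthetic placeholders, not the study data.

    ```python
    # Sketch of the bootstrap model comparison: for each resample, fit both
    # single-predictor regressions and record which one explains more BLL
    # variance. Soil and blood-lead values are synthetic placeholders.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    n = 38
    total_pb = rng.uniform(58, 2821, n)
    bioacc_pb = 0.85 * total_pb + rng.normal(scale=100, size=n)
    bll = 1.0 + 0.0015 * bioacc_pb + rng.normal(scale=1.5, size=n)

    def r2(x, y):
        return stats.linregress(x, y).rvalue ** 2

    n_boot = 5000
    wins = 0
    for b in range(n_boot):
        i = rng.integers(0, n, n)
        wins += r2(bioacc_pb[i], bll[i]) > r2(total_pb[i], bll[i])

    print(f"bioaccessible-Pb model had the higher R2 in {100 * wins / n_boot:.1f}% of bootstraps")
    ```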

  4. Semantic Drift in Espresso-style Bootstrapping: Graph-theoretic Analysis and Evaluation in Word Sense Disambiguation

    NASA Astrophysics Data System (ADS)

    Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji

    Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.

  5. Risk model for estimating the 1-year risk of deferred lesion intervention following deferred revascularization after fractional flow reserve assessment.

    PubMed

    Depta, Jeremiah P; Patel, Jayendrakumar S; Novak, Eric; Gage, Brian F; Masrani, Shriti K; Raymer, David; Facey, Gabrielle; Patel, Yogesh; Zajarias, Alan; Lasala, John M; Amin, Amit P; Kurz, Howard I; Singh, Jasvindar; Bach, Richard G

    2015-02-21

    Although lesions for which revascularization is deferred following fractional flow reserve (FFR) assessment have a low risk of adverse cardiac events, variability in risk for deferred lesion intervention (DLI) has not been previously evaluated. The aim of this study was to develop a prediction model to estimate 1-year risk of DLI for coronary lesions where revascularization was not performed following FFR assessment. A prediction model for DLI was developed from a cohort of 721 patients with 882 coronary lesions where revascularization was deferred based on FFR between 10/2002 and 7/2010. Deferred lesion intervention was defined as any revascularization of a lesion previously deferred following FFR. The final DLI model was developed using stepwise Cox regression and validated using bootstrapping techniques. An algorithm was constructed to predict the 1-year risk of DLI. During a mean (±SD) follow-up period of 4.0 ± 2.3 years, 18% of lesions deferred after FFR underwent DLI; the 1-year incidence of DLI was 5.3%, while the predicted risk of DLI varied from 1 to 40%. The final Cox model included the FFR value, age, current or former smoking, history of coronary artery disease (CAD) or prior percutaneous coronary intervention, multi-vessel CAD, and serum creatinine. The c statistic for the DLI prediction model was 0.66 (95% confidence interval, CI: 0.61-0.70). Patients for whom revascularization is deferred based on FFR vary in their risk for DLI. A clinical prediction model consisting of five clinical variables and the FFR value can help predict the risk of DLI in the first year following FFR assessment.

  6. Confidence limit calculation for antidotal potency ratio derived from lethal dose 50

    PubMed Central

    Manage, Ananda; Petrikovics, Ilona

    2013-01-01

    AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method, invented by Efron, can easily be adapted to construct confidence intervals in situations like this. The bootstrap is a resampling method in which the bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require the sampling distribution of the antidotal potency ratio. This can be of substantial help to toxicologists, who are directed to employ the Dixon up-and-down method with a lower number of animals to determine lethal dose 50 values for characterizing the investigated toxic molecules and, eventually, the antidotal protection provided by the test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes the calculation easy to carry out in most programming software packages. PMID:25237618
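
    A hedged sketch of such a bootstrap confidence interval for a potency ratio (ratio of two LD50 estimates) is given below. The crude probit-style log_ld50 fit and the synthetic dose-response data are illustrative assumptions, not the authors' protocol.

    ```python
    # Sketch of a bootstrap CI for an antidotal potency ratio (ratio of LD50s).
    # The crude probit-style LD50 fit and the dose-response data are illustrative.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    doses = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    n_animals = 20

    def log_ld50(deaths):
        """Crude log10(LD50): probit(mortality) regressed on log10(dose)."""
        p = np.clip(deaths.mean(axis=1), 0.02, 0.98)
        fit = stats.linregress(np.log10(doses), stats.norm.ppf(p))
        return -fit.intercept / fit.slope

    # Synthetic outcomes (1 = death) for control and antidote-treated groups.
    control = (rng.random((5, n_animals)) < stats.norm.cdf(np.log10(doses)[:, None] - 0.4)).astype(int)
    treated = (rng.random((5, n_animals)) < stats.norm.cdf(np.log10(doses)[:, None] - 0.9)).astype(int)

    apr_obs = 10 ** (log_ld50(treated) - log_ld50(control))   # antidotal potency ratio

    boot_log_apr = np.empty(2000)
    for b in range(2000):
        cb = np.array([row[rng.integers(0, n_animals, n_animals)] for row in control])
        tb = np.array([row[rng.integers(0, n_animals, n_animals)] for row in treated])
        boot_log_apr[b] = log_ld50(tb) - log_ld50(cb)

    lo, hi = 10 ** np.percentile(boot_log_apr, [2.5, 97.5])
    print(f"APR = {apr_obs:.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
    ```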

  7. National Spherical Torus Experiment (NSTX) Facility/Diagnostic Overview

    NASA Astrophysics Data System (ADS)

    Ono, M.

    2005-10-01

    The capabilities of the NSTX experimental facility and diagnostics continue to improve. The new TF joints are performing well at 4.5 kG. New in-board shaping coils were installed to produce plasmas with the simultaneously high elongation ˜2.5 and high triangularity ˜0.8 needed for advanced operation. The EFC/RWM system, with six external coils driven by three switching power amplifiers (1 kHz, 6 kA-turn), is now fully operational. With these new tools, we significantly expanded the NSTX operating parameters, achieving the highest controlled elongation of 2.75, a shape factor q95Ip/aBT of 37 MA/m-T, a plasma volume of 14 m^3, a stored energy of 430 kJ, a normalized beta of 7.4 % MA/m-T, a bootstrap current fraction of 60 % at 700 kA, and the longest plasma pulse length of 1.5 s, or about 4 times the resistive skin time. In the area of plasma diagnostics, ten additional Thomson scattering channels are providing detailed measurements of the H-mode pedestal and internal barrier regions. The 8-channel MSE diagnostic is providing crucial j(r) measurements, including in reversed-shear plasmas with high electron confinement. A tangential microwave scattering system to measure electron-transport-relevant fluctuations is being commissioned.

  8. Topics in Statistical Calibration

    DTIC Science & Technology

    2014-03-27

    on a parametric bootstrap where, instead of sampling directly from the residuals, samples are drawn from a normal distribution. This procedure will...addition to centering them (Davison and Hinkley, 1997). When there are outliers in the residuals, the bootstrap distribution of x̂0 can become skewed or...based and inversion methods using the linear mixed-effects model. Then, a simple parametric bootstrap algorithm is proposed that can be used to either

  9. Long-Time Behavior and Critical Limit of Subcritical SQG Equations in Scale-Invariant Sobolev Spaces

    NASA Astrophysics Data System (ADS)

    Coti Zelati, Michele

    2018-02-01

    We consider the subcritical SQG equation in its natural scale-invariant Sobolev space and prove the existence of a global attractor of optimal regularity. The proof is based on a new energy estimate in Sobolev spaces to bootstrap the regularity to the optimal level, derived by means of nonlinear lower bounds on the fractional Laplacian. This estimate appears to be new in the literature and allows a sharp use of the subcritical nature of the L^∞ bounds for this problem. As a by-product, we obtain attractors for weak solutions as well. Moreover, we study the critical limit of the attractors and prove their stability and upper semicontinuity with respect to the strength of the diffusion.

  10. Variable selection under multiple imputation using the bootstrap in a prognostic study

    PubMed Central

    Heymans, Martijn W; van Buuren, Stef; Knol, Dirk L; van Mechelen, Willem; de Vet, Henrica CW

    2007-01-01

    Background Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection. Method In our prospective cohort study we merged data from three different randomized controlled trials (RCTs) to assess prognostic variables for chronicity of low back pain. Among the outcome and prognostic variables, data were missing in the range of 0 to 48.1%. We used four methods to investigate the influence of sampling and imputation variation, respectively: MI only, bootstrap only, and two methods that combine MI and bootstrapping. Variables were selected based on the inclusion frequency of each prognostic variable, i.e. the proportion of times that the variable appeared in the model. The discriminative and calibrative abilities of prognostic models developed by the four methods were assessed at different inclusion levels. Results We found that the effect of imputation variation on the inclusion frequency was larger than the effect of sampling variation. When MI and bootstrapping were combined over the range of 0% (full model) to 90% variable selection, bootstrap-corrected c-index values of 0.70 to 0.71 and slope values of 0.64 to 0.86 were found. Conclusion We recommend accounting for both imputation and sampling variation in sets of missing data. The new procedure of combining MI with bootstrapping for variable selection results in multivariable prognostic models with good performance and is therefore attractive to apply to data sets with missing values. PMID:17629912
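
    The combination of imputation and bootstrapping described above can be sketched roughly as follows. This is not the authors' procedure: a single mean imputation stands in for a proper multiple-imputation draw, L1-penalized logistic regression stands in for stepwise selection, and the data are simulated; the point is only to show how inclusion frequencies accumulate over bootstrap samples.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated cohort: 8 candidate predictors with ~15% missing values, binary outcome.
n, p = 200, 8
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
X[rng.random(size=(n, p)) < 0.15] = np.nan

n_boot = 200
inclusion = np.zeros(p)
for _ in range(n_boot):
    idx = rng.integers(0, n, size=n)                            # bootstrap rows (sampling variation)
    Xb = SimpleImputer(strategy="mean").fit_transform(X[idx])   # stand-in for one MI draw
    model = LogisticRegression(penalty="l1", solver="liblinear").fit(Xb, y[idx])
    inclusion += model.coef_.ravel() != 0                       # "selected" = nonzero L1 coefficient

print("inclusion frequency per candidate predictor:", inclusion / n_boot)
```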

  11. Assessing uncertainties in superficial water provision by different bootstrap-based techniques

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario

    2014-05-01

    An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimation, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different numbers of resamples or sample sizes (N = 500; ...; 1000), using the conventional bootstrap approach and variations of it, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the assessment of EFR methods will be propagated through water security indicators referring to water scarcity and vulnerability, seeking to provide meaningful support to end-users and water managers facing the incorporation of uncertainties in the decision-making process.
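
    A minimal sketch of one of the resampling variants mentioned above, the moving blocks bootstrap, applied to a synthetic daily streamflow series and the Q90 flow-duration statistic (the block length, series, and number of resamples are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(1)

def moving_block_bootstrap(series, block_len, rng):
    """Resample a time series by concatenating randomly chosen overlapping blocks."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([series[s:s + block_len] for s in starts])[:n]

# Synthetic 24-year daily streamflow record (m^3/s), log-normal purely for illustration.
flow = rng.lognormal(mean=1.0, sigma=0.6, size=24 * 365)

q90 = lambda x: np.percentile(x, 10)   # Q90: the flow exceeded 90% of the time
boot_q90 = np.array([q90(moving_block_bootstrap(flow, block_len=30, rng=rng))
                     for _ in range(1000)])

print("Q90 estimate:", round(q90(flow), 3))
print("95% bootstrap interval:", np.percentile(boot_q90, [2.5, 97.5]).round(3))
```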

  12. A Pre-Screening Questionnaire to Predict Non-24-Hour Sleep-Wake Rhythm Disorder (N24HSWD) among the Blind

    PubMed Central

    Flynn-Evans, Erin E.; Lockley, Steven W.

    2016-01-01

    Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the contribution of 40 variables in their ability to predict N24HSWD among 127 blind women, classified using urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey, but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%, and the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% with some degree of light perception, would be referred for further screening for N24HSWD. Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421

  13. Counting conformal correlators

    NASA Astrophysics Data System (ADS)

    Kravchuk, Petr; Simmons-Duffin, David

    2018-02-01

    We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.

  14. Projecting High Beta Steady-State Scenarios from DIII-D Advanced Tokamak Discharges

    NASA Astrophysics Data System (ADS)

    Park, J. M.

    2013-10-01

    Fusion power plant studies based on steady-state tokamak operation suggest that normalized beta in the range of 4-6 is needed for economic viability. DIII-D is exploring a range of candidate high beta scenarios guided by FASTRAN modeling in a repeated cycle of experiment and modeling validation. FASTRAN is a new iterative numerical procedure coupled to the Integrated Plasma Simulator (IPS) that integrates models of core transport, heating and current drive, equilibrium and stability self-consistently to find steady state (d / dt = 0) solutions, and reproduces most features of DIII-D high beta discharges with a stationary current profile. Separately, modeling components such as core transport (TGLF) and off-axis neutral beam current drive (NUBEAM) show reasonable agreement with experiment. Projecting forward to scenarios possible on DIII-D with future upgrades, two self-consistent noninductive scenarios at βN > 4 are found: high qmin and high internal inductance li. Both have bootstrap current fraction fBS > 0 . 5 and rely on the planned addition of a second off-axis neutral beamline and increased electron cyclotron heating. The high qmin > 2 scenario achieves stable operation at βN as high as 5 by a very broad current density profile to improve the ideal-wall stabilization of low-n instabilities along with confinement enhancement from low magnetic shear. The li near 1 scenario does not depend on ideal-wall stabilization. Improved confinement from strong magnetic shear makes up for the lower pedestal needed to maintain li high. The tradeoff between increasing li and reduced edge pedestal determines the achievable βN (near 4) and fBS (near 0.5). This modeling identifies the necessary upgrades to achieve target scenarios and clarifies the pros and cons of particular scenarios to better inform the development of steady-state fusion. Supported by the US Department of Energy under DE-AC05-00OR22725 & DE-FC02-04ER54698.

  15. Learning predictive models that use pattern discovery--a bootstrap evaluative approach applied in organ functioning sequences.

    PubMed

    Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen

    2010-08-01

    An important problem in the Intensive Care Unit (ICU) is how to predict, on a given day of stay, the eventual hospital mortality for a specific patient. A recent approach to this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods of prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care. The results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
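
    The .632 bootstrap error estimate referred to above can be sketched as follows; here a plain logistic regression on simulated features stands in for the FTS-based and traditional mortality models, so this is only an illustration of Efron's weighting of apparent and out-of-bag error, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)

def bootstrap_632(X, y, fit, n_boot=100):
    """Efron's .632 bootstrap error: 0.368 * apparent error + 0.632 * out-of-bag error."""
    n = len(y)
    apparent = 1 - accuracy_score(y, fit(X, y).predict(X))
    oob_errors = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        oob = np.setdiff1d(np.arange(n), idx)        # cases never drawn into this resample
        model = fit(X[idx], y[idx])
        oob_errors.append(1 - accuracy_score(y[oob], model.predict(X[oob])))
    return 0.368 * apparent + 0.632 * np.mean(oob_errors)

# Simulated per-day features and hospital mortality labels (not real ICU data).
X = rng.normal(size=(300, 5))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=1.5, size=300) > 0).astype(int)

fit = lambda X, y: LogisticRegression(max_iter=1000).fit(X, y)
print(".632 bootstrap error estimate:", round(bootstrap_632(X, y, fit), 3))
```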

  16. Topical ketoprofen nanogel: artificial neural network optimization, clustered bootstrap validation, and in vivo activity evaluation based on longitudinal dose response modeling.

    PubMed

    Elkomy, Mohammed H; Elmenshawe, Shahira F; Eid, Hussein M; Ali, Ahmed M A

    2016-11-01

    This work aimed at investigating the potential of solid lipid nanoparticles (SLN) as carriers for topical delivery of Ketoprofen (KP); evaluating a novel technique incorporating Artificial Neural Network (ANN) and clustered bootstrap for optimization of KP-loaded SLN (KP-SLN); and demonstrating a longitudinal dose response (LDR) modeling-based approach to compare the activity of topical non-steroidal anti-inflammatory drug formulations. KP-SLN was fabricated by a modified emulsion/solvent evaporation method. Box-Behnken design was implemented to study the influence of glycerylpalmitostearate-to-KP ratio, Tween 80, and lecithin concentrations on particle size, entrapment efficiency, and amount of drug permeated through rat skin in 24 hours. Following clustered bootstrap ANN optimization, the optimized KP-SLN was incorporated into an aqueous gel and evaluated for rheology, in vitro release, permeability, skin irritation and in vivo activity using carrageenan-induced rat paw edema model and LDR mathematical model to analyze the time course of anti-inflammatory effect at various application durations. Lipid-to-drug ratio of 7.85 [bootstrap 95%CI: 7.63-8.51], Tween 80 of 1.27% [bootstrap 95%CI: 0.601-2.40%], and Lecithin of 0.263% [bootstrap 95%CI: 0.263-0.328%] were predicted to produce optimal characteristics. Compared with profenid® gel, the optimized KP-SLN gel exhibited slower release, faster permeability, better texture properties, greater efficacy, and similar potency. SLNs are safe and effective permeation enhancers. ANN coupled with clustered bootstrap is a useful method for finding optimal solutions and estimating uncertainty associated with them. LDR models allow mechanistic understanding of comparative in vivo performances of different topical formulations, and help design efficient dermatological bioequivalence assessment methods.

  17. Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.

    PubMed

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael

    2016-03-11

    The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.

  18. Lightweight CoAP-Based Bootstrapping Service for the Internet of Things

    PubMed Central

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael

    2016-01-01

    The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length. PMID:26978362

  19. Using Cluster Bootstrapping to Analyze Nested Data With a Few Clusters.

    PubMed

    Huang, Francis L

    2018-04-01

    Cluster randomized trials involving participants nested within intact treatment and control groups are commonly performed in various educational, psychological, and biomedical studies. However, recruiting and retaining intact groups present various practical, financial, and logistical challenges to evaluators, and often cluster randomized trials are performed with a low number of clusters (~20 groups). Although multilevel models are often used to analyze nested data, researchers may be concerned about potentially biased results due to having only a few groups under study. Cluster bootstrapping has been suggested as an alternative procedure for analyzing clustered data, though it has seen very little use in educational and psychological studies. Using a Monte Carlo simulation that varied the number of clusters, average cluster size, and intraclass correlations, we compared standard errors using cluster bootstrapping with those derived using ordinary least squares regression and multilevel models. Results indicate that cluster bootstrapping, though more computationally demanding, can be used as an alternative procedure for the analysis of clustered data when treatment effects at the group level are of primary interest. Supplementary material showing how to perform cluster bootstrapped regressions using R is also provided.
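
    A rough sketch of the cluster bootstrap for a regression with a cluster-level treatment effect is shown below (synthetic data; plain NumPy rather than the R supplementary material mentioned in the abstract). Whole clusters, not individual observations, are resampled before each refit.

```python
import numpy as np

rng = np.random.default_rng(3)

def ols(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Simulated nested data: 20 intact clusters of 25, with a cluster-level treatment.
n_clusters, cluster_size = 20, 25
cluster_id = np.repeat(np.arange(n_clusters), cluster_size)
treat = np.repeat(rng.integers(0, 2, size=n_clusters), cluster_size)
u = np.repeat(rng.normal(scale=0.5, size=n_clusters), cluster_size)   # cluster effect
y = 1.0 + 0.3 * treat + u + rng.normal(size=treat.size)
X = np.column_stack([np.ones_like(y), treat])

beta_hat = ols(X, y)

# Cluster bootstrap: resample whole clusters with replacement, then refit.
boot = []
for _ in range(2000):
    chosen = rng.integers(0, n_clusters, size=n_clusters)
    rows = np.concatenate([np.flatnonzero(cluster_id == c) for c in chosen])
    boot.append(ols(X[rows], y[rows]))
boot = np.asarray(boot)

print("treatment effect estimate:", round(beta_hat[1], 3))
print("cluster-bootstrap SE:", round(boot[:, 1].std(ddof=1), 3))
```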

  20. Bootstrapping Least Squares Estimates in Biochemical Reaction Networks

    PubMed Central

    Linder, Daniel F.

    2015-01-01

    The paper proposes new computational methods for computing confidence bounds for the least squares estimates (LSEs) of rate constants in mass-action biochemical reaction network and stochastic epidemic models. Such LSEs are obtained by fitting the set of deterministic ordinary differential equations (ODEs), corresponding to the large-volume limit of a reaction network, to the network's partially observed trajectory, treated as a continuous-time, pure jump Markov process. In the large-volume limit the LSEs are asymptotically Gaussian, but their limiting covariance structure is complicated, since it is described by a set of nonlinear ODEs which are often ill-conditioned and numerically unstable. The current paper considers two bootstrap Monte-Carlo procedures, based on the diffusion and linear noise approximations for pure jump processes, which allow one to avoid solving the limiting covariance ODEs. The results are illustrated with both in-silico and real data examples from the LINE 1 gene retrotranscription model and compared with those obtained using other methods. PMID:25898769

  1. Percolation in education and application in the 21st century

    NASA Astrophysics Data System (ADS)

    Adler, Joan; Elfenbaum, Shaked; Sharir, Liran

    2017-03-01

    Percolation, "so simple you could teach it to your wife" (Chuck Newman, last century) is an ideal system to introduce young students to phase transitions. Two recent projects in the Computational Physics group at the Technion make this easy. One is a set of analog models to be mounted on our walls and enable visitors to switch between samples to see which mixtures of glass and metal objects have a percolating current. The second is a website enabling the creation of stereo samples of two and three dimensional clusters (suited for viewing with Oculus rift) on desktops, tablets and smartphones. Although there have been many physical applications for regular percolation in the past, for Bootstrap Percolation, where only sites with sufficient occupied neighbours remain active, there have not been a surfeit of condensed matter applications. We have found that the creation of diamond membranes for quantum computers can be modeled with a bootstrap process of graphitization in diamond, enabling prediction of optimal processing procedures.

  2. The synchronous neural interactions test as a functional neuromarker for post-traumatic stress disorder (PTSD): a robust classification method based on the bootstrap

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.

    2010-02-01

    Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.

  3. Service Mediation and Negotiation Bootstrapping as First Achievements Towards Self-adaptable Cloud Services

    NASA Astrophysics Data System (ADS)

    Brandic, Ivona; Music, Dejan; Dustdar, Schahram

    Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In the case of Cloud Computing, users pay for the usage of the computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses various research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services, considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.

  4. Phylogenetic relationships among arecoid palms (Arecaceae: Arecoideae)

    PubMed Central

    Baker, William J.; Norup, Maria V.; Clarkson, James J.; Couvreur, Thomas L. P.; Dowe, John L.; Lewis, Carl E.; Pintaud, Jean-Christophe; Savolainen, Vincent; Wilmot, Tomas; Chase, Mark W.

    2011-01-01

    Background and Aims The Arecoideae is the largest and most diverse of the five subfamilies of palms (Arecaceae/Palmae), containing >50 % of the species in the family. Despite its importance, phylogenetic relationships among Arecoideae are poorly understood. Here the most densely sampled phylogenetic analysis of Arecoideae available to date is presented. The results are used to test the current classification of the subfamily and to identify priority areas for future research. Methods DNA sequence data for the low-copy nuclear genes PRK and RPB2 were collected from 190 palm species, covering 103 (96 %) genera of Arecoideae. The data were analysed using the parsimony ratchet, maximum likelihood, and both likelihood and parsimony bootstrapping. Key Results and Conclusions Despite the recovery of paralogues and pseudogenes in a small number of taxa, PRK and RPB2 were both highly informative, producing well-resolved phylogenetic trees with many nodes well supported by bootstrap analyses. Simultaneous analyses of the combined data sets provided additional resolution and support. Two areas of incongruence between PRK and RPB2 were strongly supported by the bootstrap relating to the placement of tribes Chamaedoreeae, Iriarteeae and Reinhardtieae; the causes of this incongruence remain uncertain. The current classification within Arecoideae was strongly supported by the present data. Of the 14 tribes and 14 sub-tribes in the classification, only five sub-tribes from tribe Areceae (Basseliniinae, Linospadicinae, Oncospermatinae, Rhopalostylidinae and Verschaffeltiinae) failed to receive support. Three major higher level clades were strongly supported: (1) the RRC clade (Roystoneeae, Reinhardtieae and Cocoseae), (2) the POS clade (Podococceae, Oranieae and Sclerospermeae) and (3) the core arecoid clade (Areceae, Euterpeae, Geonomateae, Leopoldinieae, Manicarieae and Pelagodoxeae). However, new data sources are required to elucidate ambiguities that remain in phylogenetic relationships among and within the major groups of Arecoideae, as well as within the Areceae, the largest tribe in the palm family. PMID:21325340

  5. Comparison of parametric and bootstrap method in bioequivalence test.

    PubMed

    Ahn, Byung-Jin; Yim, Dong-Seok

    2009-10-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80~125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.

  6. Comparison of Parametric and Bootstrap Method in Bioequivalence Test

    PubMed Central

    Ahn, Byung-Jin

    2009-01-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80~125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption. PMID:19915699

  7. Prenatal Drug Exposure and Adolescent Cortisol Reactivity: Association with Behavioral Concerns.

    PubMed

    Buckingham-Howes, Stacy; Mazza, Dayna; Wang, Yan; Granger, Douglas A; Black, Maureen M

    2016-09-01

    To examine stress reactivity in a sample of adolescents with prenatal drug exposure (PDE) by examining the consequences of PDE on stress-related adrenocortical reactivity, behavioral problems, and drug experimentation during adolescence. Participants (76 PDE, 61 non-drug exposed [NE]; 99% African-American; 50% male; mean age = 14.17 yr, SD = 1.17) provided a urine sample, completed a drug use questionnaire, and provided saliva samples (later assayed for cortisol) before and after a mild laboratory stress task. Caregivers completed the Behavior Assessment System for Children, Second Edition (BASC II) and reported their relationship to the adolescent. The NE group was more likely to exhibit task-related cortisol reactivity compared to the PDE group. Overall behavior problems and drug experimentation were comparable across groups with no differences between PDE and NE groups. In unadjusted mediation analyses, cortisol reactivity mediated the association between PDE and BASC II aggression scores (95% bootstrap confidence interval [CI], 0.04-4.28), externalizing problems scores (95% bootstrap CI, 0.03-4.50), and drug experimentation (95% bootstrap CI, 0.001-0.54). The associations remain with the inclusion of gender as a covariate but not when age is included. Findings support and expand current research in cortisol reactivity and PDE by demonstrating that cortisol reactivity attenuates the association between PDE and behavioral problems (aggression) and drug experimentation. If replicated, PDE may have long-lasting effects on stress-sensitive physiological mechanisms associated with behavioral problems (aggression) and drug experimentation in adolescence.

  8. Steady state plasma operation in RF dominated regimes on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, X. J.; Zhao, Y. P.; Gong, X. Z.

    Significant progress has recently been made on EAST in the 2014 campaign, including an enhanced CW H&CD system with over 20 MW of heating power (LHCD, ICRH and NBI), more than 70 diagnostics, ITER-like W-monoblocks on the upper divertor, two inner cryo-pumps and RMP coils, enabling EAST to investigate long-pulse H-mode operation with dominant electron heating and low torque to address critical issues for ITER. H-mode plasmas were achieved by the new H&CD system or by 4.6 GHz LHCD alone for the first time. Long-pulse high-performance H mode has been obtained by LHCD alone for up to 28 s at H98 ~ 1.2, or by combining ICRH and LHCD; no or only small ELMs were found in RF plasmas, which is essential for steady-state operation in a future tokamak. Plasma operation in low-collisionality regimes was achieved with the new 4.6 GHz LHCD system, with core Te ~ 4.5 keV. Non-inductive scenarios with high performance at high bootstrap current fraction have been demonstrated in RF-dominated regimes for long-pulse operation. Near fully non-inductive current drive discharges have been achieved. In addition, effective heating and decoupling methods for the multi-transmitter ICRF system were developed in this campaign. From 2015, EAST will operate with over 30 MW of CW heating and current drive power (LHCD, ICRH, NBI and ECRH), enhanced diagnostic capabilities and a fully actively cooled metal wall. It will therefore allow access to new confinement regimes and their extension towards steady-state operation.

  9. Bootstrap investigation of the stability of a Cox regression model.

    PubMed

    Altman, D G; Andersen, P K

    1989-07-01

    We describe a bootstrap investigation of the stability of a Cox proportional hazards regression model resulting from the analysis of a clinical trial of azathioprine versus placebo in patients with primary biliary cirrhosis. We have considered stability to refer both to the choice of variables included in the model and, more importantly, to the predictive ability of the model. In stepwise Cox regression analyses of 100 bootstrap samples using 17 candidate variables, the most frequently selected variables were those selected in the original analysis, and no other important variable was identified. Thus there was no reason to doubt the model obtained in the original analysis. For each patient in the trial, bootstrap confidence intervals were constructed for the estimated probability of surviving two years. It is shown graphically that these intervals are markedly wider than those obtained from the original model.

  10. Bootstrap and Counter-Bootstrap approaches for formation of the cortege of Informative indicators by Results of Measurements

    NASA Astrophysics Data System (ADS)

    Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.

    2018-04-01

    This article addresses the problem of efficiently forming a cortege of informative measured features of an object under observation and/or control, using the authors' algorithms based on bootstrap and counter-bootstrap technologies to process the results of measurements of the object's various states on the basis of training samples of different sizes. The work considers aggregation of specific indicators of informative capacity by linear, majority, logical and "greedy" methods, applied both individually and in combination. The results of a computational experiment are discussed, and the conclusion is drawn that the proposed methods increase the efficiency of classifying the states of the object from the measurement results.

  11. How bootstrap can help in forecasting time series with more than one seasonal pattern

    NASA Astrophysics Data System (ADS)

    Cordeiro, Clara; Neves, M. Manuela

    2012-09-01

    The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently, this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters and exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap, carrying over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on several well-known data sets available in software packages.
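
    A simplified sketch of the residual-bootstrap idea behind this line of work is given below, using plain simple exponential smoothing and i.i.d. residual resampling on a synthetic monthly series; the actual Boot.EXPOS procedure and its double-seasonal extension are more elaborate, so treat this only as an illustration of how an empirical forecast distribution is built.

```python
import numpy as np

rng = np.random.default_rng(4)

def ses(series, alpha):
    """Simple exponential smoothing: one-step-ahead forecasts and their residuals."""
    fitted = np.empty_like(series)
    fitted[0] = series[0]
    for t in range(1, len(series)):
        fitted[t] = alpha * series[t - 1] + (1 - alpha) * fitted[t - 1]
    return fitted, series - fitted

# Synthetic monthly series with a single seasonal pattern plus noise.
t = np.arange(120)
y = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=t.size)

alpha = 0.3
fitted, resid = ses(y, alpha)
point_forecast = alpha * y[-1] + (1 - alpha) * fitted[-1]   # forecast for the next month

# Resample the one-step residuals to build an empirical forecast distribution.
boot_forecasts = point_forecast + rng.choice(resid[1:], size=5000, replace=True)
print("point forecast:", round(point_forecast, 2))
print("80% bootstrap interval:", np.percentile(boot_forecasts, [10, 90]).round(2))
```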

  12. How Many Subjects are Needed for a Visual Field Normative Database? A Comparison of Ground Truth and Bootstrapped Statistics.

    PubMed

    Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K

    2018-03-01

    The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine the mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different 'x' and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
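
    The set-size experiment described above can be mimicked on synthetic data as below; the Gaussian "ground truth" cohort and the chosen set sizes are assumptions for illustration, not the HFA data of the study.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical "ground truth" cohort: 500 sensitivities (dB) at one VF location.
truth = rng.normal(loc=30.0, scale=2.5, size=500)
true_limits = np.percentile(truth, [5, 95])

def bootstrapped_limits(x, set_size, n_resamples=1000):
    """Average 5th/95th percentile over bootstrap resamples of a given set size."""
    lims = [np.percentile(rng.choice(x, size=set_size, replace=True), [5, 95])
            for _ in range(n_resamples)]
    return np.mean(lims, axis=0)

print("ground truth limits:", true_limits.round(2))
for set_size in (30, 60, 150, 300):
    print(f"x = {set_size}:", bootstrapped_limits(truth, set_size).round(2))
```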

  13. A bootstrap estimation scheme for chemical compositional data with nondetects

    USGS Publications Warehouse

    Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.

    2014-01-01

    The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice for those data to contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originating from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
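
    A crude sketch of a bootstrap with an imputation step for nondetects is shown below; drawing nondetects uniformly on (0, DL) and targeting a univariate geometric mean are stand-ins for the model-based, log-ratio-consistent imputation and the compositional (multivariate) setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical concentrations with a detection limit; nondetects stored as np.nan.
dl = 0.5
conc = rng.lognormal(mean=0.2, sigma=0.8, size=150)
conc[conc < dl] = np.nan                      # censor values below the detection limit

def impute_nondetects(x, dl, rng):
    """Crude stochastic imputation: draw each nondetect uniformly on (0, DL)."""
    x = x.copy()
    nd = np.isnan(x)
    x[nd] = rng.uniform(0.0, dl, size=nd.sum())
    return x

# Bootstrap with an imputation step inside every resample; geometric mean as target.
boot = np.array([
    np.exp(np.log(impute_nondetects(rng.choice(conc, size=conc.size, replace=True),
                                    dl, rng)).mean())
    for _ in range(3000)
])

print("geometric mean of detects only:", round(np.exp(np.nanmean(np.log(conc))), 3))
print("95% bootstrap CI with imputed nondetects:", np.percentile(boot, [2.5, 97.5]).round(3))
```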

  14. Evaluating sufficient similarity for drinking-water disinfection by-product (DBP) mixtures with bootstrap hypothesis test procedures.

    PubMed

    Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn

    2009-01-01

    In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.

  15. Using the Bootstrap Method to Evaluate the Critical Range of Misfit for Polytomous Rasch Fit Statistics.

    PubMed

    Seol, Hyunsoo

    2016-06-01

    The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.

  16. Regional melt-pond fraction and albedo of thin Arctic first-year drift ice in late summer

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Granskog, M. A.; Hudson, S. R.; Pedersen, C. A.; Karlsen, T. I.; Divina, S. A.; Renner, A. H. H.; Gerland, S.

    2015-02-01

    The paper presents a case study of the regional (≈150 km) morphological and optical properties of a relatively thin, 70-90 cm modal thickness, first-year Arctic sea ice pack in an advanced stage of melt. The study combines in situ broadband albedo measurements representative of the four main surface types (bare ice, dark melt ponds, bright melt ponds and open water) and images acquired by a helicopter-borne camera system during ice-survey flights. The data were collected during the 8-day ICE12 drift experiment carried out by the Norwegian Polar Institute in the Arctic, north of Svalbard at 82.3° N, from 26 July to 3 August 2012. A set of > 10 000 classified images covering about 28 km2 revealed a homogeneous melt across the study area with melt-pond coverage of ≈ 0.29 and open-water fraction of ≈ 0.11. A decrease in pond fractions observed in the 30 km marginal ice zone (MIZ) occurred in parallel with an increase in open-water coverage. The moving block bootstrap technique applied to sequences of classified sea-ice images and albedo of the four surface types yielded a regional albedo estimate of 0.37 (0.35; 0.40) and regional sea-ice albedo of 0.44 (0.42; 0.46). Random sampling from the set of classified images allowed assessment of the aggregate scale of at least 0.7 km2 for the study area. For the current setup configuration it implies a minimum set of 300 images to process in order to gain adequate statistics on the state of the ice cover. Variance analysis also emphasized the importance of longer series of in situ albedo measurements conducted for each surface type when performing regional upscaling. The uncertainty in the mean estimates of surface type albedo from in situ measurements contributed up to 95% of the variance of the estimated regional albedo, with the remaining variance resulting from the spatial inhomogeneity of sea-ice cover.

  17. Formation and sustainment of internal transport barriers in the International Thermonuclear Experimental Reactor with the baseline heating mixa)

    NASA Astrophysics Data System (ADS)

    Poli, Francesca M.; Kessel, Charles E.

    2013-05-01

    Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved confinement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current profiles and negative magnetic shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. The time-dependent transport simulations indicate that, with a trade-off of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating configuration.

  18. The Novaya Zemlya Event of 31 December 1992 and Seismic Identification Issues: Annual Seismic Research Symposium (15th) Held in Vail, Colorado on 8-10 September 1993

    DTIC Science & Technology

    1993-09-10

    1993). A bootstrap generalized likelihood ratio test in discriminant analysis, Proc. 15th Annual Seismic Research Symposium, in press. I Hedlin, M., J... ratio indicate that the event does not belong to the first class. The bootstrap technique is used here as well to set the critical value of the test ...Methodist University. Baek, J., H. L. Gray, W. A. Woodward and M.D. Fisk (1993). A Bootstrap Generalized Likelihood Ratio Test in Discriminant

  19. Combining Nordtest method and bootstrap resampling for measurement uncertainty estimation of hematology analytes in a medical laboratory.

    PubMed

    Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong

    2017-12-01

    Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on the mathematical statistics method using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U [k=2]). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
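
    The Nordtest-plus-bootstrap idea can be sketched as below, where a hypothetical IQC series supplies the within-laboratory reproducibility component u(Rw), a fixed relative u(bias) is assumed in place of a real EQA evaluation, and the expanded uncertainty U = 2·sqrt(u(Rw)² + u(bias)²) is bootstrapped. None of the numbers correspond to the study's data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical IQC results (e.g. WBC count, 10^9/L) and an assumed relative u(bias)
# taken as fixed here in place of a real EQA evaluation.
iqc = rng.normal(loc=6.0, scale=0.12, size=180)      # ~6 months of daily IQC results
u_bias = 0.011                                       # relative standard uncertainty of bias

def relative_u_rw(sample):
    """Within-laboratory reproducibility as a relative standard uncertainty."""
    return sample.std(ddof=1) / sample.mean()

# Bootstrap the IQC series to obtain a sampling distribution for U = 2*sqrt(u_Rw^2 + u_bias^2).
boot_U = np.array([
    2 * np.sqrt(relative_u_rw(rng.choice(iqc, size=iqc.size, replace=True)) ** 2 + u_bias ** 2)
    for _ in range(5000)
])

print("expanded uncertainty U (k=2), percent:", round(100 * np.median(boot_U), 2))
print("95% bootstrap interval, percent:", np.percentile(100 * boot_U, [2.5, 97.5]).round(2))
```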

  20. Comparison between numerical and analytical results on the required rf current for stabilizing neoclassical tearing modes

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin

    2018-04-01

    Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.

  1. Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    1995-01-01

    Use of the bootstrap method in a canonical correlation analysis to evaluate the replicability of a study's results is illustrated. More confidence may be vested in research results that replicate. (SLD)

  2. The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements

    NASA Technical Reports Server (NTRS)

    Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry

    2013-01-01

    The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.

  3. Towards a bootstrap approach to higher orders of epsilon expansion

    NASA Astrophysics Data System (ADS)

    Dey, Parijat; Kaviraj, Apratim

    2018-02-01

    We employ a hybrid approach in determining the anomalous dimension and OPE coefficient of higher spin operators in the Wilson-Fisher theory. First we do a large spin analysis for CFT data where we use results obtained from the usual and the Mellin bootstrap and also from the Feynman diagram literature. This gives new predictions at O(ε^4) and O(ε^5) for anomalous dimensions and OPE coefficients, and also provides a cross-check for the results from the Mellin bootstrap. These higher orders get contributions from all higher spin operators in the crossed channel. We also use the bootstrap in Mellin space method for φ^3 theory in d = 6 - ε, where we calculate general higher spin OPE data. We demonstrate a higher loop order calculation in this approach by summing over contributions from higher spin operators of the crossed channel in the same spirit as before.

  4. Point Set Denoising Using Bootstrap-Based Radial Basis Function.

    PubMed

    Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad

    2016-01-01

    This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
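
    A minimal sketch of the underlying idea follows: choose a thin-plate-spline smoothing parameter by bootstrap test-error estimation, then project the noisy points onto the smoothed surface. It uses SciPy's RBFInterpolator on a synthetic 2.5D height field; the point set, the candidate smoothing values, and the omission of the paper's k-nearest-neighbour patching are simplifying assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(7)

    # Synthetic noisy 2.5D point set standing in for a scanned surface.
    pts = rng.uniform(-1.0, 1.0, size=(300, 2))
    z = np.sin(3 * pts[:, 0]) * np.cos(3 * pts[:, 1]) + rng.normal(0.0, 0.1, 300)

    def bootstrap_test_error(smoothing, n_boot=20):
        """Mean out-of-bag squared error of thin-plate spline fits on bootstrap samples."""
        errs = []
        for _ in range(n_boot):
            idx = np.unique(rng.integers(0, len(z), len(z)))   # bootstrap sample (duplicates dropped)
            oob = np.setdiff1d(np.arange(len(z)), idx)          # out-of-bag test points
            rbf = RBFInterpolator(pts[idx], z[idx],
                                  kernel="thin_plate_spline", smoothing=smoothing)
            errs.append(np.mean((rbf(pts[oob]) - z[oob]) ** 2))
        return float(np.mean(errs))

    candidates = [0.01, 0.1, 1.0, 10.0]                         # illustrative smoothing values
    best = min(candidates, key=bootstrap_test_error)

    surface = RBFInterpolator(pts, z, kernel="thin_plate_spline", smoothing=best)
    z_denoised = surface(pts)                                   # project points onto the smoothed surface
    print("chosen smoothing:", best)
    ```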

  5. Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.

    PubMed

    Falk, Carl F; Biesanz, Jeremy C

    2011-11-30

    Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1680 3.06GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
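
    For a concrete picture of the simplest interval in that comparison, the sketch below computes a percentile (PC) bootstrap confidence interval for an indirect effect a*b using ordinary regressions on simulated observed variables; the latent-variable models, sample sizes, and OpenMx machinery of the study are deliberately not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated mediation data X -> M -> Y with observed variables only
    # (an illustration of the percentile bootstrap, not the study's design).
    n = 200
    x = rng.normal(size=n)
    m = 0.39 * x + rng.normal(size=n)
    y = 0.39 * m + rng.normal(size=n)

    def indirect_effect(x, m, y):
        a = np.polyfit(x, m, 1)[0]                         # slope of M on X
        design = np.column_stack([np.ones_like(x), m, x])  # Y on M, controlling for X
        b = np.linalg.lstsq(design, y, rcond=None)[0][1]
        return a * b

    boot = np.empty(2000)
    for i in range(boot.size):
        idx = rng.integers(0, n, n)                        # resample cases with replacement
        boot[i] = indirect_effect(x[idx], m[idx], y[idx])

    lo, hi = np.percentile(boot, [2.5, 97.5])              # percentile (PC) bootstrap CI
    print(f"ab = {indirect_effect(x, m, y):.3f}, 95% PC CI = [{lo:.3f}, {hi:.3f}]")
    ```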

  6. Combining test statistics and models in bootstrapped model rejection: it is a balancing act

    PubMed Central

    2014-01-01

    Background Model rejections lie at the heart of systems biology, since they provide conclusive statements: that the corresponding mechanistic assumptions do not serve as valid explanations for the experimental data. Rejections are usually done using e.g. the chi-square test (χ²) or the Durbin-Watson test (DW). Analytical formulas for the corresponding distributions rely on assumptions that typically are not fulfilled. This problem is partly alleviated by the usage of bootstrapping, a computationally heavy approach to calculate an empirical distribution. Bootstrapping also allows for a natural extension to estimation of joint distributions, but this feature has so far been little exploited. Results We herein show that simplistic combinations of bootstrapped tests, like the max or min of the individual p-values, give inconsistent, i.e. overly conservative or liberal, results. A new two-dimensional (2D) approach based on parametric bootstrapping, on the other hand, is found both consistent and with a higher power than the individual tests, when tested on static and dynamic examples where the truth is known. In the same examples, the most superior test is a 2D χ² vs χ², where the second χ²-value comes from an additional help model, and its ability to describe bootstraps from the tested model. This superiority is lost if the help model is too simple, or too flexible. If a useful help model is found, the most powerful approach is the bootstrapped log-likelihood ratio (LHR). We show that this is because the LHR is one-dimensional, because the second dimension comes at a cost, and because LHR has retained most of the crucial information in the 2D distribution. These approaches statistically resolve a previously published rejection example for the first time. Conclusions We have shown how to, and how not to, combine tests in a bootstrap setting, when the combination is advantageous, and when it is advantageous to include a second model. These results also provide a deeper insight into the original motivation for formulating the LHR, for the more general setting of nonlinear and non-nested models. These insights are valuable in cases when accuracy and power, rather than computational speed, are prioritized. PMID:24742065
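
    The basic machinery of a single bootstrapped test (an empirical χ² distribution used for model rejection) can be sketched as below; the toy line-versus-quadratic setup with known noise is an assumption for illustration, and the paper's 2D combinations and systems-biology models are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy example: can a straight line be rejected for data generated by a quadratic?
    t = np.linspace(0.0, 1.0, 20)
    sigma = 0.1                                        # measurement noise assumed known
    data = 1.0 + 0.5 * t + 0.8 * t**2 + rng.normal(0.0, sigma, t.size)

    def fit_and_chi2(y):
        coef = np.polyfit(t, y, 1)                     # fit the candidate (line) model
        resid = y - np.polyval(coef, t)
        return coef, np.sum((resid / sigma) ** 2)

    coef_hat, chi2_obs = fit_and_chi2(data)

    # Parametric bootstrap: simulate from the fitted model, refit, and collect chi2.
    chi2_boot = np.empty(1000)
    for i in range(chi2_boot.size):
        y_sim = np.polyval(coef_hat, t) + rng.normal(0.0, sigma, t.size)
        chi2_boot[i] = fit_and_chi2(y_sim)[1]

    p_value = np.mean(chi2_boot >= chi2_obs)           # empirical rejection p-value
    print(f"chi2 = {chi2_obs:.1f}, bootstrap p = {p_value:.3f}")
    ```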

  7. Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors

    NASA Astrophysics Data System (ADS)

    Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.

    2012-12-01

    Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem to the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem due to its theoretical near-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving-block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
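
    The pairwise moving-block idea can be sketched in a few lines: blocks of (x, y) pairs are resampled together so that serial dependence within each series is preserved. The simulated series, the block length, and the plain percentile interval below are illustrative assumptions rather than the calibrated choices discussed in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Two autocorrelated, positively related series (synthetic stand-ins for proxy records).
    n, block_len = 300, 10
    noise = np.cumsum(rng.normal(size=(n, 2)), axis=0) * 0.05
    x = np.linspace(0.0, 1.0, n) + noise[:, 0]
    y = 0.7 * np.linspace(0.0, 1.0, n) + noise[:, 1]

    def block_bootstrap_corr(x, y, block_len, n_boot=2000):
        """Percentile interval for Pearson's r from a pairwise moving-block bootstrap."""
        n = x.size
        starts_all = np.arange(n - block_len + 1)
        n_blocks = int(np.ceil(n / block_len))
        corrs = np.empty(n_boot)
        for b in range(n_boot):
            starts = rng.choice(starts_all, n_blocks)                 # draw blocks of pairs
            idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
            corrs[b] = np.corrcoef(x[idx], y[idx])[0, 1]
        return np.percentile(corrs, [2.5, 97.5])

    print("r =", round(float(np.corrcoef(x, y)[0, 1]), 3),
          "95% CI =", block_bootstrap_corr(x, y, block_len).round(3))
    ```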

  8. Numerical Analysis of the Effects of Normalized Plasma Pressure on RMP ELM Suppression in DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orlov, D. M.; Moyer, R.A.; Evans, T. E.

    2010-01-01

    The effect of normalized plasma pressure as characterized by normalized pressure parameter (beta(N)) on the suppression of edge localized modes (ELMs) using resonant magnetic perturbations (RMPs) is studied in low-collisionality (nu* <= 0.2) H-mode plasmas with low-triangularity ( = 0.25) and ITER similar shapes ( = 0.51). Experimental results have suggested that ELM suppression by RMPs requires a minimum threshold in plasma pressure as characterized by beta(N). The variations in the vacuum field topology with beta(N) due to safety factor profile and island overlap changes caused by variation of the Shafranov shift and pedestal bootstrap current are examined numerically with the field line integration code TRIP3D. The results show very small differences in the vacuum field structure in terms of the Chirikov (magnetic island overlap) parameter, Poincaré sections and field line loss fractions. These differences do not appear to explain the observed threshold in beta(N) for ELM suppression. Linear peeling-ballooning stability analysis with the ELITE code suggests that the ELMs which persist during the RMPs when beta(N) is below the observed threshold are not type I ELMs, because the pedestal conditions are deep within the stable regime for peeling-ballooning modes. These ELMs have similarities to type III ELMs or low density ELMs.

  9. A Critical Meta-Analysis of Lens Model Studies in Human Judgment and Decision-Making

    PubMed Central

    Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W.

    2013-01-01

    Achieving accurate judgment (‘judgmental achievement’) is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regards to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or the level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regards to the success of bootstrapping with psychometrically-corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias. Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to a) an increase in values of the lens model components, b) reduced heterogeneity between studies, and c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for demonstrating the success of bootstrapping. PMID:24391781

  10. A critical meta-analysis of lens model studies in human judgment and decision-making.

    PubMed

    Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W

    2013-01-01

    Achieving accurate judgment ('judgmental achievement') is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regards to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or the level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regards to the success of bootstrapping with psychometrically-corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias. Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to a) an increase in values of the lens model components, b) reduced heterogeneity between studies, and c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for demonstrating the success of bootstrapping.

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
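
    The core resampling step, building a pseudo-predator signature by bootstrap sampling prey signatures and mixing them according to an assumed diet, can be sketched as below; the prey library, diet proportions, and bootstrap sample sizes are invented placeholders, not the objectively chosen sizes produced by the paper's algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Invented prey fatty acid signature library (each row sums to 1) and diet.
    n_fa = 8
    prey_library = {
        "prey_A": rng.dirichlet(np.ones(n_fa), size=40),
        "prey_B": rng.dirichlet(np.ones(n_fa), size=30),
    }
    diet = {"prey_A": 0.7, "prey_B": 0.3}
    boot_sizes = {"prey_A": 15, "prey_B": 15}            # bootstrap sample size per prey type

    def pseudo_predator_signature(prey_library, diet, boot_sizes):
        """Mix bootstrap-mean prey signatures according to the diet and renormalise."""
        sig = np.zeros(n_fa)
        for species, proportion in diet.items():
            sigs = prey_library[species]
            idx = rng.integers(0, sigs.shape[0], boot_sizes[species])   # sample with replacement
            sig += proportion * sigs[idx].mean(axis=0)
        return sig / sig.sum()

    print(pseudo_predator_signature(prey_library, diet, boot_sizes).round(3))
    ```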

  12. Overview of Recent DIII-D Experimental Results

    NASA Astrophysics Data System (ADS)

    Fenstermacher, Max

    2015-11-01

    Recent DIII-D experiments have added to the ITER physics basis and to physics understanding for extrapolation to future devices. ELMs were suppressed by RMPs in He plasmas consistent with ITER non-nuclear phase conditions, and in steady state hybrid plasmas. Characteristics of the EHO during both standard high-torque and low-torque enhanced-pedestal QH-mode with edge broadband fluctuations were measured, including edge localized density fluctuations with a microwave imaging reflectometer. The path to Super H-mode was verified at high beta with a QH-mode edge, and in plasmas with ELMs triggered by Li granules. ITER-acceptable TQ mitigation was obtained with low-Ne-fraction shattered pellet injection. Divertor ne and Te data from Thomson scattering confirm predicted drift-driven asymmetries in electron pressure, and X-divertor heat flux reduction and detachment were characterized. The crucial mechanisms for ExB shear control of turbulence were clarified. In collaboration with EAST, high beta-p scenarios were obtained with 80% bootstrap fraction, high H-factor and stability limits, and large radius ITBs leading to low AE activity. Work supported by the US Department of Energy under DE-FC02-04ER54698 and DE-AC52-07NA27344.

  13. Migration of the ATLAS Metadata Interface (AMI) to Web 2.0 and cloud

    NASA Astrophysics Data System (ADS)

    Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.

    2015-12-01

    The ATLAS Metadata Interface (AMI), a mature application with more than 10 years of existence, is currently being adapted to some recently available technologies. The web interfaces, which previously manipulated XML documents using XSL transformations, are being migrated to Asynchronous JavaScript (AJAX). Web development is considerably simplified by the introduction of a framework based on JQuery and Twitter Bootstrap. Finally, the AMI services are being migrated to an OpenStack cloud infrastructure.

  14. Finite Beta Boundary Magnetic Fields of NCSX

    NASA Astrophysics Data System (ADS)

    Grossman, A.; Kaiser, T.; Mioduszewski, P.

    2004-11-01

    The magnetic field between the plasma surface and wall of the National Compact Stellarator Experiment (NCSX), which uses quasi-symmetry to combine the best features of the tokamak and stellarator in a configuration of low aspect ratio, is mapped via field line tracing in a range of finite beta in which part of the rotational transform is generated by the bootstrap current. We adopt the methodology developed for W7-X, in which an equilibrium solution is computed by an inverse equilibrium solver based on an energy minimizing variational moments code, VMEC2000 [1], which solves directly for the shape of the flux surfaces given the external coils and their currents as well as a bootstrap current provided by a separate transport calculation. The VMEC solution and the Biot-Savart vacuum fields are coupled to the magnetic field solver for finite-beta equilibrium (MFBE2001) [2] code to determine the magnetic field on a 3D grid over a computational domain. It is found that the edge plasma is more stellarator-like, with a complex 3D structure, and less like the ordered 2D symmetric structure of a tokamak. The field lines make a transition from ergodically covering a surface to ergodically covering a volume as the distance from the last closed magnetic surface is increased. The results are compared with the PIES [3] calculations. [1] S.P. Hirshman et al. Comput. Phys. Commun. 43 (1986) 143. [2] E. Strumberger, et al. Nucl. Fusion 42 (2002) 827. [3] A.H. Reiman and H.S. Greenside, Comput. Phys. Commun. 43, 157 (1986).

  15. Bootstrap Methods: A Very Leisurely Look.

    ERIC Educational Resources Information Center

    Hinkle, Dennis E.; Winstead, Wayland H.

    The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…

  16. Bootstrapping Student Understanding of What Is Going on in Econometrics.

    ERIC Educational Resources Information Center

    Kennedy, Peter E.

    2001-01-01

    Explains that econometrics is an intellectual game played by rules based on the sampling distribution concept. Contains explanations for why many students are uncomfortable with econometrics. Encourages instructors to use explain-how-to-bootstrap exercises to promote student understanding. (RLH)

  17. The fidelity of Kepler eclipsing binary parameters inferred by the neural network

    NASA Astrophysics Data System (ADS)

    Holanda, N.; da Silva, J. R. P.

    2018-04-01

    This work aims to test the fidelity and efficiency of obtaining automatic orbital elements of eclipsing binary systems from light curves using neural network models. We selected a random sample of 78 systems from over 1400 detached eclipsing binaries obtained from the Kepler Eclipsing Binaries Catalog, processed using the neural network approach. The orbital parameters of the sample systems were measured applying the traditional method of light curve adjustment with uncertainties calculated by the bootstrap method, employing the JKTEBOP code. These estimated parameters were compared with those obtained by the neural network approach for the same systems. The results reveal a good agreement between techniques for the sum of the fractional radii and moderate agreement for e cos ω and e sin ω, but orbital inclination is clearly underestimated in neural network tests.

  18. The fidelity of Kepler eclipsing binary parameters inferred by the neural network

    NASA Astrophysics Data System (ADS)

    Holanda, N.; da Silva, J. R. P.

    2018-07-01

    This work aims to test the fidelity and efficiency of obtaining automatic orbital elements of eclipsing binary systems, from light curves using neural network models. We selected a random sample with 78 systems, from over 1400 detached eclipsing binaries obtained from the Kepler Eclipsing Binaries Catalog, processed using the neural network approach. The orbital parameters of the sample systems were measured applying the traditional method of light-curve adjustment with uncertainties calculated by the bootstrap method, employing the JKTEBOP code. These estimated parameters were compared with those obtained by the neural network approach for the same systems. The results reveal a good agreement between techniques for the sum of the fractional radii and moderate agreement for e cosω and e sinω, but orbital inclination is clearly underestimated in neural network tests.

  19. Using the bootstrap to establish statistical significance for relative validity comparisons among patient-reported outcome measures

    PubMed Central

    2013-01-01

    Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
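
    A stripped-down sketch of the procedure, bootstrapping the relative validity (a ratio of one-way ANOVA F-statistics) and reading off a 95% percentile interval, is given below on simulated data; the PRO measures, clinical groups, and replicate counts of the study itself are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(4)

    # Simulated patients in three clinically defined groups, scored by a reference
    # and a (less discriminating) comparator measure.
    groups = np.repeat([0, 1, 2], 150)
    reference = groups * 1.0 + rng.normal(0.0, 1.0, groups.size)
    comparator = groups * 0.7 + rng.normal(0.0, 1.0, groups.size)

    def relative_validity(ref, comp, g):
        f_ref = f_oneway(*[ref[g == k] for k in np.unique(g)]).statistic
        f_comp = f_oneway(*[comp[g == k] for k in np.unique(g)]).statistic
        return f_comp / f_ref

    boot = np.empty(500)                                 # bootstrap replicates of the RV
    for i in range(boot.size):
        idx = rng.integers(0, groups.size, groups.size)  # resample patients with replacement
        boot[i] = relative_validity(reference[idx], comparator[idx], groups[idx])

    ci = np.percentile(boot, [2.5, 97.5])
    print(f"RV = {relative_validity(reference, comparator, groups):.2f}, "
          f"95% bootstrap CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
    ```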

  20. Four Bootstrap Confidence Intervals for the Binomial-Error Model.

    ERIC Educational Resources Information Center

    Lin, Miao-Hsiang; Hsiung, Chao A.

    1992-01-01

    Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained, the theoretical foundation of each method, and each method's relevance and range for modeling true-score uncertainty are discussed. (SLD)

  1. Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.

    ERIC Educational Resources Information Center

    Habing, Brian

    2001-01-01

    Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)

  2. Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; Paez, Thomas L.

    2006-01-01

    This paper discusses the Bootstrap Method for deriving vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available, and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
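
    One common way such replicates are used, deriving a P95/50-type level (an estimate of the 95th-percentile environment at 50% confidence) from bootstrap replicates of measured levels, is sketched below; the simulated decibel values and the particular statistic being bootstrapped are assumptions for illustration, not the paper's flight data or exact procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Placeholder measured levels (dB) for one frequency band across several flights.
    measured_db = rng.normal(130.0, 3.0, size=12)

    n_boot = 5000
    replicates = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(measured_db, size=measured_db.size, replace=True)
        replicates[b] = np.percentile(resample, 95)      # statistic of interest per replicate

    p95_50 = np.percentile(replicates, 50)               # 50% confidence on the P95 estimate
    print(f"P95/50 test level ~ {p95_50:.1f} dB")
    ```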

  3. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    PubMed

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-03-01

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity is the probability (Ps) of obtaining a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  4. Closure of the operator product expansion in the non-unitary bootstrap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.

    We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.

  5. A revisit to contingency table and tests of independence: bootstrap is preferred to Chi-square approximations as well as Fisher's exact test.

    PubMed

    Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu

    2015-01-01

    To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
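
    A minimal sketch of such a bootstrap test of independence is shown below: tables are resampled under the fitted independence hypothesis and the observed Pearson statistic is compared with the resulting empirical distribution. The small example table and the handling of degenerate resamples are illustrative choices, not the authors' exact algorithm.

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(6)

    # Small-cell contingency table where asymptotic chi-square tests are questionable.
    observed = np.array([[3, 1, 4],
                         [2, 6, 1]])
    n = observed.sum()
    p_indep = np.outer(observed.sum(axis=1) / n, observed.sum(axis=0) / n).ravel()

    chi2_obs = chi2_contingency(observed, correction=False)[0]

    n_boot, exceed, used = 5000, 0, 0
    for _ in range(n_boot):
        table = rng.multinomial(n, p_indep).reshape(observed.shape)
        if (table.sum(axis=1) == 0).any() or (table.sum(axis=0) == 0).any():
            continue                                     # skip tables with an empty row/column
        used += 1
        if chi2_contingency(table, correction=False)[0] >= chi2_obs:
            exceed += 1

    print(f"bootstrap p-value ~ {exceed / used:.3f}")
    ```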

  6. Closure of the operator product expansion in the non-unitary bootstrap

    DOE PAGES

    Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.

    2016-11-07

    We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.

  7. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dean, Jamie A., E-mail: jamie.dean@icr.ac.uk; Wong, Kee H.; Gay, Hiram

    Purpose: Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue–sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. Methods and Materials: FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares–logistic regression [FPLS-LR] and functional principal component–logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate–response associations, assessed using bootstrapping. Results: The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/−0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/−0.96, 0.79/−0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. Conclusions: FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling.

  8. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy.

    PubMed

    Dean, Jamie A; Wong, Kee H; Gay, Hiram; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Oh, Jung Hun; Apte, Aditya; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Deasy, Joseph O; Nutting, Christopher M; Gulliford, Sarah L

    2016-11-15

    Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue-sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares-logistic regression [FPLS-LR] and functional principal component-logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate-response associations, assessed using bootstrapping. The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/-0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/-0.96, 0.79/-0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  9. Edge Currents and Stability in DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, D M; Fenstermacher, M E; Finkenthal, D K

    2004-12-01

    Understanding the stability physics of the H-mode pedestal in tokamak devices requires an accurate measurement of plasma current in the pedestal region with good spatial resolution. Theoretically, the high pressure gradients achieved in the edge of H-mode plasmas should lead to generation of a significant edge current density peak through bootstrap and Pfirsch-Schlüter effects. This edge current is important for the achievement of second stability in the context of coupled magneto hydrodynamic (MHD) modes which are both pressure (ballooning) and current (peeling) driven. Many aspects of edge localized mode (ELM) behavior can be accounted for in terms of an edge current density peak, with the identification of Type 1 ELMs as intermediate-n toroidal mode number MHD modes being a natural feature of this model. The development of an edge localized instabilities in tokamak experiments code (ELITE) based on this model allows one to efficiently calculate the stability and growth of the relevant modes for a broad range of plasma parameters and thus provides a framework for understanding the limits on pedestal height. This however requires an accurate assessment of the edge current. While estimates of j_edge can be made based on specific bootstrap models, their validity may be limited in the edge (gradient scale lengths comparable to orbit size, large changes in collisionality, etc.). Therefore it is highly desirable to have an actual measurement. Such measurements have been made on the DIII-D tokamak using combined polarimetry and spectroscopy of an injected lithium beam. By analyzing one of the Zeeman-split 2S-2P lithium resonance line components, one can obtain direct information on the local magnetic field components. These values allow one to infer details of the edge current density. Because of the negligible Stark mixing of the relevant atomic levels in lithium, this method of determining j(r) is insensitive to the large local electric fields typically found in enhanced confinement (H-mode) edges, and thus avoids an ambiguity common to MSE measurements of B_pol.

  10. Edge Currents and Stability in DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, D M; Fenstermacher, M E; Finkenthal, D K

    2005-05-05

    Understanding the stability physics of the H-mode pedestal in tokamak devices requires an accurate measurement of plasma current in the pedestal region with good spatial resolution. Theoretically, the high pressure gradients achieved in the edge of H-mode plasmas should lead to generation of a significant edge current density peak through bootstrap and Pfirsch-Schlüter effects. This edge current is important for the achievement of second stability in the context of coupled magneto hydrodynamic (MHD) modes which are both pressure (ballooning) and current (peeling) driven [1]. Many aspects of edge localized mode (ELM) behavior can be accounted for in terms of an edge current density peak, with the identification of Type 1 ELMs as intermediate-n toroidal mode number MHD modes being a natural feature of this model [2]. The development of an edge localized instabilities in tokamak experiments code (ELITE) based on this model allows one to efficiently calculate the stability and growth of the relevant modes for a broad range of plasma parameters [3,4] and thus provides a framework for understanding the limits on pedestal height. This however requires an accurate assessment of the edge current. While estimates of j_edge can be made based on specific bootstrap models, their validity may be limited in the edge (gradient scale lengths comparable to orbit size, large changes in collisionality, etc.). Therefore it is highly desirable to have an actual measurement. Such measurements have been made on the DIII-D tokamak using combined polarimetry and spectroscopy of an injected lithium beam [5,6]. By analyzing one of the Zeeman-split 2S-2P lithium resonance line components, one can obtain direct information on the local magnetic field components. These values allow one to infer details of the edge current density. Because of the negligible Stark mixing of the relevant atomic levels in lithium, this method of determining j(r) is insensitive to the large local electric fields typically found in enhanced confinement (H-mode) edges, and thus avoids an ambiguity common to MSE measurements of B_pol.

  11. Confidence Interval Coverage for Cohen's Effect Size Statistic

    ERIC Educational Resources Information Center

    Algina, James; Keselman, H. J.; Penfield, Randall D.

    2006-01-01

    Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and bias-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…

  12. A Bootstrap Procedure of Propensity Score Estimation

    ERIC Educational Resources Information Center

    Bai, Haiyan

    2013-01-01

    Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…

  13. Software Supportability Risk Assessment in OT&E (Operational Test and Evaluation): Literature Review, Current Research Review, and Data Base Assemblage.

    DTIC Science & Technology

    1984-09-28

    variables before simulation of model; search for reality checks; express uncertainty as a probability density distribution. ... probability that the software contains errors. This prior is updated as test failure data are accumulated. Only a p of 1 (software known to contain ...) discussed; both parametric and nonparametric versions are presented. It is shown by the author that the bootstrap underlies the jackknife method and

  14. Age related changes in fractional elimination pathways for drugs: assessing the impact of variable ontogeny on metabolic drug-drug interactions.

    PubMed

    Salem, Farzaneh; Johnson, Trevor N; Barter, Zoe E; Leeder, J Steven; Rostami-Hodjegan, Amin

    2013-08-01

    The magnitude of any metabolic drug-drug interactions (DDIs) depends on fractional importance of inhibited pathway which may not necessarily be the same in young children when compared to adults. The ontogeny pattern of cytochrome P450 (CYP) enzymes (CYPs 1A2, 2B6, 2C8, 2C9, 2C18/19, 2D6, 2E1, 3A4) and renal function were analyzed systematically. Bootstrap methodology was used to account for variability, and to define the age range over which statistical differences existed between each pair of specific pathways. A number of DDIs were simulated (Simcyp Pediatric v12) for virtual compounds to highlight effects of age on fractional elimination and consequent magnitude of DDI. For a theoretical drug metabolized 50% by each of CYP2D6 and CYP3A4 pathways at birth, co-administration of ketoconazole (3 mg/kg) resulted in a 1.65-fold difference between inhibited versus uninhibited AUC compared to 2.4-fold in 1 year olds and 3.2-fold in adults. Conversely, neonates could be more sensitive to DDI than adults in certain scenarios. Thus, extrapolation from adult data may not be applicable across all pediatric age groups. The use of pediatric physiologically based pharmacokinetic (p-PBPK) models may offer an interim solution to uncovering potential periods of vulnerability to DDI where there are no existing clinical data derived from children. © The Author(s) 2013.

  15. Dual energy X-ray absorptiometry spine scans to determine abdominal fat in post-menopausal women

    PubMed Central

    Bea, J. W.; Blew, R. M.; Going, S. B.; Hsu, C-H; Lee, M. C.; Lee, V. R.; Caan, B.J.; Kwan, M.L.; Lohman, T. G.

    2016-01-01

    Body composition may be a better predictor of chronic disease risk than body mass index (BMI) in older populations. Objectives We sought to validate spine fat fraction (%) from dual energy X-ray absorptiometry (DXA) spine scans as a proxy for total abdominal fat. Methods Total body DXA scan abdominal fat regions of interest (ROI) that have been previously validated by magnetic resonance imaging were assessed among healthy, postmenopausal women who also had antero-posterior spine scans (n=103). ROIs were 1) lumbar vertebrae L2-L4 and 2) L2-Iliac Crest (L2-IC), manually selected by two independent raters, and 3) trunk, auto-selected by DXA software. Intra-class correlation coefficients evaluated intra- and inter-rater reliability on a random subset (N=25). Linear regression models, validated by bootstrapping, assessed the relationship between spine fat fraction (%) and total abdominal fat (%) ROIs. Results Mean age, BMI and total body fat were 66.1 ± 4.8 y, 25.8 ± 3.8 kg/m2, and 40.0 ± 6.6%, respectively. There were no significant differences within or between raters. Linear regression models adjusted for several participant and scan characteristics were equivalent to using only BMI and spine fat fraction. The model predicted L2-L4 (Adj. R2: 0.83) and L2-IC (Adj. R2: 0.84) abdominal fat (%) well; the adjusted R2 for trunk fat (%) was 0.78. Model validation demonstrated minimal over-fitting (Adj. R2: 0.82, 0.83, and 0.77 for L2-L4, L2-IC, and trunk fat respectively). Conclusions The strong correlation between spine fat fraction and DXA abdominal fat measures makes it suitable for further development in post-menopausal chronic disease risk prediction models. PMID:27416964

  16. A generalized approach for producing, quantifying, and validating citizen science data from wildlife images.

    PubMed

    Swanson, Alexandra; Kosmala, Margaret; Lintott, Chris; Packer, Craig

    2016-06-01

    Citizen science has the potential to expand the scope and scale of research in ecology and conservation, but many professional researchers remain skeptical of data produced by nonexperts. We devised an approach for producing accurate, reliable data from untrained, nonexpert volunteers. On the citizen science website www.snapshotserengeti.org, more than 28,000 volunteers classified 1.51 million images taken in a large-scale camera-trap survey in Serengeti National Park, Tanzania. Each image was circulated to, on average, 27 volunteers, and their classifications were aggregated using a simple plurality algorithm. We validated the aggregated answers against a data set of 3829 images verified by experts and calculated 3 certainty metrics - level of agreement among classifications (evenness), fraction of classifications supporting the aggregated answer (fraction support), and fraction of classifiers who reported "nothing here" for an image that was ultimately classified as containing an animal (fraction blank) - to measure confidence that an aggregated answer was correct. Overall, aggregated volunteer answers agreed with the expert-verified data on 98% of images, but accuracy differed by species commonness such that rare species had higher rates of false positives and false negatives. Easily calculated analysis of variance and post-hoc Tukey tests indicated that the certainty metrics were significant indicators of whether each image was correctly classified or classifiable. Thus, the certainty metrics can be used to identify images for expert review. Bootstrapping analyses further indicated that 90% of images were correctly classified with just 5 volunteers per image. Species classifications based on the plurality vote of multiple citizen scientists can provide a reliable foundation for large-scale monitoring of African wildlife. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
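
    The aggregation step and the three certainty metrics can be sketched for a single image as below; the vote list is invented, and the Shannon-based evenness formula is an assumed concrete choice for the 'level of agreement' metric rather than the exact definition used on Snapshot Serengeti.

    ```python
    import math
    from collections import Counter

    # Invented classifications for one camera-trap image.
    votes = ["wildebeest", "wildebeest", "zebra", "wildebeest", "nothing here",
             "wildebeest", "wildebeest", "zebra"]

    species_counts = Counter(v for v in votes if v != "nothing here")
    answer, top = species_counts.most_common(1)[0]          # simple plurality answer
    total_species_votes = sum(species_counts.values())

    # Certainty metrics for the aggregated answer.
    props = [c / total_species_votes for c in species_counts.values()]
    evenness = (-sum(p * math.log(p) for p in props) / math.log(len(species_counts))
                if len(species_counts) > 1 else 0.0)        # 0 = full agreement, 1 = maximal spread
    fraction_support = top / total_species_votes
    fraction_blank = votes.count("nothing here") / len(votes)

    print(answer, round(evenness, 2), round(fraction_support, 2), round(fraction_blank, 2))
    ```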

  17. A generalized approach for producing, quantifying, and validating citizen science data from wildlife images

    PubMed Central

    Kosmala, Margaret; Lintott, Chris; Packer, Craig

    2016-01-01

    Abstract Citizen science has the potential to expand the scope and scale of research in ecology and conservation, but many professional researchers remain skeptical of data produced by nonexperts. We devised an approach for producing accurate, reliable data from untrained, nonexpert volunteers. On the citizen science website www.snapshotserengeti.org, more than 28,000 volunteers classified 1.51 million images taken in a large‐scale camera‐trap survey in Serengeti National Park, Tanzania. Each image was circulated to, on average, 27 volunteers, and their classifications were aggregated using a simple plurality algorithm. We validated the aggregated answers against a data set of 3829 images verified by experts and calculated 3 certainty metrics—level of agreement among classifications (evenness), fraction of classifications supporting the aggregated answer (fraction support), and fraction of classifiers who reported “nothing here” for an image that was ultimately classified as containing an animal (fraction blank)—to measure confidence that an aggregated answer was correct. Overall, aggregated volunteer answers agreed with the expert‐verified data on 98% of images, but accuracy differed by species commonness such that rare species had higher rates of false positives and false negatives. Easily calculated analysis of variance and post‐hoc Tukey tests indicated that the certainty metrics were significant indicators of whether each image was correctly classified or classifiable. Thus, the certainty metrics can be used to identify images for expert review. Bootstrapping analyses further indicated that 90% of images were correctly classified with just 5 volunteers per image. Species classifications based on the plurality vote of multiple citizen scientists can provide a reliable foundation for large‐scale monitoring of African wildlife. PMID:27111678

  18. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…

  19. Bootstrap Confidence Intervals for Ordinary Least Squares Factor Loadings and Correlations in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong

    2010-01-01

    This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…

  20. Bootstrap Estimation and Testing for Variance Equality.

    ERIC Educational Resources Information Center

    Olejnik, Stephen; Algina, James

    The purpose of this study was to develop a single procedure for comparing population variances which could be used for distribution forms. Bootstrap methodology was used to estimate the variability of the sample variance statistic when the population distribution was normal, platykurtic and leptokurtic. The data for the study were generated and…

  1. Bootstrapping the Syntactic Bootstrapper: Probabilistic Labeling of Prosodic Phrases

    ERIC Educational Resources Information Center

    Gutman, Ariel; Dautriche, Isabelle; Crabbé, Benoît; Christophe, Anne

    2015-01-01

    The "syntactic bootstrapping" hypothesis proposes that syntactic structure provides children with cues for learning the meaning of novel words. In this article, we address the question of how children might start acquiring some aspects of syntax before they possess a sizeable lexicon. The study presents two models of early syntax…

  2. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…

  3. Rapid and accurate taxonomic classification of insect (class Insecta) cytochrome c oxidase subunit 1 (COI) DNA barcode sequences using a naïve Bayesian classifier

    PubMed Central

    Porter, Teresita M; Gibson, Joel F; Shokralla, Shadi; Baird, Donald J; Golding, G Brian; Hajibabaei, Mehrdad

    2014-01-01

    Current methods to identify unknown insect (class Insecta) cytochrome c oxidase (COI barcode) sequences often rely on thresholds of distances that can be difficult to define, sequence similarity cut-offs, or monophyly. Some of the most commonly used metagenomic classification methods do not provide a measure of confidence for the taxonomic assignments they provide. The aim of this study was to use a naïve Bayesian classifier (Wang et al. Applied and Environmental Microbiology, 2007; 73: 5261) to automate taxonomic assignments for large batches of insect COI sequences such as data obtained from high-throughput environmental sequencing. This method provides rank-flexible taxonomic assignments with an associated bootstrap support value, and it is faster than the blast-based methods commonly used in environmental sequence surveys. We have developed and rigorously tested the performance of three different training sets using leave-one-out cross-validation, two field data sets, and targeted testing of Lepidoptera, Diptera and Mantodea sequences obtained from the Barcode of Life Data system. We found that type I error rates, incorrect taxonomic assignments with a high bootstrap support, were already relatively low but could be lowered further by ensuring that all query taxa are actually present in the reference database. Choosing bootstrap support cut-offs according to query length and summarizing taxonomic assignments to more inclusive ranks can also help to reduce error while retaining the maximum number of assignments. Additionally, we highlight gaps in the taxonomic and geographic representation of insects in public sequence databases that will require further work by taxonomists to improve the quality of assignments generated using any method.

  4. Overview of physics research on the TCV tokamak

    NASA Astrophysics Data System (ADS)

    Fasoli, A.; TCV Team

    2009-10-01

    The Tokamak à Configuration Variable (TCV) tokamak is equipped with high-power (4.5 MW), real-time-controllable EC systems and flexible shaping, and plays an important role in fusion research by broadening the parameter range of reactor relevant regimes, by investigating tokamak physics questions and by developing new control tools. Steady-state discharges are achieved, in which the current is entirely self-generated through the bootstrap mechanism, a fundamental ingredient for ITER steady-state operation. The discharge remains quiescent over several current redistribution times, demonstrating that a self-consistent, 'bootstrap-aligned' equilibrium state is possible. Electron internal transport barrier regimes sustained by EC current drive have also been explored. MHD activity is shown to be crucial in scenarios characterized by large and slow oscillations in plasma confinement, which in turn can be modified by small Ohmic current perturbations altering the barrier strength. In studies of the relation between anomalous transport and plasma shape, the observed dependences of the electron thermal diffusivity on triangularity (direct) and collisionality (inverse) are qualitatively reproduced by non-linear gyro-kinetic simulations and shown to be governed by TEM turbulence. Parallel SOL flows are studied for their importance for material migration. Flow profiles are measured using a reciprocating Mach probe by changing from lower to upper single-null diverted equilibria and shifting the plasmas vertically. The dominant, field-direction-dependent Pfirsch-Schlüter component is found to be in good agreement with theoretical predictions. A field-direction-independent component is identified and is consistent with flows generated by transient over-pressure due to ballooning-like interchange turbulence. Initial high-resolution infrared images confirm that ELMs have a filamentary structure, while fast, localized radiation measurements reveal that ELM activity first appears in the X-point region. Real time control techniques are currently being applied to EC multiple independent power supplies and beam launchers, e.g. to control the plasma current in fully non-inductive conditions, and the plasma elongation through current broadening by far-off-axis heating at constant shaping field.

  5. Bootstrapping N=2 chiral correlators

    NASA Astrophysics Data System (ADS)

    Lemos, Madalena; Liendo, Pedro

    2016-01-01

    We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.

  6. The effect of anisotropic heat transport on magnetic islands in 3-D configurations

    NASA Astrophysics Data System (ADS)

    Schlutt, M. G.; Hegna, C. C.

    2012-08-01

    An analytic theory of nonlinear pressure-induced magnetic island formation using a boundary layer analysis is presented. This theory extends previous work by including the effects of finite parallel heat transport and is applicable to general three dimensional magnetic configurations. In this work, particular attention is paid to the role of finite parallel heat conduction in the context of pressure-induced island physics. It is found that localized currents that require self-consistent deformation of the pressure profile, such as resistive interchange and bootstrap currents, are attenuated by finite parallel heat conduction when the magnetic islands are sufficiently small. However, these anisotropic effects do not change saturated island widths caused by Pfirsch-Schlüter current effects. Implications for finite pressure-induced island healing are discussed.

  7. Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
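
    The resampling idea sketched here can be illustrated in a few lines: draw many bootstrap samples of the cases, recompute the statistic of interest on each, and examine how much it moves across resamples. The Python sketch below does this for a Pearson correlation on toy data; the variable names, sample size, and number of replicates are arbitrary choices, not part of the paper.

      # Illustrative bootstrap check of result stability: resample subjects with
      # replacement and inspect the spread of the statistic across replicates.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 60
      x = rng.normal(size=n)
      y = 0.4 * x + rng.normal(size=n)          # toy data

      def pearson_r(a, b):
          return np.corrcoef(a, b)[0, 1]

      boot = np.empty(2000)
      for b in range(boot.size):
          idx = rng.integers(0, n, n)           # one bootstrap configuration of subjects
          boot[b] = pearson_r(x[idx], y[idx])

      print(f"sample r = {pearson_r(x, y):.3f}")
      print(f"bootstrap SD = {boot.std(ddof=1):.3f}")
      print("95% percentile interval:", np.percentile(boot, [2.5, 97.5]).round(3))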

  8. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…

  9. Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data

    ERIC Educational Resources Information Center

    Walker, David A.; Smith, Thomas J.

    2017-01-01

    Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…

  10. Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning

    ERIC Educational Resources Information Center

    Luntley, Michael

    2017-01-01

    This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…

  11. Higher curvature gravities, unlike GR, cannot be bootstrapped from their (usual) linearizations

    NASA Astrophysics Data System (ADS)

    Deser, S.

    2017-12-01

    We show that higher curvature order gravities, in particular the propagating quadratic curvature models, cannot be derived by self-coupling from their linear, flat space, forms, except through an unphysical version of linearization; only GR can. Separately, we comment on an early version of the self-coupling bootstrap.

  12. Methods for Estimating Uncertainty in PMF Solutions: Examples with Ambient Air and Water Quality Data and Guidance on Reporting PMF Results

    EPA Science Inventory

    The new version of EPA’s positive matrix factorization (EPA PMF) software, 5.0, includes three error estimation (EE) methods for analyzing factor analytic solutions: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP)...

  13. Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis

    USDA-ARS?s Scientific Manuscript database

    Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...

  14. How to Bootstrap a Human Communication System

    ERIC Educational Resources Information Center

    Fay, Nicolas; Arbib, Michael; Garrod, Simon

    2013-01-01

    How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…

  15. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap-Data Envelopment Analysis Approach.

    PubMed

    Li, Hao; Dong, Siping

    2015-01-01

    China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better support efficiency improvement and related decision making. © The Author(s) 2015.

  16. Weak percolation on multiplex networks

    NASA Astrophysics Data System (ADS)

    Baxter, Gareth J.; Dorogovtsev, Sergey N.; Mendes, José F. F.; Cellai, Davide

    2014-04-01

    Bootstrap percolation is a simple but nontrivial model. It has applications in many areas of science and has been explored on random networks for several decades. In single-layer (simplex) networks, it has been recently observed that bootstrap percolation, which is defined as an incremental process, can be seen as the opposite of pruning percolation, where nodes are removed according to a connectivity rule. Here we propose models of both bootstrap and pruning percolation for multiplex networks. We collectively refer to these two models with the concept of "weak" percolation, to distinguish them from the somewhat classical concept of ordinary ("strong") percolation. While the two models coincide in simplex networks, we show that they decouple when considering multiplexes, giving rise to a wealth of critical phenomena. Our bootstrap model constitutes the simplest example of a contagion process on a multiplex network and has potential applications in critical infrastructure recovery and information security. Moreover, we show that our pruning percolation model may provide a way to diagnose missing layers in a multiplex network. Finally, our analytical approach allows us to calculate critical behavior and characterize critical clusters.
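
    For readers unfamiliar with the process, the single-layer version of bootstrap percolation is easy to simulate: seed some active nodes, then repeatedly activate any node with at least k active neighbours until nothing changes. The sketch below (Python with networkx, toy parameters) shows only that baseline process; the multiplex rules studied in the paper generalize it across layers.

      # Bootstrap percolation on a single-layer Erdos-Renyi graph: nodes activate
      # once they have at least k active neighbours. Toy sizes; illustrative only.
      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(0)
      n, k = 2000, 2
      G = nx.erdos_renyi_graph(n, 4 / n, seed=1)            # mean degree ~ 4
      active = set(rng.choice(n, size=200, replace=False).tolist())

      changed = True
      while changed:
          changed = False
          for node in G.nodes:
              if node not in active and \
                 sum(1 for nb in G.neighbors(node) if nb in active) >= k:
                  active.add(node)
                  changed = True

      print(f"seeded 200 of {n} nodes; {len(active)} active after percolation")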

  17. Visuospatial bootstrapping: Binding useful visuospatial information during verbal working memory encoding does not require set-shifting executive resources.

    PubMed

    Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J

    2018-05-01

    Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.

  18. Understanding roles of E  ×  B flow and magnetic shear on the formation of internal and edge transport barriers using two-field bifurcation concept

    NASA Astrophysics Data System (ADS)

    Chatthong, B.; Onjun, T.

    2016-01-01

    A set of heat and particle transport equations with the inclusion of E  ×  B flow and magnetic shear is used to understand the formation and behaviors of edge transport barriers (ETBs) and internal transport barriers (ITBs) in tokamak plasmas based on two-field bifurcation concept. A simple model that can describe the E  ×  B flow shear and magnetic shear effect in tokamak plasma is used for anomalous transport suppression with the effect of bootstrap current included. Consequently, conditions and formations of ETB and ITB can be visualized and studied. It can be seen that the ETB formation depends sensitively on the E  ×  B flow shear suppression with small dependence on the magnetic shear suppression. However, the ITB formation depends sensitively on the magnetic shear suppression with a small dependence on the E  ×  B flow shear suppression. Once the H-mode is achieved, the s-curve bifurcation diagram is modified due to an increase of bootstrap current at the plasma edge, resulting in reductions of both L-H and H-L transition thresholds with stronger hysteresis effects. It is also found that both ITB and ETB widths appear to be governed by heat or particle sources and the location of the current peaking. In addition, at a marginal flux just below the L-H threshold, a small perturbation in terms of heat or density fluctuation can result in a transition, which can remain after the perturbation is removed due to the hysteresis effect.

  19. Role of Deep Convection in Establishing the Isotopic Composition of Water Vapor in the Tropical Transition Layer

    NASA Technical Reports Server (NTRS)

    Smith, Jamison A.; Ackerman, Andrew S.; Jensen, Eric J.; Toon, Owen B.

    2006-01-01

    The transport of H2O and HDO within deep convection is investigated with 3-D large eddy simulations (LES) using bin microphysics. The lofting and sublimation of HDO-rich ice invalidate the Rayleigh fractionation model of isotopologue distribution within deep convection. Bootstrapping the correlation of the ratio of HDO to H2O (δD) to the water vapor mixing ratio (q_v) through a sequence of convective events produced non-Rayleigh correlations resembling observations. These results support two mechanisms for stratospheric entry. Deep convection can inject air with water vapor of stratospheric character directly into the tropical transition layer (TTL). Alternatively, moister air detraining from convection may be dehydrated via cirrus formation in the TTL to produce stratospheric water vapor. Significant production of subsaturated air in the TTL via convective dehydration is not observed in these simulations, nor is it necessary to resolve the stratospheric isotope paradox.

  20. Diamagnetic drift effects on the low-n magnetohydrodynamic modes at the high mode pedestal with plasma rotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.

    2014-06-15

    The diamagnetic drift effects on the low-n magnetohydrodynamic instabilities at the high-mode (H-mode) pedestal are investigated in this paper with the inclusion of bootstrap current for equilibrium and rotation effects for stability, where n is the toroidal mode number. The AEGIS (Adaptive EiGenfunction Independent Solutions) code [L. J. Zheng and M. T. Kotschenreuther, J. Comput. Phys. 211 (2006)] is extended to include the diamagnetic drift effects. This can be viewed as the lowest-order approximation of the finite Larmor radius effects, in consideration of the pressure-gradient steepness at the pedestal. H-mode discharges at the Joint European Torus are reconstructed numerically using the VMEC code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)], with bootstrap current taken into account. Generally speaking, the diamagnetic drift effects are stabilizing. Our results show that the effectiveness of diamagnetic stabilization depends sensitively on the safety factor value (q_s) at the safety-factor reversal or plateau region. The diamagnetic stabilization is weaker when q_s is larger than an integer, and stronger when q_s is smaller than, or only slightly larger than, an integer. We also find that the diamagnetic drift effects depend sensitively on the rotation direction: the diamagnetic stabilization in the co-rotation case is stronger than in the counter-rotation case, with respect to the ion diamagnetic drift direction.

  1. Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference

    USGS Publications Warehouse

    Olea, R.A.; Pardo-Iguzquiza, E.

    2011-01-01

    The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first is a synthetic, isotropic, exhaustive sample following a normal distribution; the second is also synthetic but follows a non-Gaussian random field; and the third is an empirical sample consisting of actual rain gauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
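
    A common way to realize this kind of generalized bootstrap is to decorrelate the sample with the LU (Cholesky) factor of a fitted covariance model, bootstrap the decorrelated residuals, and recorrelate them before recomputing the empirical semivariogram. The Python sketch below follows that recipe on synthetic data with an assumed exponential covariance model; it is one variant of the procedure, not the authors' exact algorithm.

      # Generalized bootstrap for the semivariogram (sketch): decorrelate with the
      # Cholesky factor of a fitted covariance model, resample, recorrelate, and
      # build percentile intervals for gamma(h). Synthetic data; illustrative only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100
      coords = rng.uniform(0, 10, (n, 2))
      d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

      def exp_cov(h, sill=1.0, rang=3.0):
          return sill * np.exp(-h / rang)              # assumed covariance model

      def gamma_hat(z, d, lag, tol=0.5):
          i, j = np.where((np.abs(d - lag) < tol) & (d > 0))
          return 0.5 * np.mean((z[i] - z[j]) ** 2)

      L = np.linalg.cholesky(exp_cov(d) + 1e-10 * np.eye(n))
      z = L @ rng.standard_normal(n)                   # one "observed" sample
      w = np.linalg.solve(L, z)                        # decorrelated residuals

      lags = [1.0, 2.0, 4.0]
      boot = np.array([[gamma_hat(L @ rng.choice(w, n, replace=True), d, h)
                        for h in lags] for _ in range(500)])
      for c, h in enumerate(lags):
          lo, hi = np.percentile(boot[:, c], [2.5, 97.5])
          print(f"lag {h}: gamma = {gamma_hat(z, d, h):.2f},"
                f" 95% bootstrap CI ({lo:.2f}, {hi:.2f})")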

  2. Predicting Postsurgical Satisfaction in Adolescents With Idiopathic Scoliosis: The Role of Presurgical Functioning and Expectations.

    PubMed

    Sieberg, Christine B; Manganella, Juliana; Manalo, Gem; Simons, Laura E; Hresko, M Timothy

    2017-12-01

    There is a need to better assess patient satisfaction and surgical outcomes. The purpose of the current study is to identify how preoperative expectations can impact postsurgical satisfaction among youth with adolescent idiopathic scoliosis undergoing spinal fusion surgery. The present study includes patients with adolescent idiopathic scoliosis undergoing spinal fusion surgery enrolled in a prospective, multicentered registry examining postsurgical outcomes. The Scoliosis Research Society Questionnaire-Version 30, which assesses pain, self-image, mental health, and satisfaction with management, along with the Spinal Appearance Questionnaire, which measures surgical expectations was administered to 190 patients before surgery and 1 and 2 years postoperatively. Regression analyses with bootstrapping (with n=5000 bootstrap samples) were conducted with 99% bias-corrected confidence intervals to examine the extent to which preoperative expectations for spinal appearance mediated the relationship between presurgical mental health and pain and 2-year postsurgical satisfaction. Results indicate that preoperative mental health, pain, and expectations are predictive of postsurgical satisfaction. With the shifting health care system, physicians may want to consider patient mental health, pain, and expectations before surgery to optimize satisfaction and ultimately improve clinical care and patient outcomes. Level I-prognostic study.
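
    The analysis type described, a bootstrapped indirect effect with bias-corrected confidence limits, can be sketched as follows in Python (numpy/scipy). The toy variables only stand in for the study's measures, and the simple bias-corrected (BC) adjustment shown is one common variant of the procedure, not necessarily the exact routine used by the authors.

      # Bias-corrected bootstrap CI for an indirect (mediation) effect a*b,
      # with 5,000 resamples and 99% limits as in the study. Toy data only.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      n = 190
      x = rng.normal(size=n)                        # e.g. presurgical pain/mental health
      m = 0.5 * x + rng.normal(size=n)              # e.g. expectations (mediator)
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # e.g. postsurgical satisfaction

      def indirect(x, m, y):
          a = np.polyfit(x, m, 1)[0]                # slope of m ~ x
          X = np.column_stack([np.ones_like(x), m, x])
          b = np.linalg.lstsq(X, y, rcond=None)[0][1]   # slope of y ~ m, controlling x
          return a * b

      est = indirect(x, m, y)
      boot = np.empty(5000)
      for i in range(boot.size):
          idx = rng.integers(0, n, n)
          boot[i] = indirect(x[idx], m[idx], y[idx])

      z0 = norm.ppf((boot < est).mean())            # bias-correction constant
      alpha = 0.01                                  # 99% interval
      lo_p, hi_p = norm.cdf(2 * z0 + norm.ppf([alpha / 2, 1 - alpha / 2]))
      print("indirect effect:", round(est, 3))
      print("99% BC bootstrap CI:", np.percentile(boot, [100 * lo_p, 100 * hi_p]).round(3))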

  3. Standard errors and confidence intervals for variable importance in random forest regression, classification, and survival.

    PubMed

    Ishwaran, Hemant; Lu, Min

    2018-06-04

    Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
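
    The core idea, estimating the sampling variability of variable importance from many small subsamples, can be sketched with scikit-learn's permutation importance standing in for VIMP. The m-out-of-n rescaling below is a simplified version of the estimators studied in the paper, and all data and tuning values are toy choices.

      # Subsampling variance estimate for random-forest variable importance
      # (sketch): compute importance on subsamples of size m < n and rescale
      # the subsample variance by m/n. Simplified relative to the paper.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(0)
      n, p = 300, 5
      X = rng.normal(size=(n, p))
      y = 2 * X[:, 0] + X[:, 1] + rng.normal(size=n)

      def vimp(X, y):
          rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
          return permutation_importance(rf, X, y, n_repeats=5,
                                        random_state=0).importances_mean

      theta_full = vimp(X, y)
      m, B = n // 2, 40                          # subsample size and replicates
      sub = np.empty((B, p))
      for b in range(B):
          idx = rng.choice(n, size=m, replace=False)
          sub[b] = vimp(X[idx], y[idx])

      se = np.sqrt((m / n) * sub.var(axis=0, ddof=1))   # m-out-of-n rescaling
      for j in range(p):
          print(f"x{j}: VIMP = {theta_full[j]:.3f} +/- {1.96 * se[j]:.3f}")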

  4. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies

    PubMed Central

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-01-01

    Abstract Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476

  5. Peace of Mind, Academic Motivation, and Academic Achievement in Filipino High School Students.

    PubMed

    Datu, Jesus Alfonso D

    2017-04-09

    Recent literature has recognized the advantageous role of low-arousal positive affect such as feelings of peacefulness and internal harmony in collectivist cultures. However, limited research has explored the benefits of low-arousal affective states in the educational setting. The current study examined the link of peace of mind (PoM) to academic motivation (i.e., amotivation, controlled motivation, and autonomous motivation) and academic achievement among 525 Filipino high school students. Findings revealed that PoM was positively associated with academic achievement β = .16, p < .05, autonomous motivation β = .48, p < .001, and controlled motivation β = .25, p < .01. As expected, PoM was negatively related to amotivation β = -.19, p < .05, and autonomous motivation was positively associated with academic achievement β = .52, p < .01. Furthermore, the results of bias-corrected bootstrap analyses at 95% confidence interval based on 5,000 bootstrapped resamples demonstrated that peace of mind had an indirect influence on academic achievement through the mediating effects of autonomous motivation. In terms of the effect sizes, the findings showed that PoM explained about 1% to 18% of the variance in academic achievement and motivation. The theoretical and practical implications of the results are elucidated.

  6. Performance of an SOI Boot-Strapped Full-Bridge MOSFET Driver, Type CHT-FBDR, under Extreme Temperatures

    NASA Technical Reports Server (NTRS)

    Patterson, Richard; Hammoud, Ahmad

    2009-01-01

    Electronic systems designed for use in deep space and planetary exploration missions are expected to encounter extreme temperatures and wide thermal swings. Silicon-based devices are limited in their wide-temperature capability and usually require extra measures, such as cooling or heating mechanisms, to provide an adequate ambient temperature for proper operation. Silicon-On-Insulator (SOI) technology, on the other hand, has lately been gaining widespread use in applications where high temperatures are encountered. Due to their inherent design, SOI-based integrated circuit chips are able to operate at temperatures higher than those of silicon devices by virtue of reducing leakage currents, eliminating parasitic junctions, and limiting internal heating. In addition, SOI devices provide faster switching, consume less power, and offer improved radiation tolerance. Very little data, however, exist on the performance of such devices and circuits under cryogenic temperatures. In this work, the performance of an SOI bootstrapped, full-bridge driver integrated circuit was evaluated under extreme temperatures and thermal cycling. The investigations were carried out to establish a baseline on functionality and to determine the suitability of this device for use in space exploration missions under extreme temperature conditions.

  7. Pulling Econometrics Students up by Their Bootstraps

    ERIC Educational Resources Information Center

    O'Hara, Michael E.

    2014-01-01

    Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…

  8. Accuracy assessment of percent canopy cover, cover type, and size class

    Treesearch

    H. T. Schreuder; S. Bain; R. C. Czaplewski

    2003-01-01

    Truth for vegetation cover percent and type is obtained from very large-scale photography (VLSP), stand structure as measured by size classes, and vegetation types from a combination of VLSP and ground sampling. We recommend using the Kappa statistic with bootstrap confidence intervals for overall accuracy, and similarly bootstrap confidence intervals for percent...

  9. Finding One's Meaning: A Test of the Relation between Quantifiers and Integers in Language Development

    ERIC Educational Resources Information Center

    Barner, David; Chow, Katherine; Yang, Shu-Ju

    2009-01-01

    We explored children's early interpretation of numerals and linguistic number marking, in order to test the hypothesis (e.g., Carey (2004). Bootstrapping and the origin of concepts. "Daedalus", 59-68) that children's initial distinction between "one" and other numerals (i.e., "two," "three," etc.) is bootstrapped from a prior distinction between…

  10. A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu

    2007-01-01

    Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…

  11. A Resampling Analysis of Federal Family Assistance Program Quality Control Data: An Application of the Bootstrap.

    ERIC Educational Resources Information Center

    Hand, Michael L.

    1990-01-01

    Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…

  12. Calculating Confidence Intervals for Regional Economic Impacts of Recreation by Bootstrapping Visitor Expenditures

    Treesearch

    Donald B.K. English

    2000-01-01

    In this paper I use bootstrap procedures to develop confidence intervals for estimates of total industrial output generated per thousand tourist visits. Mean expenditures from replicated visitor expenditure data included weights to correct for response bias. Impacts were estimated with IMPLAN. Ninety percent interval endpoints were 6 to 16 percent above or below the...

  13. Comparison of Methods for Estimating Low Flow Characteristics of Streams

    USGS Publications Warehouse

    Tasker, Gary D.

    1987-01-01

    Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothesized distributions (log-Pearson III and Weibull) had lower mean square errors than did the G. E. P. Box-D. R. Cox transformation method or the Log-W. C. Boughton method, which is based on a fit of plotting positions.
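
    The comparison strategy, resampling the low-flow record and applying each estimator to every resample, can be illustrated with scipy on synthetic annual 7-day minima. The log-Pearson III and Weibull fits below follow the usual textbook recipes; the bootstrap spread about the full-record estimate only approximates the mean-square-error comparison in the paper, and all data are simulated.

      # Bootstrap comparison of two 7Q10 estimators (sketch): resample the annual
      # 7-day-minimum series, apply each estimator, and compare their spread
      # about the full-record estimate. Illustrative only.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      q7 = stats.lognorm.rvs(s=0.6, scale=20, size=40, random_state=1)  # toy annual minima

      def q7_10_lp3(x):
          """Log-Pearson III: fit Pearson III to log10 flows, take the 0.1 quantile."""
          skew, loc, scale = stats.pearson3.fit(np.log10(x))
          return 10 ** stats.pearson3.ppf(0.1, skew, loc, scale)

      def q7_10_weibull(x):
          c, loc, scale = stats.weibull_min.fit(x, floc=0)
          return stats.weibull_min.ppf(0.1, c, loc, scale)

      ref = {"LP3": q7_10_lp3(q7), "Weibull": q7_10_weibull(q7)}
      boot = {"LP3": [], "Weibull": []}
      for _ in range(200):
          xb = rng.choice(q7, size=q7.size, replace=True)
          boot["LP3"].append(q7_10_lp3(xb))
          boot["Weibull"].append(q7_10_weibull(xb))

      for name, vals in boot.items():
          vals = np.array(vals)
          rmse = np.sqrt(np.mean((vals - ref[name]) ** 2))
          print(f"{name}: 7Q10 = {ref[name]:.2f}, bootstrap RMSE about it = {rmse:.2f}")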

  14. H-mode fueling optimization with the supersonic deuterium jet in NSTX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soukhanovskii, V A; Bell, M G; Bell, R E

    2008-06-18

    High-performance, long-pulse 0.7-1.2 MA 6-7 MW NBI-heated small-ELM H-mode plasma discharges are developed in the National Spherical Torus Experiment (NSTX) as prototypes for confinement and current drive extrapolations to future spherical tori. It is envisioned that innovative lithium coating techniques for H-mode density pumping and a supersonic deuterium jet for plasma refueling will be used to achieve the low pedestal collisionality and low n_e/n_G fractions (0.3-0.6), both of which are essential conditions for maximizing the non-inductive (bootstrap and beam driven) current fractions. The low field side supersonic gas injector (SGI) on NSTX consists of a small converging-diverging graphite Laval nozzle and a piezoelectric gas valve. The nozzle is capable of producing a deuterium jet with Mach number M ≤ 4, estimated gas density at the nozzle exit n ≤ 5 × 10²³ m⁻³, estimated temperature T ≥ 70 K, and flow velocity v = 2.4 km/s. The nozzle Reynolds number is Re ≈ 6000. The nozzle and the valve are enclosed in a protective carbon fiber composite shroud and mounted on a movable probe at a midplane port location. Despite the beneficial L-mode fueling experience with supersonic jets in limiter tokamaks, there is limited experience with fueling of high-performance H-mode divertor discharges and the associated density, MHD stability, and MARFE limits. In initial supersonic deuterium jet fueling experiments in NSTX, reliable H-mode access, a low NBI power threshold, P_LH ≤ 2 MW, and a high fueling efficiency (0.1-0.4) have been demonstrated. Progress has also been made toward better control of the injected fueling gas by decreasing the uncontrolled high field side (HFS) injector fueling rate by up to 95% and complementing it with the supersonic jet fueling. These results motivated recent upgrades to the SGI gas delivery and control systems. The new SGI-Upgrade (SGI-U) capabilities include multi-pulse ms-scale controls and a reservoir gas pressure up to P₀ = 5000 Torr. In this paper we summarize recent progress toward optimization of H-mode fueling in NSTX using the SGI-U.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Susan L.; Liu, H. Helen; Wang, Shulian

    Purpose: The aim of this study was to investigate the effect of radiation dose distribution in the lung on the risk of postoperative pulmonary complications among esophageal cancer patients. Methods and Materials: We analyzed data from 110 patients with esophageal cancer treated with concurrent chemoradiotherapy followed by surgery at our institution from 1998 to 2003. The endpoint for analysis was postsurgical pneumonia or acute respiratory distress syndrome. Dose-volume histograms (DVHs) and dose-mass histograms (DMHs) for the whole lung were used to fit normal-tissue complication probability (NTCP) models, and the quality of the fits was compared using bootstrap analysis. Results: Normal-tissue complication probability modeling identified that the risk of postoperative pulmonary complications was most significantly associated with small absolute volumes of lung spared from doses ≥5 Gy (VS5), that is, exposed to doses <5 Gy. However, bootstrap analysis found no significant difference between the quality of this model and fits based on other dosimetric parameters, including mean lung dose, effective dose, and relative volume of lung receiving ≥5 Gy, probably because of correlations among these factors. The choice of DVH vs. DMH or the use of fractionation correction did not significantly affect the results of the NTCP modeling. The parameter values estimated for the Lyman NTCP model were as follows (with 95% confidence intervals in parentheses): n = 1.85 (0.04, ∞), m = 0.55 (0.22, 1.02), and D₅ = 17.5 Gy (9.4 Gy, 102 Gy). Conclusions: In this cohort of esophageal cancer patients, several dosimetric parameters including mean lung dose, effective dose, and absolute volume of lung receiving <5 Gy provided similar descriptions of the risk of postoperative pulmonary complications as a function of radiation dose distribution in the lung.

  16. Improved human observer performance in digital reconstructed radiograph verification in head and neck cancer radiotherapy.

    PubMed

    Sturgeon, Jared D; Cox, John A; Mayo, Lauren L; Gunn, G Brandon; Zhang, Lifei; Balter, Peter A; Dong, Lei; Awan, Musaddiq; Kocak-Uzel, Esengul; Mohamed, Abdallah Sherif Radwan; Rosenthal, David I; Fuller, Clifton David

    2015-10-01

    Digitally reconstructed radiographs (DRRs) are routinely used as an a priori reference for setup correction in radiotherapy. The spatial resolution of DRRs may be improved to reduce setup error in fractionated radiotherapy treatment protocols. The influence of finer CT slice thickness reconstruction (STR) and resultant increased resolution DRRs on physician setup accuracy was prospectively evaluated. Four head and neck patient CT-simulation images were acquired and used to create DRR cohorts by varying STRs at 0.5, 1, 2, 2.5, and 3 mm. DRRs were displaced relative to a fixed isocenter using 0-5 mm random shifts in the three cardinal axes. Physician observers reviewed DRRs of varying STRs and displacements and then aligned reference and test DRRs replicating daily KV imaging workflow. A total of 1,064 images were reviewed by four blinded physicians. Observer errors were analyzed using nonparametric statistics (Friedman's test) to determine whether STR cohorts had detectably different displacement profiles. Post hoc bootstrap resampling was applied to evaluate potential generalizability. The observer-based trial revealed a statistically significant difference between cohort means for observer displacement vector error ([Formula: see text]) and for [Formula: see text]-axis [Formula: see text]. Bootstrap analysis suggests a 15% gain in isocenter translational setup error with reduction of STR from 3 mm to ≤2 mm, though interobserver variance was a larger feature than STR-associated measurement variance. Higher resolution DRRs generated using finer CT scan STR resulted in improved observer performance at shift detection and could decrease operator-dependent geometric error. Ideally, CT STRs ≤2 mm should be utilized for DRR generation in the head and neck.

  17. The effect of white matter hyperintensities on verbal memory: Mediation by temporal lobe atrophy.

    PubMed

    Swardfager, Walter; Cogo-Moreira, Hugo; Masellis, Mario; Ramirez, Joel; Herrmann, Nathan; Edwards, Jodi D; Saleem, Mahwesh; Chan, Parco; Yu, Di; Nestor, Sean M; Scott, Christopher J M; Holmes, Melissa F; Sahlas, Demetrios J; Kiss, Alexander; Oh, Paul I; Strother, Stephen C; Gao, Fuqiang; Stefanovic, Bojana; Keith, Julia; Symons, Sean; Swartz, Richard H; Lanctôt, Krista L; Stuss, Donald T; Black, Sandra E

    2018-02-20

    To determine the relationship between white matter hyperintensities (WMH) presumed to indicate disease of the cerebral small vessels, temporal lobe atrophy, and verbal memory deficits in Alzheimer disease (AD) and other dementias. We recruited groups of participants with and without AD, including strata with extensive WMH and minimal WMH, into a cross-sectional proof-of-principle study (n = 118). A consecutive case series from a memory clinic was used as an independent validation sample (n = 702; Sunnybrook Dementia Study; NCT01800214). We assessed WMH volume and left temporal lobe atrophy (measured as the brain parenchymal fraction) using structural MRI and verbal memory using the California Verbal Learning Test. Using path modeling with an inferential bootstrapping procedure, we tested an indirect effect of WMH on verbal recall that depends sequentially on temporal lobe atrophy and verbal learning. In both samples, WMH predicted poorer verbal recall, specifically due to temporal lobe atrophy and poorer verbal learning (proof-of-principle -1.53, 95% bootstrap confidence interval [CI] -2.45 to -0.88; and confirmation -0.66, 95% CI [-0.95 to -0.41] words). This pathway was significant in subgroups with (-0.20, 95% CI [-0.38 to -0.07] words, n = 363) and without (-0.71, 95% CI [-1.12 to -0.37] words, n = 339) AD. Via the identical pathway, WMH contributed to deficits in recognition memory (-1.82%, 95% CI [-2.64% to -1.11%]), a sensitive and specific sign of AD. Across dementia syndromes, WMH contribute indirectly to verbal memory deficits considered pathognomonic of Alzheimer disease, specifically by contributing to temporal lobe atrophy. © 2018 American Academy of Neurology.

  18. Population pharmacokinetics and safety of eptifibatide in healthy Chinese volunteers and simulations on the dose regimens approved for a Western population.

    PubMed

    Wang, Xi-Pei; Zhou, Zhi-Ling; Yang, Min; Mai, Li-Ping; Zheng, Zhi-Jie; He, Guo-Dong; Wu, Yue-Heng; Lin, Qiu-Xiong; Shan, Zhi-Xin; Yu, Xi-Yong

    2015-08-01

    This study was designed to evaluate the pharmacokinetics (PK) and safety of eptifibatide in healthy Chinese volunteers and provide information for further study in the Chinese population. 30 healthy volunteers (15 male) were enrolled in the study and divided into three dose groups (45 µg x kg⁻¹, 90 µg x kg⁻¹, and 180 µg x kg⁻¹). Plasma and urine samples were drawn after one single-bolus administration and measured by LC-MS/MS. The plasma and urine data were analyzed simultaneously by the population approach using the NONMEM software and evaluated by the visual predictive check (VPC) and bootstrapping. The PK profiles of dose regimens approved for a Western population were simulated in the Chinese population. A two-compartment model adequately described the PK profiles of eptifibatide. The clearance (CL) and the distribution volume (V₁) of the central compartment were 0.128 L x h⁻¹ x kg⁻¹ and 0.175 L x kg⁻¹, respectively. The clearance (Q) and volume (V₂) of the peripheral compartment were 0.0988 L x h⁻¹ x kg⁻¹ and 0.147 L x kg⁻¹, respectively. The elimination fraction from plasma to urine (F₀) was 17.2%. No covariates were found to have a significant effect. Inter-individual variabilities were all within 33.9%. The VPC plots and bootstrap results indicated good precision and prediction of the model. The simulations of the approved regimens in the Chinese population showed much lower steady-state concentrations than the target concentration obtained from the Western clinical trials. No severe safety events were found in this study. The PK model of eptifibatide was established and could provide PK information for further studies in the Chinese population.

  19. Self-consistent modeling of the dynamic evolution of magnetic island growth in the presence of stabilizing electron-cyclotron current drive

    NASA Astrophysics Data System (ADS)

    Chatziantonaki, Ioanna; Tsironis, Christos; Isliker, Heinz; Vlahos, Loukas

    2013-11-01

    The most promising technique for the control of neoclassical tearing modes in tokamak experiments is the compensation of the missing bootstrap current with electron-cyclotron current drive (ECCD). In this frame, the dynamics of magnetic islands has been studied extensively in terms of the modified Rutherford equation (MRE), including the presence of a current drive, either analytically described or computed by numerical methods. In this article, a self-consistent model for the dynamic evolution of the magnetic island and the driven current is derived, which takes into account the island's magnetic topology and its effect on the current drive. The model combines the MRE with a ray-tracing approach to electron-cyclotron wave propagation and absorption. Numerical results exhibit a decrease in the time required for complete stabilization with respect to the conventional computation (which does not take the island geometry into account); this difference increases with increasing initial island size and radial misalignment of the deposition.
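
    A schematic version of the island-width evolution being described can be written down by integrating a modified Rutherford equation with a bootstrap drive term and an ECCD sink term. The Python sketch below uses arbitrary coefficients and a crude deposition-width efficiency factor, and it omits the ray-tracing and island-topology coupling that is the point of the article; it only illustrates how an ECCD term changes the island dynamics.

      # Schematic modified-Rutherford-equation (MRE) integration with a bootstrap
      # drive term and an ECCD sink. Coefficients and functional forms are
      # illustrative only; no ray tracing or island-topology coupling is included.
      from scipy.integrate import solve_ivp

      tau_R, r_s = 100.0, 1.0          # resistive time, rational-surface radius (a.u.)
      delta_prime = -2.0               # classical tearing index (stabilizing)
      a_bs, w_d = 4.0, 0.5             # bootstrap drive strength, small-island cutoff
      a_cd, w_dep = 6.0, 1.0           # ECCD strength, deposition width

      def eta_cd(w):
          # crude efficiency factor: ECCD is less effective when the island is
          # much narrower or much wider than the deposition profile
          return (w / w_dep) / (1.0 + (w / w_dep) ** 2)

      def dwdt(t, w, eccd_on):
          drive = a_bs * w / (w ** 2 + w_d ** 2)
          sink = a_cd * eta_cd(w) if eccd_on else 0.0
          return (r_s / tau_R) * (delta_prime * r_s + drive - sink)

      def quenched(t, w, eccd_on):     # stop once the island is essentially gone
          return w[0] - 0.02
      quenched.terminal = True

      for eccd_on in (False, True):
          sol = solve_ivp(dwdt, (0, 2000), [0.3], args=(eccd_on,),
                          events=quenched, max_step=5.0)
          print(f"ECCD {'on ' if eccd_on else 'off'}: "
                f"final island width = {sol.y[0, -1]:.2f}")

    With these toy coefficients the island saturates at a finite width without ECCD and decays away with it, which is the qualitative behaviour the MRE-based analyses describe.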

  20. Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.

    PubMed

    Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt

    2016-05-06

    We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.

  1. Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.

    PubMed

    Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta

    2016-10-27

    This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location, and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
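
    As a rough illustration of the mechanics, the sketch below computes input-oriented VRS DEA scores with scipy's linear-programming solver and applies a naive resampling bias correction. The naive bootstrap is known to be inconsistent for DEA frontiers, so this is only a mechanical sketch; the article relies on the homogeneous and two-stage double (Simar-Wilson) bootstrap procedures. All data and parameter choices are toy values.

      # Input-oriented VRS DEA plus a naive bootstrap bias correction (sketch).
      # The proper procedure is the Simar-Wilson smoothed bootstrap; this toy
      # version only resamples DMUs with replacement to show the mechanics.
      import numpy as np
      from scipy.optimize import linprog

      def dea_vrs(x0, y0, X, Y):
          """Input-oriented VRS efficiency of unit (x0, y0) against reference
          inputs X (n x p) and outputs Y (n x q)."""
          n = X.shape[0]
          c = np.r_[1.0, np.zeros(n)]                    # minimise theta
          A_ub = np.vstack([
              np.c_[-x0.reshape(-1, 1), X.T],            # sum(l*x_j) <= theta*x0
              np.c_[np.zeros((Y.shape[1], 1)), -Y.T],    # sum(l*y_j) >= y0
          ])
          b_ub = np.r_[np.zeros(X.shape[1]), -y0]
          A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: sum(lambda) = 1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(None, None)] + [(0, None)] * n, method="highs")
          return res.x[0] if res.success else np.nan     # resampled LP can be infeasible

      def naive_bootstrap_dea(X, Y, B=200, seed=0):
          rng = np.random.default_rng(seed)
          n = X.shape[0]
          theta = np.array([dea_vrs(X[i], Y[i], X, Y) for i in range(n)])
          boot = np.empty((B, n))
          for b in range(B):
              idx = rng.integers(0, n, n)                # resampled reference set
              boot[b] = [dea_vrs(X[i], Y[i], X[idx], Y[idx]) for i in range(n)]
          bias = np.nanmean(boot, axis=0) - theta
          return theta, theta - bias                     # raw and bias-corrected scores

      rng = np.random.default_rng(1)                     # toy data: 20 units, 2 inputs, 1 output
      X = rng.uniform(1, 10, (20, 2))
      Y = (X.sum(axis=1) * rng.uniform(0.5, 1.0, 20)).reshape(-1, 1)
      raw, corrected = naive_bootstrap_dea(X, Y)
      print(np.round(raw, 3))
      print(np.round(corrected, 3))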

  2. Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes

    NASA Astrophysics Data System (ADS)

    Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.

    2017-12-01

    Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the krigged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more kriging predictions (9/10) compared to GBM (4/10). Prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various type of waters to migratory animals, food products, and humans.

  3. Bootstrapping rapidity anomalous dimensions for transverse-momentum resummation

    DOE PAGES

    Li, Ye; Zhu, Hua Xing

    2017-01-11

    The soft function relevant for transverse-momentum resummation in Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. Furthermore, an intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.

  4. Reliability of confidence intervals calculated by bootstrap and classical methods using the FIA 1-ha plot design

    Treesearch

    H. T. Schreuder; M. S. Williams

    2000-01-01

    In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...

  5. Morphological Cues vs. Number of Nominals in Learning Verb Types in Turkish: The Syntactic Bootstrapping Mechanism Revisited

    ERIC Educational Resources Information Center

    Ural, A. Engin; Yuret, Deniz; Ketrez, F. Nihan; Kocbas, Dilara; Kuntay, Aylin C.

    2009-01-01

    The syntactic bootstrapping mechanism of verb learning was evaluated against child-directed speech in Turkish, a language with rich morphology, nominal ellipsis and free word order. Machine-learning algorithms were run on transcribed caregiver speech directed to two Turkish learners (one hour every two weeks between 0;9 and 1;10) of different…

  6. A Comparison of the Bootstrap-F, Improved General Approximation, and Brown-Forsythe Multivariate Approaches in a Mixed Repeated Measures Design

    ERIC Educational Resources Information Center

    Seco, Guillermo Vallejo; Izquierdo, Marcelino Cuesta; Garcia, M. Paula Fernandez; Diez, F. Javier Herrero

    2006-01-01

    The authors compare the operating characteristics of the bootstrap-F approach, a direct extension of the work of Berkovits, Hancock, and Nevitt, with Huynh's improved general approximation (IGA) and the Brown-Forsythe (BF) multivariate approach in a mixed repeated measures design when normality and multisample sphericity assumptions do not hold.…

  7. Sample-based estimation of tree species richness in a wet tropical forest compartment

    Treesearch

    Steen Magnussen; Raphael Pelissier

    2007-01-01

    Petersen's capture-recapture ratio estimator and the well-known bootstrap estimator are compared across a range of simulated low-intensity simple random sampling with fixed-area plots of 100 m² in a rich wet tropical forest compartment with 93 tree species in the Western Ghats of India. Petersen's ratio estimator was uniformly superior to the bootstrap...

  8. Common Ground between Form and Content: The Pragmatic Solution to the Bootstrapping Problem

    ERIC Educational Resources Information Center

    Oller, John W.

    2005-01-01

    The frame of reference for this article is second or foreign language (L2 or FL) acquisition, but the pragmatic bootstrapping hypothesis applies to language processing and acquisition in any context or modality. It is relevant to teaching children to read. It shows how connections between target language surface forms and their content can be made…

  9. The Bacterial Gene IfpA Influences the Potent Induction of Calcitonin Receptor and Osteoclast-Related Genes in Burkholderia Pseudomallei-Induced TRAP-Positive Multinucleated Giant Cells

    DTIC Science & Technology

    2006-06-13

    with arithmetic mean (UPGMA) using random tie breaking and uncorrected pairwise distances in MacVector 7.0 (Oxford Molecular). Numbers on branches...denote the UPGMA bootstrap percentage using a highly stringent number (1000) of replications (Felsenstein, 1985). All bootstrap values are 50%, as shown

  10. A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn

    2006-01-01

    A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…

  11. Multilingual Phoneme Models for Rapid Speech Processing System Development

    DTIC Science & Technology

    2006-09-01

    processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA)...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...

  12. An inferential study of the phenotype for the chromosome 15q24 microdeletion syndrome: a bootstrap analysis

    PubMed Central

    Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco

    2016-01-01

    In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion for the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotypes and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentile method, we found wide variability in the prevalence of the different phenotypes (3–100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35–3.10 Mb]). Compared with the 2012 review, our work, which extended the literature search by 45 months, found differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314

  13. Bootstrap imputation with a disease probability model minimized bias from misclassification due to administrative database codes.

    PubMed

    van Walraven, Carl

    2017-04-01

    Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and of its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
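
    The winning strategy can be sketched in a few lines: within each bootstrap replicate, draw each patient's condition status from the validated model's predicted probability rather than from the diagnostic code, then summarize prevalence and associations across replicates. The Python example below uses simulated data and a made-up probability model purely for illustration.

      # Bootstrap imputation of an imperfectly coded condition (sketch): inside each
      # bootstrap replicate, impute disease status from a validated model's
      # predicted probabilities, then summarize across replicates. Toy data only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000
      age = rng.normal(70, 10, n)
      true_status = rng.random(n) < 1 / (1 + np.exp(-(-6 + 0.07 * age)))
      p_model = 1 / (1 + np.exp(-(-6.2 + 0.072 * age)))      # "validated" model's P(disease)

      B = 500
      prev = np.empty(B)
      assoc = np.empty(B)
      for b in range(B):
          idx = rng.integers(0, n, n)                 # bootstrap the patients
          status = rng.random(n) < p_model[idx]       # impute status from the model
          prev[b] = status.mean()
          assoc[b] = age[idx][status].mean() - age[idx][~status].mean()

      lo, hi = np.percentile(prev, [2.5, 97.5])
      print(f"imputed prevalence = {prev.mean():.3f} (95% CI {lo:.3f}-{hi:.3f})")
      print(f"mean age difference, disease vs no disease = {assoc.mean():.1f} years")
      print(f"true prevalence in the simulated cohort = {true_status.mean():.3f}")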

  14. Reference interval computation: which method (not) to choose?

    PubMed

    Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C

    2012-07-11

    When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If there are no reliable RI data available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using parametric calculations to determine RIs. The transformed parametric method, utilizing the Box-Cox transformation, would be the preferable way to calculate RIs, provided the transformed data pass a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
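
    The comparison reported here is easy to reproduce in outline: compute a parametric reference interval on Box-Cox-transformed values (only when the transformed data pass a normality test) and a nonparametric one from sample percentiles, with bootstrap confidence limits around the percentile estimates. The scipy-based sketch below uses toy skewed data and conventional 95%/90% settings, which are assumptions rather than the paper's exact protocol.

      # Reference intervals (sketch): Box-Cox-transformed parametric limits versus
      # nonparametric percentile limits with bootstrap CIs. Toy skewed data.
      import numpy as np
      from scipy import stats, special

      rng = np.random.default_rng(0)
      x = rng.lognormal(mean=1.0, sigma=0.4, size=120)      # toy analyte values

      # Box-Cox parametric RI: normal limits on the transformed scale,
      # back-transformed to original units (only if normality holds).
      xt, lam = stats.boxcox(x)
      if stats.shapiro(xt).pvalue > 0.05:
          lims_t = xt.mean() + np.array([-1.96, 1.96]) * xt.std(ddof=1)
          print("Box-Cox parametric RI:", special.inv_boxcox(lims_t, lam).round(2))
      else:
          print("transformed data still non-normal; parametric RI not reported")

      # Nonparametric bootstrap: percentile limits plus bootstrap CIs around them.
      B = 2000
      lims = np.array([np.percentile(rng.choice(x, x.size, replace=True), [2.5, 97.5])
                       for _ in range(B)])
      print("nonparametric RI:", np.percentile(x, [2.5, 97.5]).round(2))
      print("90% CI of lower limit:", np.percentile(lims[:, 0], [5, 95]).round(2))
      print("90% CI of upper limit:", np.percentile(lims[:, 1], [5, 95]).round(2))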

  15. The sound symbolism bootstrapping hypothesis for language acquisition and language evolution

    PubMed Central

    Imai, Mutsumi; Kita, Sotaro

    2014-01-01

    Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666

  16. Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.

    PubMed

    Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E

    2016-12-20

    Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
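
    A toy sketch of the tree bootstrap described above: seeds are resampled with replacement, and each node's recruits are then resampled with replacement down the tree. The tiny recruitment forest and attribute values are invented for illustration only.

```python
# Sketch of the tree bootstrap: resample seeds, then recursively resample each
# node's recruits, preserving the recruitment-tree structure. Data are invented.
import numpy as np

rng = np.random.default_rng(2)
children = {                      # node -> list of people it recruited
    "s1": ["a", "b"], "a": ["c"], "b": [], "c": [],
    "s2": ["d"], "d": ["e", "f"], "e": [], "f": [],
}
seeds = ["s1", "s2"]
attribute = {"s1": 0, "a": 1, "b": 0, "c": 1, "s2": 0, "d": 0, "e": 1, "f": 0}

def resample_tree(node):
    """Return one bootstrap copy (a multiset of nodes) of the subtree at `node`."""
    out = [node]
    kids = children[node]
    if kids:
        for k in rng.choice(kids, size=len(kids), replace=True):
            out.extend(resample_tree(k))
    return out

B = 1000
est = np.empty(B)
for b in range(B):
    sample = []
    for s in rng.choice(seeds, size=len(seeds), replace=True):
        sample.extend(resample_tree(s))
    est[b] = np.mean([attribute[p] for p in sample])    # naive prevalence estimate

print("bootstrap 95% interval for prevalence:", np.percentile(est, [2.5, 97.5]))
```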

  17. A bootstrap lunar base: Preliminary design review 2

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.

  18. Spheres, charges, instantons, and bootstrap: A five-dimensional odyssey

    NASA Astrophysics Data System (ADS)

    Chang, Chi-Ming; Fluder, Martin; Lin, Ying-Hsuan; Wang, Yifan

    2018-03-01

    We combine supersymmetric localization and the conformal bootstrap to study five-dimensional superconformal field theories. To begin, we classify the admissible counter-terms and derive a general relation between the five-sphere partition function and the conformal and flavor central charges. Along the way, we discover a new superconformal anomaly in five dimensions. We then propose a precise triple factorization formula for the five-sphere partition function, that incorporates instantons and is consistent with flavor symmetry enhancement. We numerically evaluate the central charges for the rank-one Seiberg and Morrison-Seiberg theories, and find strong evidence for their saturation of bootstrap bounds, thereby determining the spectra of long multiplets in these theories. Lastly, our results provide new evidence for the F-theorem and possibly a C-theorem in five-dimensional superconformal theories.

  19. Using a Nonparametric Bootstrap to Obtain a Confidence Interval for Pearson's "r" with Cluster Randomized Data: A Case Study

    ERIC Educational Resources Information Center

    Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio

    2009-01-01

    A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
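
    The note below sketches the general idea rather than the authors' analysis: resample whole clusters with replacement, recompute Pearson's r on each pseudo-sample, and take percentile limits. The cluster structure and variables are simulated.

```python
# Sketch: percentile bootstrap CI for Pearson's r that respects clustering by
# resampling whole clusters (e.g., classrooms) with replacement. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_clusters, n_per = 30, 20
cluster_effect = rng.normal(0, 0.5, n_clusters)
cluster_id = np.repeat(np.arange(n_clusters), n_per)
expectancies = rng.normal(cluster_effect[cluster_id], 1.0)
intentions = 0.3 * expectancies + rng.normal(cluster_effect[cluster_id], 1.0)

B = 2000
r_boot = np.empty(B)
for b in range(B):
    picked = rng.integers(0, n_clusters, n_clusters)      # clusters, with replacement
    idx = np.concatenate([np.where(cluster_id == c)[0] for c in picked])
    r_boot[b], _ = stats.pearsonr(expectancies[idx], intentions[idx])

r_obs, _ = stats.pearsonr(expectancies, intentions)
print(f"r = {r_obs:.3f}, 95% percentile CI {np.percentile(r_boot, [2.5, 97.5])}")
```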

  20. Bootstrapping a five-loop amplitude using Steinmann relations

    DOE PAGES

    Caron-Huot, Simon; Dixon, Lance J.; McLeod, Andrew; ...

    2016-12-05

    Here, the analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.

  1. A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons

    DTIC Science & Technology

    2001-07-01

    parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. 2 Data...experimental methods, used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric ...triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for

  2. On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit

    ERIC Educational Resources Information Center

    Savalei, Victoria; Yuan, Ke-Hai

    2009-01-01

    Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…

  3. A Comparison of Kernel Equating and Traditional Equipercentile Equating Methods and the Parametric Bootstrap Methods for Estimating Standard Errors in Equipercentile Equating

    ERIC Educational Resources Information Center

    Choi, Sae Il

    2009-01-01

    This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…

  4. Measuring Efficiency of Tunisian Schools in the Presence of Quasi-Fixed Inputs: A Bootstrap Data Envelopment Analysis Approach

    ERIC Educational Resources Information Center

    Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane

    2010-01-01

    The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…

  5. BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.

    PubMed

    Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter

    2013-02-01

    Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Comparison of Bootstrapping and Markov Chain Monte Carlo for Copula Analysis of Hydrological Droughts

    NASA Astrophysics Data System (ADS)

    Yang, P.; Ng, T. L.; Yang, W.

    2015-12-01

    Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
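
    A simplified sketch of the bootstrap branch of such a comparison, assuming a Clayton copula and estimating its parameter by Kendall's-tau inversion rather than full maximum likelihood; the drought duration/severity pairs are simulated and no MCMC counterpart is shown.

```python
# Sketch: percentile-bootstrap CI for a Clayton copula parameter linking drought
# duration and severity, using the tau-inversion estimator theta = 2*tau/(1-tau).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100
# crude positively dependent pair standing in for (duration, severity)
z = rng.normal(size=(n, 2)) @ np.linalg.cholesky([[1.0, 0.6], [0.6, 1.0]]).T
duration = np.exp(z[:, 0])
severity = np.exp(z[:, 1])

def clayton_theta(u, v):
    tau, _ = stats.kendalltau(u, v)
    return 2 * tau / (1 - tau)

B = 1000
theta_boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)                      # resample drought events
    theta_boot[b] = clayton_theta(duration[idx], severity[idx])

print(f"theta_hat = {clayton_theta(duration, severity):.2f}, "
      f"95% CI {np.percentile(theta_boot, [2.5, 97.5])}")
```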

  7. Reliability of dose volume constraint inference from clinical data.

    PubMed

    Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M

    2017-04-21

    Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort sizes 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.

  8. Reliability of dose volume constraint inference from clinical data

    NASA Astrophysics Data System (ADS)

    Lutz, C. M.; Møller, D. S.; Hoffmann, L.; Knap, M. M.; Alber, M.

    2017-04-01

    Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an ‘ideal’ cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a ‘non-ideal’ cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates  >85 % were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
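
    A schematic of the bootstrap arm of such an experiment, assuming univariate logistic models over a handful of candidate dose-volume levels and recording which candidate is most predictive in each resampled cohort; the dose metrics, effect size, and use of statsmodels are illustrative assumptions, not the study's setup.

```python
# Sketch: bootstrap inference-frequency distribution for a "most predictive" DVH point.
# All data and the candidate set are simulated, not the study's cohort.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, levels = 200, [10, 20, 30, 40]                 # candidate Vx levels (%)
# simulated dose metric per patient at each candidate level, correlated across levels
base = rng.uniform(5, 60, n)
dose = {v: base * (1 - v / 120) + rng.normal(0, 3, n) for v in levels}
logit_p = -4 + 0.08 * dose[20]                    # postulate the 20% level as the driver
complication = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

def best_level(idx):
    # log-likelihood of a univariate logistic model for each candidate level
    ll = {v: sm.Logit(complication[idx],
                      sm.add_constant(dose[v][idx])).fit(disp=0).llf
          for v in levels}
    return max(ll, key=ll.get)

B = 300
picks = [best_level(rng.integers(0, n, n)) for _ in range(B)]
print("inference frequency per candidate DVH point:",
      {v: picks.count(v) / B for v in levels})
```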

  9. Statistical inference based on the nonparametric maximum likelihood estimator under double-truncation.

    PubMed

    Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi

    2015-07-01

    Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE), which is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.

  10. Model specification and bootstrapping for multiply imputed data: An application to count models for the frequency of alcohol use

    PubMed Central

    Comulada, W. Scott

    2015-01-01

    Stata’s mi commands provide powerful tools to conduct multiple imputation in the presence of ignorable missing data. In this article, I present Stata code to extend the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, mi commands are restricted to covariate selection. I show how to address model fit to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors. I show how standard errors can be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts for the number of times that alcohol was consumed in data with missing observations from a behavioral intervention. PMID:26973439

  11. Heptagons from the Steinmann cluster bootstrap

    DOE PAGES

    Dixon, Lance J.; Drummond, James; Harrington, Thomas; ...

    2017-02-28

    We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar $$\\mathcal{N}$$ = 4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal $$\\bar{Q}$$ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.

  12. Kepler Planet Detection Metrics: Statistical Bootstrap Test

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Burke, Christopher J.

    2016-01-01

    This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1–Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1–Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1–Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.

  13. Imaging with New Classic and Vision at the NPOI

    NASA Astrophysics Data System (ADS)

    Jorgensen, Anders

    2018-04-01

    The Navy Precision Optical Interferometer (NPOI) is unique among interferometric observatories for its ability to position telescopes in an equally-spaced array configuration. This configuration is optimal for interferometric imaging because it allows the use of bootstrapping to track fringes on long baselines with signal-to-noise ratio less than one. When combined with coherent integration techniques this can produce visibilities with acceptable SNR on baselines long enough to resolve features on the surfaces of stars. The stellar surface imaging project at NPOI combines the bootstrapping array configuration of the NPOI array, real-time fringe tracking, baseline- and wavelength bootstrapping with Earth rotation to provide dense coverage in the UV plane at a wide range of spatial frequencies. In this presentation, we provide an overview of the project and an update of the latest status and results from the project.

  14. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup

  15. Sample Reuse in Statistical Remodeling.

    DTIC Science & Technology

    1987-08-01

    as the jackknife and bootstrap, is an expansion of the functional, T(Fn), or of its distribution function or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report Frangos and Schucany demonstrated the small sample superiority of that approach over the proposals that take...higher order terms of an Edgeworth expansion into account. In a second report Frangos and Schucany (1987b) examined the small sample performance of

  16. Bootstrapping Development of a Cloud-Based Spoken Dialog System in the Educational Domain from Scratch Using Crowdsourced Data. Research Report. ETS RR-16-16

    ERIC Educational Resources Information Center

    Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao

    2016-01-01

    We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…

  17. The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.

    PubMed

    Imai, Mutsumi; Kita, Sotaro

    2014-09-19

    Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  18. Advancing theory development: exploring the leadership-climate relationship as a mechanism of the implementation of cultural competence.

    PubMed

    Guerrero, Erick G; Fenwick, Karissa; Kong, Yinfei

    2017-11-14

    Leadership style and specific organizational climates have emerged as critical mechanisms to implement targeted practices in organizations. Drawing from relevant theories, we propose that climate for implementation of cultural competence reflects how transformational leadership may enhance the organizational implementation of culturally responsive practices in health care organizations. Using multilevel data from 427 employees embedded in 112 addiction treatment programs collected in 2013, confirmatory factor analysis showed adequate fit statistics for our measure of climate for implementation of cultural competence (Cronbach's alpha = .88) and three outcomes: knowledge (Cronbach's alpha = .88), services (Cronbach's alpha = .86), and personnel (Cronbach's alpha = .86) practices. Results from multilevel path analyses indicate a positive relationship between employee perceptions of transformational leadership and climate for implementation of cultural competence (standardized indirect effect = .057, bootstrap p < .001). We also found a positive indirect effect between transformational leadership and each of the culturally competent practices: knowledge (standardized indirect effect = .006, bootstrap p = .004), services (standardized indirect effect = .019, bootstrap p < .001), and personnel (standardized indirect effect = .014, bootstrap p = .005). Findings contribute to implementation science. They build on leadership theory and offer evidence of the mediating role of climate in the implementation of cultural competence in addiction health service organizations.
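
    A minimal sketch of the kind of percentile-bootstrap test of an indirect effect reported above, with simulated stand-ins for leadership, climate, and practice scores and ordinary least squares in place of the multilevel path model.

```python
# Sketch: percentile-bootstrap test of an indirect (mediated) effect a*b.
# X -> mediator (a path); mediator -> Y controlling for X (b path). Simulated data.
import numpy as np

rng = np.random.default_rng(6)
n = 400
leadership = rng.normal(size=n)                                     # X
climate = 0.4 * leadership + rng.normal(size=n)                     # mediator
practices = 0.5 * climate + 0.1 * leadership + rng.normal(size=n)   # Y

def indirect(idx):
    X, M, Y = leadership[idx], climate[idx], practices[idx]
    a = np.polyfit(X, M, 1)[0]                        # slope of M on X
    # slope of Y on M controlling for X (multiple regression via lstsq)
    design = np.column_stack([np.ones_like(X), X, M])
    b = np.linalg.lstsq(design, Y, rcond=None)[0][2]
    return a * b

B = 5000
boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(B)])
ci = np.percentile(boot, [2.5, 97.5])
p = 2 * min((boot <= 0).mean(), (boot >= 0).mean())   # crude two-sided bootstrap p
print(f"indirect effect = {indirect(np.arange(n)):.3f}, 95% CI {ci}, p ~ {p:.4f}")
```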

  19. Non-parametric methods for cost-effectiveness analysis: the central limit theorem and the bootstrap compared.

    PubMed

    Nixon, Richard M; Wonderling, David; Grieve, Richard D

    2010-03-01

    Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
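
    The sketch below contrasts the two approaches on simulated, skewed trial data: a CLT-based standard error for the incremental net benefit versus a bootstrap standard error; the cost and QALY distributions are assumptions for illustration.

```python
# Sketch: CLT vs bootstrap standard errors for an incremental net benefit (INB).
import numpy as np

rng = np.random.default_rng(7)
lam = 20000                                           # willingness to pay per QALY
n = 60
cost = {0: rng.lognormal(8.0, 1.0, n), 1: rng.lognormal(8.1, 1.0, n)}
qaly = {0: rng.normal(0.70, 0.2, n), 1: rng.normal(0.75, 0.2, n)}
nb = {g: lam * qaly[g] - cost[g] for g in (0, 1)}     # net benefit per patient

inb = nb[1].mean() - nb[0].mean()
# CLT: variance of a difference of independent means
se_clt = np.sqrt(nb[1].var(ddof=1) / n + nb[0].var(ddof=1) / n)

B = 2000
boot = np.array([rng.choice(nb[1], n).mean() - rng.choice(nb[0], n).mean()
                 for _ in range(B)])
print(f"INB = {inb:.0f}, CLT SE = {se_clt:.0f}, bootstrap SE = {boot.std(ddof=1):.0f}")
```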

  20. Smokers’ sensory beliefs mediate the relation between smoking a ‘light/low tar’ cigarette and perceptions of harm

    PubMed Central

    Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan

    2016-01-01

    Background The sensory belief that ‘light/low tar’ cigarettes are smoother can also influence the belief that ‘light/low tar’ cigarettes are less harmful. However, the ‘light’ concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one’s own brand of cigarettes on perceptions of harm. Objective The current study examines whether a smoker’s sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a ‘light/low tar’ cigarette and relative perceptions of harm among smokers in China. Methods Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Results Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of ‘light/low tar’ cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. Conclusions These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker’s natural associations between smoother sensations and lowered perceptions of harm. PMID:25370698

  1. Smokers' sensory beliefs mediate the relation between smoking a light/low tar cigarette and perceptions of harm.

    PubMed

    Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan

    2015-11-01

    The sensory belief that 'light/low tar' cigarettes are smoother can also influence the belief that 'light/low tar' cigarettes are less harmful. However, the 'light' concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one's own brand of cigarettes on perceptions of harm. The current study examines whether a smoker's sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a 'light/low tar' cigarette and relative perceptions of harm among smokers in China. Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in Wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of 'light/low tar' cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker's natural associations between smoother sensations and lowered perceptions of harm. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. Maternal depression and trait anger as risk factors for escalated physical discipline.

    PubMed

    Shay, Nicole L; Knutson, John F

    2008-02-01

    To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory-Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline.

  3. Some Aspects of Advanced Tokamak Modeling in DIII-D

    NASA Astrophysics Data System (ADS)

    St John, H. E.; Petty, C. C.; Murakami, M.; Kinsey, J. E.

    2000-10-01

    We extend previous work (M. Murakami et al., General Atomics Report GA-A23310 (1999)) on time-dependent DIII-D advanced tokamak simulations by introducing theoretical confinement models rather than relying on power-balance-derived transport coefficients. We explore using NBCD and off-axis ECCD, together with a self-consistent, aligned bootstrap current driven by the internal transport barrier dynamics generated with the GLF23 confinement model, to shape the hollow current profile and to maintain MHD-stable conditions. Our theoretical modeling approach uses measured DIII-D initial conditions to start the simulations in a smooth, consistent manner. This mitigates the troublesome long-lived perturbations in the ohmic current profile that are normally caused by inconsistent initial data. To achieve this goal, our simulation uses a sequence of time-dependent eqdsks generated autonomously by the EFIT MHD equilibrium code in analyzing experimental data to supply the history for the simulation.

  4. BOOTSTRAPPING THE CORONAL MAGNETIC FIELD WITH STEREO: UNIPOLAR POTENTIAL FIELD MODELING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aschwanden, Markus J.; Sandman, Anne W., E-mail: aschwanden@lmsal.co

    We investigate the recently quantified misalignment of α_mis ≈ 20°-40° between the three-dimensional geometry of stereoscopically triangulated coronal loops observed with STEREO/EUVI (in four active regions (ARs)) and theoretical (potential or nonlinear force-free) magnetic field models extrapolated from photospheric magnetograms. We develop an efficient method of bootstrapping the coronal magnetic field by forward fitting a parameterized potential field model to the STEREO-observed loops. The potential field model consists of a number of unipolar magnetic charges that are parameterized by decomposing a photospheric magnetogram from the Michelson Doppler Imager. The forward-fitting method yields a best-fit magnetic field model with a reduced misalignment of α_PF ≈ 13°-20°. We also evaluate stereoscopic measurement errors and find a contribution of α_SE ≈ 7°-12°, which constrains the residual misalignment to α_NP ≈ 11°-17°, which is likely due to the nonpotentiality of the ARs. The residual misalignment angle, α_NP, of the potential field due to nonpotentiality is found to correlate with the soft X-ray flux of the AR, which implies a relationship between electric currents and plasma heating.

  5. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies.

    PubMed

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-11-01

    Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
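
    A sketch of bootstrapping the standard error of a linear TSRI estimate, re-running both stages inside every replicate; the simulated genotype, exposure, and outcome (and the ordinary least-squares stages) are illustrative assumptions, not the authors' implementation.

```python
# Sketch: bootstrap SE for a linear two-stage residual inclusion (TSRI) estimator.
# Stage 1: exposure on instrument; Stage 2: outcome on exposure + stage-1 residuals.
import numpy as np

rng = np.random.default_rng(8)
n = 2000
g = rng.binomial(2, 0.3, n).astype(float)          # instrument (genotype 0/1/2)
u = rng.normal(size=n)                             # unmeasured confounder
x = 0.5 * g + u + rng.normal(size=n)               # exposure
y = 0.3 * x + u + rng.normal(size=n)               # outcome; true causal effect 0.3

def tsri(idx):
    G, X, Y = g[idx], x[idx], y[idx]
    d1 = np.column_stack([np.ones_like(G), G])
    resid = X - d1 @ np.linalg.lstsq(d1, X, rcond=None)[0]
    d2 = np.column_stack([np.ones_like(G), X, resid])
    return np.linalg.lstsq(d2, Y, rcond=None)[0][1]   # coefficient on exposure

B = 1000
boot = np.array([tsri(rng.integers(0, n, n)) for _ in range(B)])
print(f"TSRI estimate = {tsri(np.arange(n)):.3f}, bootstrap SE = {boot.std(ddof=1):.3f}")
```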

  6. Parametric methods outperformed non-parametric methods in comparisons of discrete numerical variables.

    PubMed

    Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter

    2011-04-13

    The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
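
    For concreteness, the sketch below contrasts two of the interval methods compared in the paper on simulated count data with outcomes {0,...,5}: a Welch-type confidence interval for the difference in means and a percentile-bootstrap interval; the outcome probabilities are invented.

```python
# Sketch: Welch-type CI vs percentile-bootstrap CI for a difference in means
# of bounded event counts. Outcome distributions are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
a = rng.choice(np.arange(6), size=30, p=[.35, .25, .15, .10, .10, .05])
b = rng.choice(np.arange(6), size=30, p=[.20, .20, .20, .15, .15, .10])
diff = a.mean() - b.mean()

# Welch CI: unequal-variance SE with Welch-Satterthwaite degrees of freedom
se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
df = se**4 / ((a.var(ddof=1) / a.size) ** 2 / (a.size - 1)
              + (b.var(ddof=1) / b.size) ** 2 / (b.size - 1))
tcrit = stats.t.ppf(0.975, df)
welch_ci = (diff - tcrit * se, diff + tcrit * se)

# Percentile-bootstrap CI for the same difference
B = 5000
boot = np.array([rng.choice(a, a.size).mean() - rng.choice(b, b.size).mean()
                 for _ in range(B)])
print("Welch CI:", np.round(welch_ci, 2),
      " bootstrap CI:", np.round(np.percentile(boot, [2.5, 97.5]), 2))
```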

  7. Current transport mechanism in graphene/AlGaN/GaN heterostructures with various Al mole fractions

    NASA Astrophysics Data System (ADS)

    Pandit, Bhishma; Seo, Tae Hoon; Ryu, Beo Deul; Cho, Jaehee

    2016-06-01

    The current transport mechanism of graphene formed on AlxGa1-xN/GaN heterostructures with various Al mole fractions (x = 0.15, 0.20, 0.30, and 0.40) is investigated. The current-voltage measurement from graphene to AlGaN/GaN shows an excellent rectifying property. The extracted Schottky barrier height of the graphene/AlGaN/GaN contacts increases with the Al mole fraction in AlGaN. However, the current transport mechanism deviates from the Schottky-Mott theory owing to the deterioration of AlGaN crystal quality at high Al mole fractions confirmed by reverse leakage current measurement.

  8. Dual energy X-ray absorptiometry spine scans to determine abdominal fat in postmenopausal women.

    PubMed

    Bea, J W; Blew, R M; Going, S B; Hsu, C-H; Lee, M C; Lee, V R; Caan, B J; Kwan, M L; Lohman, T G

    2016-11-01

    Body composition may be a better predictor of chronic disease risk than body mass index (BMI) in older populations. We sought to validate spine fat fraction (%) from dual energy X-ray absorptiometry (DXA) spine scans as a proxy for total abdominal fat. Total body DXA scan abdominal fat regions of interest (ROI) that have been previously validated by magnetic resonance imaging were assessed among healthy, postmenopausal women who also had antero-posterior spine scans (n = 103). ROIs were (1) lumbar vertebrae L2-L4 and (2) L2-Iliac Crest (L2-IC), manually selected by two independent raters, and (3) trunk, auto-selected by DXA software. Intra-class correlation coefficients evaluated intra- and inter-rater reliability on a random subset (N = 25). Linear regression models, validated by bootstrapping, assessed the relationship between spine fat fraction (%) and total abdominal fat (%) ROIs. Mean age, BMI, and total body fat were 66.1 ± 4.8 y, 25.8 ± 3.8 kg/m², and 40.0 ± 6.6%, respectively. There were no significant differences within or between raters. Linear regression models adjusted for several participant and scan characteristics were equivalent to using only BMI and spine fat fraction. The model predicted L2-L4 (Adj. R²: 0.83) and L2-IC (Adj. R²: 0.84) abdominal fat (%) well; the adjusted R² for trunk fat (%) was 0.78. Model validation demonstrated minimal over-fitting (Adj. R²: 0.82, 0.83, and 0.77 for L2-L4, L2-IC, and trunk fat, respectively). The strong correlation between spine fat fraction and DXA abdominal fat measures makes it suitable for further development in postmenopausal chronic disease risk prediction models. Am. J. Hum. Biol. 28:918-926, 2016. © 2016 Wiley Periodicals, Inc.

  9. The effect of sheared toroidal rotation on pressure driven magnetic islands in toroidal plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hegna, C. C.

    2016-05-15

    The impact of sheared toroidal rotation on the evolution of pressure driven magnetic islands in tokamak plasmas is investigated using a resistive magnetohydrodynamics model augmented by a neoclassical Ohm's law. Particular attention is paid to the asymptotic matching data as the Mercier indices are altered in the presence of sheared flow. Analysis of the nonlinear island Grad-Shafranov equation shows that sheared flows tend to amplify the stabilizing pressure/curvature contribution to pressure driven islands in toroidal tokamaks relative to the island bootstrap current contribution. As such, sheared toroidal rotation tends to reduce saturated magnetic island widths.

  10. Cloud fraction at the ARM SGP site: Reducing uncertainty with self-organizing maps

    DOE PAGES

    Kennedy, Aaron D.; Dong, Xiquan; Xi, Baike

    2015-02-15

    Instrument downtime leads to uncertainty in the monthly and annual record of cloud fraction (CF), making it difficult to perform time series analyses of cloud properties and perform detailed evaluations of model simulations. As cloud occurrence is partially controlled by the large-scale atmospheric environment, this knowledge is used to reduce uncertainties in the instrument record. Synoptic patterns diagnosed from the North American Regional Reanalysis (NARR) during the period 1997–2010 are classified using a competitive neural network known as the self-organizing map (SOM). The classified synoptic states are then compared to the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) instrument record to determine the expected CF. A number of SOMs are tested to understand how the number of classes and the period of classifications impact the relationship between classified states and CFs. Bootstrapping is utilized to quantify the uncertainty of the instrument record when statistical information from the SOM is included. Although all SOMs significantly reduce the uncertainty of the CF record calculated in Kennedy et al. (Theor Appl Climatol 115:91–105, 2014), SOMs with a large number of classes and separated by month are required to produce the lowest uncertainty and best agreement with the annual cycle of CF. Lastly, this result may be due to a manifestation of seasonally dependent biases in NARR.

  11. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-03-16

    of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web ...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data

  12. Reduced Power Laser Designation Systems

    DTIC Science & Technology

    2008-06-20

    200KD, Ri = = 60Kfl, and R 2 = R4 = 2K yields an overall transimpedance gain of 200K x 30 x 30 = 180MV/A. Figure 3. Three stage photodiode amplifier ...transistor circuit for bootstrap buffering of the input stage, comparing the noise performance of the candidate amplifier designs, selecting the two...transistor bootstrap design as the circuit of choice, and comparing the performance of this circuit against that of a basic transconductance amplifier

  13. Benchmarking the efficiency of the Chilean water and sewerage companies: a double-bootstrap approach.

    PubMed

    Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés

    2018-03-01

    Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow environmental factors influencing efficiency scores to be identified. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results showed that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.

  14. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
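
    A bare-bones sketch of the idea: instead of sampling model inputs from assumed theoretical distributions, each Monte Carlo draw bootstraps the observed patient-level costs and eradication outcomes; the two-regimen data below are invented for illustration.

```python
# Sketch: probabilistic sensitivity analysis driven by bootstrapped patient-level
# data rather than assumed theoretical input distributions. Data are invented.
import numpy as np

rng = np.random.default_rng(10)
n = 150
cost = {"A": rng.gamma(4, 60, n), "B": rng.gamma(4, 75, n)}        # per-patient costs
cured = {"A": rng.binomial(1, 0.80, n), "B": rng.binomial(1, 0.88, n)}

B = 5000
d_cost = np.empty(B)
d_eff = np.empty(B)
for b in range(B):
    d_cost[b] = rng.choice(cost["B"], n).mean() - rng.choice(cost["A"], n).mean()
    d_eff[b] = rng.choice(cured["B"], n).mean() - rng.choice(cured["A"], n).mean()

wtp = 500                                       # $ per additional eradication
print("mean incremental cost:", round(d_cost.mean(), 1),
      " mean incremental eradications:", round(d_eff.mean(), 3))
print("P(strategy B cost-effective at $500):", np.mean(wtp * d_eff - d_cost > 0))
```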

  15. Bootstrapping the energy flow in the beginning of life.

    PubMed

    Hengeveld, R; Fedonkin, M A

    2007-01-01

    This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in the development of computers, the first generation making possible the calculations necessary for constructing the second one, etc. In the biogenetic upstart of an energy flow, non-metals in the lower periods of the Periodic Table of Elements would have constituted the most primitive systems, their operation being enhanced and later supplanted by elements in the higher periods that demand more energy. This bootstrapping process would put the development of the metabolisms based on the second period elements carbon, nitrogen and oxygen at the end of the evolutionary process rather than at, or even before, the biogenetic event.

  16. Phylogenetic analysis of 47 chloroplast genomes clarifies the contribution of wild species to the domesticated apple maternal line.

    PubMed

    Nikiforova, Svetlana V; Cavalieri, Duccio; Velasco, Riccardo; Goremykin, Vadim

    2013-08-01

    Both the origin of domesticated apple and the overall phylogeny of the genus Malus are still not completely resolved. Having this as a target, we built a 134,553-position-long alignment including two previously published chloroplast DNAs (cpDNAs) and 45 de novo sequenced, fully colinear chloroplast genomes from cultivated apple varieties and wild apple species. The data produced are free from compositional heterogeneity and from substitutional saturation, which can adversely affect phylogeny reconstruction. Phylogenetic analyses based on this alignment recovered a branch, having the maximum bootstrap support, subtending a large group of the cultivated apple sorts together with all analyzed European wild apple (Malus sylvestris) accessions. One apple cultivar was embedded in a monophylum comprising wild M. sieversii accessions and other Asian apple species. The data demonstrate that M. sylvestris has contributed chloroplast genome to a substantial fraction of domesticated apple varieties, supporting the conclusion that different wild species should have contributed the organelle and nuclear genomes to the domesticated apple.

  17. Neonatal Maturation of Paracetamol (Acetaminophen) Glucuronidation, Sulfation, and Oxidation Based on a Parent-Metabolite Population Pharmacokinetic Model

    PubMed Central

    Cook, Sarah F.; Stockmann, Chris; Samiee-Zafarghandy, Samira; King, Amber D.; Deutsch, Nina; Williams, Elaine F.; Wilkins, Diana G.; van den Anker, John N.

    2017-01-01

    Objectives This study aimed to model the population pharmacokinetics of intravenous paracetamol and its major metabolites in neonates and to identify influential patient characteristics, especially those affecting the formation clearance (CLformation) of oxidative pathway metabolites. Methods Neonates with a clinical indication for intravenous analgesia received five 15-mg/kg doses of paracetamol at 12-h intervals (<28 weeks’ gestation) or seven 15-mg/kg doses at 8-h intervals (≥28 weeks’ gestation). Plasma and urine were sampled throughout the 72-h study period. Concentration-time data for paracetamol, paracetamol-glucuronide, paracetamol-sulfate, and the combined oxidative pathway metabolites (paracetamol-cysteine and paracetamol-N-acetylcysteine) were simultaneously modeled in NONMEM 7.2. Results The model incorporated 259 plasma and 350 urine samples from 35 neonates with a mean gestational age of 33.6 weeks (standard deviation 6.6). CLformation for all metabolites increased with weight; CLformation for glucuronidation and oxidation also increased with postnatal age. At the mean weight (2.3 kg) and postnatal age (7.5 days), CLformation estimates (bootstrap 95% confidence interval; between-subject variability) were 0.049 L/h (0.038–0.062; 62 %) for glucuronidation, 0.21 L/h (0.17–0.24; 33 %) for sulfation, and 0.058 L/h (0.044–0.078; 72 %) for oxidation. Expression of individual oxidation CLformation as a fraction of total individual paracetamol clearance showed that, on average, fractional oxidation CLformation increased <15 % when plotted against weight or postnatal age. Conclusions The parent-metabolite model successfully characterized the pharmacokinetics of intravenous paracetamol and its metabolites in neonates. Maturational changes in the fraction of paracetamol undergoing oxidation were small relative to between-subject variability. PMID:27209292

  18. Neonatal Maturation of Paracetamol (Acetaminophen) Glucuronidation, Sulfation, and Oxidation Based on a Parent-Metabolite Population Pharmacokinetic Model.

    PubMed

    Cook, Sarah F; Stockmann, Chris; Samiee-Zafarghandy, Samira; King, Amber D; Deutsch, Nina; Williams, Elaine F; Wilkins, Diana G; Sherwin, Catherine M T; van den Anker, John N

    2016-11-01

    This study aimed to model the population pharmacokinetics of intravenous paracetamol and its major metabolites in neonates and to identify influential patient characteristics, especially those affecting the formation clearance (CLformation) of oxidative pathway metabolites. Neonates with a clinical indication for intravenous analgesia received five 15-mg/kg doses of paracetamol at 12-h intervals (<28 weeks' gestation) or seven 15-mg/kg doses at 8-h intervals (≥28 weeks' gestation). Plasma and urine were sampled throughout the 72-h study period. Concentration-time data for paracetamol, paracetamol-glucuronide, paracetamol-sulfate, and the combined oxidative pathway metabolites (paracetamol-cysteine and paracetamol-N-acetylcysteine) were simultaneously modeled in NONMEM 7.2. The model incorporated 259 plasma and 350 urine samples from 35 neonates with a mean gestational age of 33.6 weeks (standard deviation 6.6). CLformation for all metabolites increased with weight; CLformation for glucuronidation and oxidation also increased with postnatal age. At the mean weight (2.3 kg) and postnatal age (7.5 days), CLformation estimates (bootstrap 95% confidence interval; between-subject variability) were 0.049 L/h (0.038-0.062; 62 %) for glucuronidation, 0.21 L/h (0.17-0.24; 33 %) for sulfation, and 0.058 L/h (0.044-0.078; 72 %) for oxidation. Expression of individual oxidation CLformation as a fraction of total individual paracetamol clearance showed that, on average, fractional oxidation CLformation increased <15 % when plotted against weight or postnatal age. The parent-metabolite model successfully characterized the pharmacokinetics of intravenous paracetamol and its metabolites in neonates. Maturational changes in the fraction of paracetamol undergoing oxidation were small relative to between-subject variability.

  19. Seasonal comparisons of sea ice concentration estimates derived from SSM/I, OKEAN, and RADARSAT data

    USGS Publications Warehouse

    Belchansky, Gennady I.; Douglas, David C.

    2002-01-01

    The Special Sensor Microwave Imager (SSM/I) microwave satellite radiometer and its predecessor SMMR are primary sources of information for global sea ice and climate studies. However, comparisons of SSM/I, Landsat, AVHRR, and ERS-1 synthetic aperture radar (SAR) have shown substantial seasonal and regional differences in their estimates of sea ice concentration. To evaluate these differences, we compared SSM/I estimates of sea ice coverage derived with the NASA Team and Bootstrap algorithms to estimates made using RADARSAT, and OKEAN-01 satellite sensor data. The study area included the Barents Sea, Kara Sea, Laptev Sea, and adjacent parts of the Arctic Ocean, during October 1995 through October 1999. Ice concentration estimates from spatially and temporally near-coincident imagery were calculated using independent algorithms for each sensor type. The OKEAN algorithm implemented the satellite's two-channel active (radar) and passive microwave data in a linear mixture model based on the measured values of brightness temperature and radar backscatter. The RADARSAT algorithm utilized a segmentation approach of the measured radar backscatter, and the SSM/I ice concentrations were derived at National Snow and Ice Data Center (NSIDC) using the NASA Team and Bootstrap algorithms. Seasonal and monthly differences between SSM/I, OKEAN, and RADARSAT ice concentrations were calculated and compared. Overall, total sea ice concentration estimates derived independently from near-coincident RADARSAT, OKEAN-01, and SSM/I satellite imagery demonstrated mean differences of less than 5.5% (S.D.<9.5%) during the winter period. Differences between the SSM/I NASA Team and the SSM/I Bootstrap concentrations were no more than 3.1% (S.D.<5.4%) during this period. RADARSAT and OKEAN-01 data both yielded higher total ice concentrations than the NASA Team and the Bootstrap algorithms. The Bootstrap algorithm yielded higher total ice concentrations than the NASA Team algorithm. Total ice concentrations derived from OKEAN-01 and SSM/I satellite imagery were highly correlated during winter, spring, and fall, with mean differences of less than 8.1% (S.D.<15%) for the NASA Team algorithm, and less than 2.8% (S.D.<13.8%) for the Bootstrap algorithm. Respective differences between SSM/I NASA Team and SSM/I Bootstrap total concentrations were less than 5.3% (S.D.<6.9%). Monthly mean differences between SSM/I and OKEAN differed annually by less than 6%, with smaller differences primarily in winter. The NASA Team and Bootstrap algorithms underestimated the total sea ice concentrations relative to the RADARSAT ScanSAR no more than 3.0% (S.D.<9%) and 1.2% (S.D.<7.5%) during cold months, and no more than 12% and 7% during summer, respectively. ScanSAR tended to estimate higher ice concentrations for ice concentrations greater than 50%, when compared to SSM/I during all months. ScanSAR underestimated total sea ice concentration by 2% compared to the OKEAN-01 algorithm during cold months, and gave an overestimation by 2% during spring and summer months. Total NASA Team and Bootstrap sea ice concentration estimates derived from coincident SSM/I and OKEAN-01 data demonstrated mean differences of no more than 5.3% (S.D.<7%), 3.1% (S.D.<5.5%), 2.0% (S.D.<5.5%), and 7.3% (S.D.<10%) for fall, winter, spring, and summer periods, respectively. Large disagreements were observed between the OKEAN and NASA Team results in spring and summer for estimates of the first-year (FY) and multiyear (MY) age classes. 
The OKEAN-01 algorithm and data tended to estimate, on average, lower concentrations of young or FY ice and higher concentrations of total and MY ice for all months and seasons. Our results contribute to the growing body of documentation about the levels of disparity obtained when seasonal sea ice concentrations are estimated using various types of satellite data and algorithms.

  20. Reply to "Comment on 'Fractional quantum mechanics' and 'Fractional Schrödinger equation' ".

    PubMed

    Laskin, Nick

    2016-06-01

    The fractional uncertainty relation is a mathematical formulation of Heisenberg's uncertainty principle in the framework of fractional quantum mechanics. Two mistaken statements presented in the Comment have been revealed. The origin of each mistaken statement has been clarified and corrected statements have been made. A map between standard quantum mechanics and fractional quantum mechanics has been presented to emphasize the features of fractional quantum mechanics and to avoid misinterpretations of the fractional uncertainty relation. It has been shown that the fractional probability current equation is correct in its area of applicability. Further studies are needed to find meaningful quantum physics problems involving the fractional probability current density vector and the extra term emerging in the framework of fractional quantum mechanics.

  1. A cluster bootstrap for two-loop MHV amplitudes

    DOE PAGES

    Golden, John; Spradlin, Marcus

    2015-02-02

    We apply a bootstrap procedure to two-loop MHV amplitudes in planar N=4 super-Yang-Mills theory. We argue that the mathematically most complicated part (the Λ^2 B_2 coproduct component) of the n-particle amplitude is uniquely determined by a simple cluster algebra property together with a few physical constraints (dihedral symmetry, analytic structure, supersymmetry, and well-defined collinear limits). Finally, we present a concise, closed-form expression which manifests these properties for all n.

  2. Wrappers for Performance Enhancement and Oblivious Decision Graphs

    DTIC Science & Technology

    1995-09-01

    always select all relevant features. We test different search engines to search the space of feature subsets and introduce compound operators to speed...distinct instances from the original dataset appearing in the test set is thus 0.632m. The 0i accuracy estimate is derived by using bootstrap sample...i for training and the rest of the instances for testing. Given a number b, the number of bootstrap samples, let 0i be the accuracy estimate for

  3. CME Velocity and Acceleration Error Estimates Using the Bootstrap Method

    NASA Technical Reports Server (NTRS)

    Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji

    2017-01-01

    The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify, and in many studies the impact of such measurement errors is overlooked. In this study we present a new way to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method because it requires repeating the original data analysis procedure several times using replicate datasets; it is commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are small in the vast majority of cases and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs they are larger than the acceleration itself.
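
    As a rough illustration of the resampling idea described above, the sketch below bootstraps a quadratic height-time fit to obtain standard errors for a CME's velocity and acceleration. The height-time values are hypothetical and the fitting is far simpler than the actual SOHO/LASCO catalog procedure.

```python
# Bootstrap standard errors for CME velocity and acceleration from a
# quadratic height-time fit (hypothetical data; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])    # hours since first frame (hypothetical)
h = np.array([2.1, 3.0, 4.2, 5.6, 7.3, 9.2, 11.4, 13.9])  # apparent height in solar radii (hypothetical)

def fit_kinematics(t, h):
    """Fit h(t) = h0 + v*t + 0.5*a*t^2 and return (v, a)."""
    a2, a1, _ = np.polyfit(t, h, 2)
    return a1, 2.0 * a2

boot = []
for _ in range(2000):
    idx = rng.integers(0, len(t), len(t))    # resample height-time points with replacement
    boot.append(fit_kinematics(t[idx], h[idx]))
boot = np.asarray(boot)
v_err, a_err = boot.std(axis=0, ddof=1)      # bootstrap standard errors
print(f"velocity error ~ {v_err:.3f} Rs/h, acceleration error ~ {a_err:.3f} Rs/h^2")
```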

  4. Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.

    PubMed

    Spiess, Martin; Jordan, Pascal; Wendt, Mike

    2018-05-07

    In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Under a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests, where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and of the bootstrap techniques for calculating confidence intervals and conducting hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on a within-subjects experimental design with 32 cells and 33 participants.
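
    For readers unfamiliar with the bias-corrected and accelerated (BCa) interval mentioned above, the following minimal sketch shows how such an interval can be computed with SciPy for a simple statistic; the per-participant values are hypothetical and the paper's estimator is not reproduced.

```python
# Minimal sketch of a BCa bootstrap confidence interval (hypothetical data).
import numpy as np
from scipy.stats import bootstrap

rng = np.random.default_rng(1)
effect = rng.normal(loc=1.2, scale=0.8, size=33)   # hypothetical per-participant effect estimates

res = bootstrap((effect,), np.mean, confidence_level=0.95,
                n_resamples=9999, method='BCa')
print("BCa 95% CI for the mean effect:", res.confidence_interval)
```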

  5. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
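
    A minimal sketch of the kind of test described above is given below, using the Euclidean distance between two hypothetical summary histograms and multinomial resampling from the pooled histogram as a simple stand-in for the paper's resampling of individual histograms.

```python
# Bootstrap significance test for the difference between two summary
# histograms, with the Euclidean distance as the test statistic
# (hypothetical counts; illustrative only).
import numpy as np

rng = np.random.default_rng(2)

def euclidean_distance(h1, h2):
    p1, p2 = h1 / h1.sum(), h2 / h2.sum()      # normalize counts to frequencies
    return np.sqrt(((p1 - p2) ** 2).sum())

def bootstrap_test(hist_a, hist_b, n_boot=5000):
    observed = euclidean_distance(hist_a, hist_b)
    pooled = hist_a + hist_b                    # null hypothesis: same parent distribution
    p = pooled / pooled.sum()
    exceed = 0
    for _ in range(n_boot):
        ra = rng.multinomial(hist_a.sum(), p)   # redraw both histograms from the pool
        rb = rng.multinomial(hist_b.sum(), p)
        exceed += euclidean_distance(ra, rb) >= observed
    return observed, exceed / n_boot            # distance and bootstrap p-value

hist_a = np.array([5, 20, 40, 25, 10])          # hypothetical summary histograms
hist_b = np.array([8, 30, 35, 20, 7])
d, p_value = bootstrap_test(hist_a, hist_b)
print(f"Euclidean distance = {d:.3f}, bootstrap p-value = {p_value:.3f}")
```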

  6. Image analysis of representative food structures: application of the bootstrap method.

    PubMed

    Ramírez, Cristian; Germain, Juan C; Aguilera, José M

    2009-08-01

    Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of the bootstrap, taking sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between CV(image) and CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of the whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of the whole image) became necessary. This suggests that a large dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
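
    The sketch below illustrates the underlying idea on a simulated binary image: repeatedly sampling random sub-areas of a given size and computing the coefficient of variation of the measured area fraction, which shrinks as the sub-area grows. The window sizes and the simulated structure are hypothetical.

```python
# Coefficient of variation of an image measurement (area fraction of
# "elements") over randomly sampled sub-areas of increasing size.
import numpy as np

rng = np.random.default_rng(3)
image = (rng.random((1200, 1200)) < 0.15).astype(float)   # simulated structure, ~15% coverage

def bootstrap_cv(image, window, n_boot=500):
    h, w = image.shape
    fractions = []
    for _ in range(n_boot):
        r = rng.integers(0, h - window + 1)
        c = rng.integers(0, w - window + 1)
        fractions.append(image[r:r + window, c:c + window].mean())
    fractions = np.asarray(fractions)
    return fractions.std(ddof=1) / fractions.mean()        # CV of the sub-area estimates

for window in (100, 200, 400, 800):
    print(f"window {window}x{window}: CV = {bootstrap_cv(image, window):.4f}")
```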

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven

    Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing andmore » validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability ( (www.predictcancer.org)). The data set can be downloaded at (https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048). Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.« less

  8. PROGRESS IN THE PEELING-BALLOONING MODEL OF ELMS: TOROIDAL ROTATION AND 3D NONLINEAR DYNAMICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SNYDER,P.B; WILSON,H.R; XU,X.Q

    2004-06-01

    Understanding the physics of the H-Mode pedestal and edge localized modes (ELMs) is very important to next-step fusion devices for two primary reasons: (1) The pressure at the top of the edge barrier ("pedestal height") strongly impacts global confinement and fusion performance, and (2) large ELMs lead to localized transient heat loads on material surfaces that may constrain component lifetimes. The development of the peeling-ballooning model has shed light on these issues by positing a mechanism for ELM onset and constraints on the pedestal height. The mechanism involves instability of ideal coupled "peeling-ballooning" modes driven by the sharp pressure gradient and consequent large bootstrap current in the H-mode edge. It was first investigated in the local, high-n limit [1], and later quantified for non-local, finite-n modes in general toroidal geometry [2,3]. Important aspects are that a range of wavelengths may potentially be unstable, with intermediate n's (n ≈ 3-30) generally limiting in high performance regimes, and that stability bounds are strongly sensitive to shape [Fig. 1(a)], and to collisionality (i.e. temperature and density) [4] through the bootstrap current. The development of efficient MHD stability codes such as ELITE [3,2] and MISHKA [5] has allowed detailed quantification of peeling-ballooning stability bounds (e.g. [6]) and extensive and largely successful comparisons with observation (e.g. [2,6-9]). These previous calculations are ideal, static, and linear. Here we extend this work to incorporate the impact of sheared toroidal rotation, and the non-ideal, nonlinear dynamics which must be studied to quantify ELM size and heat deposition on material surfaces.

  9. A combined approach of generalized additive model and bootstrap with small sample sets for fault diagnosis in fermentation process of glutamate.

    PubMed

    Liu, Chunbo; Pan, Feng; Li, Yun

    2016-07-29

    Glutamate is of great importance in the food and pharmaceutical industries, yet there is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on the generalized additive model (GAM) and the bootstrap has not been used for fault diagnosis in fermentation processes, much less in the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. The bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions returned to normal. The proposed approach used only a small sample set from normal fermentation experiments to establish the model, and then required only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
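
    The sketch below captures the flavor of this fault-diagnosis scheme: fit a smooth model on normal runs, bootstrap the residuals to obtain a 95% band for the expected trajectory, and flag observations that leave the band. A polynomial fit stands in for the GAM, and all data are hypothetical.

```python
# Residual-bootstrap confidence band around a fitted production curve,
# used to flag a simulated faulty run (polynomial stand-in for the GAM).
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 30, 120)                                # fermentation time (h)
normal_run = 60 / (1 + np.exp(-(t - 12) / 3)) + rng.normal(0, 1.5, t.size)

coeffs = np.polyfit(t, normal_run, 5)                      # stand-in for the fitted GAM
fitted = np.polyval(coeffs, t)
residuals = normal_run - fitted

boot_curves = []
for _ in range(1000):
    resampled = fitted + rng.choice(residuals, size=residuals.size, replace=True)
    boot_curves.append(np.polyval(np.polyfit(t, resampled, 5), t))
lower, upper = np.percentile(boot_curves, [2.5, 97.5], axis=0)

faulty_run = 60 / (1 + np.exp(-(t - 12) / 3)) - 8.0 * (t > 18)   # simulated fault after hour 18
flags = (faulty_run < lower) | (faulty_run > upper)
print("fault first flagged at hour:", t[flags].min() if flags.any() else "none")
```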

  10. Examination of the reliability of the crash modification factors using empirical Bayes method with resampling technique.

    PubMed

    Wang, Jung-Han; Abdel-Aty, Mohamed; Wang, Ling

    2017-07-01

    Many studies have used different methods, for example empirical Bayes before-after methods, to obtain accurate estimates of CMFs. All of them make different assumptions about the crash count that would have occurred had there been no treatment. Another major assumption is that multiple sites share the same true CMF: under this assumption, the CMF at an individual intersection is randomly drawn from a normally distributed population of CMFs at all intersections. Since CMFs are non-zero values, the population of all CMFs might not follow a normal distribution, and even if it does, the true mean of CMFs at some intersections may differ from that at others. Therefore, a bootstrap method based on before-after empirical Bayes theory was proposed to estimate CMFs without making distributional assumptions. This bootstrap procedure has the added benefit of producing a measure of CMF stability. Furthermore, based on the bootstrapped CMF, a new CMF precision rating method was proposed to evaluate the reliability of CMFs. This study chose 29 urban four-legged intersections, whose control was changed from stop-controlled to signal-controlled, as treated sites, and 124 urban four-legged stop-controlled intersections were selected as reference sites. At first, different safety performance functions (SPFs) were applied to five crash categories, and it was found that each crash category had a different optimal SPF form. Then, the CMFs of these five crash categories were estimated using the bootstrap empirical Bayes method. The results of the bootstrapped method showed that signalization significantly decreased Angle+Left-Turn crashes, and its CMF had the highest precision, while the CMF for Rear-End crashes was unreliable. For KABCO, KABC, and KAB crashes, the CMFs proved to be reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Adaptive fractional order sliding mode control for Boost converter in the Battery/Supercapacitor HESS.

    PubMed

    Wang, Jianlin; Xu, Dan; Zhou, Huan; Zhou, Tao

    2018-01-01

    In this paper, an adaptive fractional order sliding mode control (AFSMC) scheme is designed for the current tracking control of the Boost-type converter in a Battery/Supercapacitor hybrid energy storage system (HESS). In order to stabilize the current, adaptation rules based on a state observer and a Lyapunov function are designed. A fractional order sliding surface function is defined based on the tracking current error and the adaptive rules. Furthermore, through fractional order analysis, the stability of the fractional order control system is proven, and the value of the fractional order (λ) is investigated. In addition, the effectiveness of the proposed AFSMC strategy is verified by numerical simulations. Compared with a conventional integer order sliding mode control system, the design shows good transient response and robustness to uncertainty.

  12. Theory of Space Charge Limited Current in Fractional Dimensional Space

    NASA Astrophysics Data System (ADS)

    Zubair, Muhammad; Ang, L. K.

    The concept of fractional dimensional space has been effectively applied in many areas of physics to describe fractional effects on physical systems. We present some recent developments of space charge limited (SCL) current in free space and solids in the framework of fractional dimensional space, which may account for the effect of imperfection or roughness of the electrode surface. For SCL current in free space, the governing law is known as the Child-Langmuir (CL) law. Its analog in a trap-free solid (or dielectric) is known as the Mott-Gurney (MG) law. This work extends the one-dimensional CL law and MG law to the case of a D-dimensional fractional space with 0 < D <= 1, where the parameter D defines the degree of roughness of the electrode surface. Such a fractional dimensional space generalization of SCL current theory can be used to characterize charge injection from imperfect or rough surfaces in applications related to high current cathodes (CL law) and organic electronics (MG law). In terms of operating regime, the model includes quantum effects when the spacing between the electrodes is small.
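
    For reference, the classical one-dimensional limits that the fractional-dimensional theory generalizes are the standard Child-Langmuir and Mott-Gurney laws for a planar gap of spacing $d$ at voltage $V$, with $\epsilon_0$ the vacuum permittivity, $\epsilon$ the solid's permittivity and $\mu$ the carrier mobility (standard results quoted here only for orientation; the modified fractional-$D$ exponents are not reproduced):

$$J_{\mathrm{CL}}=\frac{4\epsilon_{0}}{9}\sqrt{\frac{2e}{m}}\,\frac{V^{3/2}}{d^{2}},\qquad J_{\mathrm{MG}}=\frac{9}{8}\,\epsilon\mu\,\frac{V^{2}}{d^{3}}.$$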

  13. Elastic S-matrices in (1 + 1) dimensions and Toda field theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christe, P.; Mussardo, G.

    Particular deformations of 2-D conformal field theory lead to integrable massive quantum field theories. These can be characterized by the corresponding scattering data. This paper proposes a general scheme for classifying the elastic nondegenerate S-matrices in (1 + 1) dimensions starting from the possible bootstrap processes and the spins of the conserved currents. Their identification with the S-matrices coming from Toda field theory is analyzed. The authors discuss both Toda field theories constructed with the simply-laced Dynkin diagrams and the nonsimply-laced ones. The authors present the results of the perturbative analysis and their geometrical interpretations.

  14. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameter uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residual rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residual rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.

  15. Bootstrapping the (A1, A2) Argyres-Douglas theory

    NASA Astrophysics Data System (ADS)

    Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro

    2018-03-01

    We apply bootstrap techniques in order to constrain the CFT data of the ( A 1 , A 2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.

  16. Acculturation, Income and Vegetable Consumption Behaviors Among Latino Adults in the U.S.: A Mediation Analysis with the Bootstrapping Technique.

    PubMed

    López, Erick B; Yamashita, Takashi

    2017-02-01

    This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009 to 2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the frequencies of intake of five kinds of vegetables. Acculturation was measured with the degree of English language use at home. A path model with the bootstrapping technique was employed for the mediation analysis. A significant partial mediation relationship was identified. Greater acculturation [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)] was associated with higher income and, in turn, greater vegetable consumption. At the same time, greater acculturation was associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. Findings regarding income as a mediator of the acculturation-dietary behavior relationship inform unique intervention programs and policy changes to address health disparities by race/ethnicity.
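
    A minimal sketch of bootstrapping an indirect (mediation) effect is shown below, using simulated stand-ins for acculturation, income and vegetable consumption and a simple percentile interval rather than the bias-corrected interval used in the study.

```python
# Percentile bootstrap for an indirect effect a*b in a simple mediation model
# (simulated data; x ~ acculturation, m ~ income, y ~ vegetable consumption).
import numpy as np

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
m = 0.4 * x + rng.normal(size=n)
y = 0.5 * m - 0.3 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                       # path a: slope of m on x
    X = np.column_stack([np.ones_like(x), m, x])     # path b: slope of y on m, controlling for x
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

boot = []
for _ in range(5000):
    idx = rng.integers(0, n, n)                      # resample cases with replacement
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% percentile CI = ({lo:.3f}, {hi:.3f})")
```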

  17. Exploration of the factor structure of the Kirton Adaption-Innovation Inventory using bootstrapping estimation.

    PubMed

    Im, Subin; Min, Soonhong

    2013-04-01

    Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.

  18. How to bootstrap a human communication system.

    PubMed

    Fay, Nicolas; Arbib, Michael; Garrod, Simon

    2013-01-01

    How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs. © 2013 Cognitive Science Society, Inc.

  19. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China

    PubMed Central

    Li, Hao; Dong, Siping

    2015-01-01

    China has long relied on traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals, without bias correction of the efficiency scores. In this article, we introduce the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and try to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for a more scientific calculation of hospital efficiency scores. Our research helps narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better support efficiency improvement and related decision making. PMID:26396090

  20. On combination of strict Bayesian principles with model reduction technique or how stochastic model calibration can become feasible for large-scale applications

    NASA Astrophysics Data System (ADS)

    Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.

    2013-12-01

    Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to lacking information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from a pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers, and we capture the dependence of model output on these multipliers with the expansion-based reduced model. Next, we combined the aPC with Bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty. The usually high computational costs of accurate filtering become very affordable within our suggested aPC-based calibration framework. However, the power of aPC-based Bayesian updating strongly depends on the accuracy of prior information. In the current study, the prior assumptions on the model parameters were not satisfactory and strongly underestimated the reservoir pressure. Thus, the aPC-based response surface used in Bootstrap filtering is fitted to a distant and poorly chosen region within the parameter space. Thanks to the iterative procedure suggested in [2] we overcome this drawback at small computational cost. The iteration successively improves the accuracy of the expansion around the current estimate of the posterior distribution. The final result is a calibrated model of the site that can be used for further studies, with an excellent match to the data. References [1] Oladyshkin S. and Nowak W. Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion. Reliability Engineering and System Safety, 106:179-190, 2012. [2] Oladyshkin S., Class H., Nowak W. Bayesian updating via Bootstrap filtering combined with data-driven polynomial chaos expansions: methodology and application to history matching for carbon dioxide storage in geological formations. Computational Geosciences, 17 (4), 671-687, 2013.

  1. Fusion Plasma Performance and Confinement Studies on JT-60 and JT-60U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamada, Y.; Fujita, T.; Ishida, S.

    2002-09-15

    Fusion plasma performance and confinement studies on JT-60 and JT-60U are reviewed. With the main aim of providing a physics basis for ITER and the steady-state tokamak reactors, JT-60/JT-60U has been developing and optimizing the operational concepts, and extending the discharge regimes toward sustainment of high integrated performance in the reactor relevant parameter regime. In addition to achievement of high fusion plasma performances such as the equivalent breakeven condition (Q_DT^eq up to 1.25) and a high fusion triple product n_D(0) τ_E T_i(0) = 1.5 x 10^21 m^-3 s keV, JT-60U has demonstrated the integrated performance of high confinement, high β_N, full non-inductive current drive with a large fraction of bootstrap current. These favorable performances have been achieved in the two advanced operation regimes, the reversed magnetic shear (RS) and the weak magnetic shear (high-β_p) ELMy H modes characterized by both internal transport barriers (ITB) and edge transport barriers (ETB). The key factors in optimizing these plasmas towards high integrated performance are control of profiles of current, pressure, rotation, etc. utilizing a variety of heating, current drive, torque input, and particle control capabilities and high triangularity operation. As represented by discovery of ITBs (density ITB in the central pellet mode, ion temperature ITB in the high-β_p mode, and electron temperature ITB in the reversed shear mode), confinement studies in JT-60/JT-60U have been emphasizing freedom and also restriction of radial profiles of temperature and density. In addition to characterization of confinement and analyses of transport properties of the OH, the L-mode, the H-mode, the pellet mode, the high-β_p mode, and the RS mode, JT-60U has clarified formation conditions, spatial structures and dynamics of edge and internal transport barriers, and evaluated effects of repetitive MHD events on confinement such as sawteeth and ELMs. Through these studies, JT-60U has demonstrated applicability of the high confinement modes to ITER and the steady-state tokamak reactors.

  2. Bootstrapping quarks and gluons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chew, G.F.

    1979-04-01

    Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.

  3. Predicting the risk for colorectal cancer with personal characteristics and fecal immunochemical test.

    PubMed

    Li, Wen; Zhao, Li-Zhong; Ma, Dong-Wang; Wang, De-Zheng; Shi, Lei; Wang, Hong-Lei; Dong, Mo; Zhang, Shu-Yi; Cao, Lei; Zhang, Wei-Hua; Zhang, Xi-Peng; Zhang, Qing-Huai; Yu, Lin; Qin, Hai; Wang, Xi-Mo; Chen, Sam Li-Sheng

    2018-05-01

    We aimed to predict colorectal cancer (CRC) based on the demographic features and clinical correlates of personal symptoms and signs from Tianjin community-based CRC screening data. A total of 891,199 residents who were aged 60 to 74 and were screened in 2012 were enrolled. The Lasso logistic regression model was used to identify the predictors for CRC. Predictive validity was assessed by the receiver operating characteristic (ROC) curve, and the bootstrapping method was also performed to validate the prediction model. CRC was best predicted by a model that included age, sex, education level, occupations, diarrhea, constipation, colon mucosa and bleeding, gallbladder disease, a stressful life event, family history of CRC, and a positive fecal immunochemical test (FIT). The area under the curve (AUC) for the questionnaire with a FIT was 84% (95% CI: 82%-86%), followed by 76% (95% CI: 74%-79%) for a FIT alone, and 73% (95% CI: 71%-76%) for the questionnaire alone. With 500 bootstrap replications, the estimated optimism (<0.005) shows good discrimination in the validation of the prediction model. A risk prediction model for CRC based on a series of symptoms and signs related to enteric diseases in combination with a FIT was developed from the first round of screening. The results of the current study are useful for increasing the awareness of high-risk subjects and for individual-risk-guided invitations or strategies to achieve mass screening for CRC.
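
    The optimism estimate mentioned above can be illustrated with the following sketch, which uses simulated data and a plain logistic regression in place of the Lasso model; the bootstrap-corrected AUC is the apparent AUC minus the average optimism over replicates.

```python
# Bootstrap optimism correction of a prediction model's AUC
# (simulated data; plain logistic regression as a stand-in for the Lasso model).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n, p = 500, 6
X = rng.normal(size=(n, p))
logits = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.0, 0.4])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):                                     # bootstrap replicates
    idx = rng.integers(0, n, n)
    boot_model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], boot_model.predict_proba(X[idx])[:, 1])
    auc_test = roc_auc_score(y, boot_model.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_test)                 # per-replicate optimism

print(f"apparent AUC = {apparent_auc:.3f}, corrected AUC = {apparent_auc - np.mean(optimism):.3f}")
```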

  4. BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation

    PubMed Central

    Kiefer, Christina; Fehlmann, Tobias; Backes, Christina

    2017-01-01

    Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498

  5. Adaptive fractional order sliding mode control for Boost converter in the Battery/Supercapacitor HESS

    PubMed Central

    Xu, Dan; Zhou, Huan; Zhou, Tao

    2018-01-01

    In this paper, an adaptive fractional order sliding mode control (AFSMC) scheme is designed for the current tracking control of the Boost-type converter in a Battery/Supercapacitor hybrid energy storage system (HESS). In order to stabilize the current, adaptation rules based on a state observer and a Lyapunov function are designed. A fractional order sliding surface function is defined based on the tracking current error and the adaptive rules. Furthermore, through fractional order analysis, the stability of the fractional order control system is proven, and the value of the fractional order (λ) is investigated. In addition, the effectiveness of the proposed AFSMC strategy is verified by numerical simulations. Compared with a conventional integer order sliding mode control system, the design shows good transient response and robustness to uncertainty. PMID:29702696

  6. Classifier performance prediction for computer-aided diagnosis using a limited dataset.

    PubMed

    Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir

    2008-04-01

    In a practical classifier design problem, the true population is generally unknown and the available sample is finite-sized. A common approach is to use a resampling technique to estimate the performance of the classifier that will be trained with the available sample. We conducted a Monte Carlo simulation study to compare the ability of the different resampling techniques in training the classifier and predicting its performance under the constraint of a finite-sized sample. The true population for the two classes was assumed to be multivariate normal distributions with known covariance matrices. Finite sets of sample vectors were drawn from the population. The true performance of the classifier is defined as the area under the receiver operating characteristic curve (AUC) when the classifier designed with the specific sample is applied to the true population. We investigated methods based on the Fukunaga-Hayes and the leave-one-out techniques, as well as three different types of bootstrap methods, namely, the ordinary, 0.632, and 0.632+ bootstrap. Fisher's linear discriminant analysis was used as the classifier. The dimensionality of the feature space was varied from 3 to 15. The sample size n2 from the positive class was varied between 25 and 60, while the number of cases from the negative class was either equal to n2 or 3n2. Each experiment was performed with an independent dataset randomly drawn from the true population. Using a total of 1000 experiments for each simulation condition, we compared the bias, the variance, and the root-mean-squared error (RMSE) of the AUC estimated using the different resampling techniques relative to the true AUC (obtained from training on a finite dataset and testing on the population). Our results indicated that, under the study conditions, there can be a large difference in the RMSE obtained using different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Under such conditions, the 0.632 and 0.632+ bootstrap methods have the lowest RMSE, indicating that the differences between the estimated and the true performance obtained using the 0.632 and 0.632+ bootstrap will be statistically smaller than those obtained using the other three resampling methods. Of the three bootstrap methods, the 0.632+ bootstrap provides the lowest bias. Although this investigation was performed under some specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited dataset.
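
    A compact sketch of the 0.632 bootstrap estimator discussed above is given below, with simulated Gaussian classes and linear discriminant analysis; the 0.632+ variant, which further adjusts the weighting for overfitting, is omitted for brevity.

```python
# 0.632 bootstrap estimate of classifier AUC with linear discriminant analysis
# (simulated two-class Gaussian data; illustrative only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n, d = 50, 5                                        # small sample, moderate dimensionality
X = np.vstack([rng.normal(0.0, 1.0, (n, d)), rng.normal(0.5, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)

apparent = roc_auc_score(y, LinearDiscriminantAnalysis().fit(X, y).decision_function(X))

oob_scores = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))
    oob = np.setdiff1d(np.arange(len(y)), idx)      # cases left out of the bootstrap sample
    if len(np.unique(y[idx])) < 2 or len(np.unique(y[oob])) < 2:
        continue                                    # skip degenerate resamples
    clf = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
    oob_scores.append(roc_auc_score(y[oob], clf.decision_function(X[oob])))

auc_632 = 0.368 * apparent + 0.632 * np.mean(oob_scores)
print(f"apparent AUC = {apparent:.3f}, 0.632 bootstrap AUC = {auc_632:.3f}")
```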

  7. Cloud fraction at the ARM SGP site: reducing uncertainty with self-organizing maps

    NASA Astrophysics Data System (ADS)

    Kennedy, Aaron D.; Dong, Xiquan; Xi, Baike

    2016-04-01

    Instrument downtime leads to uncertainty in the monthly and annual record of cloud fraction (CF), making it difficult to perform time series analyses of cloud properties and perform detailed evaluations of model simulations. As cloud occurrence is partially controlled by the large-scale atmospheric environment, this knowledge is used to reduce uncertainties in the instrument record. Synoptic patterns diagnosed from the North American Regional Reanalysis (NARR) during the period 1997-2010 are classified using a competitive neural network known as the self-organizing map (SOM). The classified synoptic states are then compared to the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) instrument record to determine the expected CF. A number of SOMs are tested to understand how the number of classes and the period of classifications impact the relationship between classified states and CFs. Bootstrapping is utilized to quantify the uncertainty of the instrument record when statistical information from the SOM is included. Although all SOMs significantly reduce the uncertainty of the CF record calculated in Kennedy et al. (Theor Appl Climatol 115:91-105, 2014), SOMs with a large number of classes and separated by month are required to produce the lowest uncertainty and best agreement with the annual cycle of CF. This result may be due to a manifestation of seasonally dependent biases in NARR. With use of the SOMs, the average uncertainty in monthly CF is reduced in half from the values calculated in Kennedy et al. (Theor Appl Climatol 115:91-105, 2014).

  8. Kinetic effects on the currents determining the stability of a magnetic island in tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, E., E-mail: emanuele.poli@ipp.mpg.de; Bergmann, A.; Casson, F. J.

    The role of the bootstrap and polarization currents for the stability of neoclassical tearing modes is investigated employing both a drift kinetic and a gyrokinetic approach. The adiabatic response of the ions around the island separatrix implies, for island widths below or around the ion thermal banana width, density flattening for islands rotating at the ion diamagnetic frequency, while for islands rotating at the electron diamagnetic frequency the density is unperturbed and the only contribution to the neoclassical drive arises from electron temperature flattening. As for the polarization current, the full inclusion of finite orbit width effects in the calculation of the potential developing in a rotating island leads to a smoothing of the discontinuous derivatives exhibited by the analytic potential on which the polarization term used in the modeling is based. This leads to a reduction of the polarization-current contribution with respect to the analytic estimate, in line with other studies. Other contributions to the perpendicular ion current, related to the response of the particles around the island separatrix, are found to compete with or even dominate the polarization-current term for realistic island rotation frequencies.

  9. Maternal Depression and Trait Anger as Risk Factors for Escalated Physical Discipline

    PubMed Central

    Shay, Nicole L.; Knutson, John F.

    2008-01-01

    To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory–Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline. PMID:18174347

  10. Trapped fast particle destabilization of internal kink mode for the locally flattened q-profile with an inflection point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xian-Qu; Zhang, Rui-Bin; Meng, Guo

    2016-07-15

    The destabilization of ideal internal kink modes by trapped fast particles in tokamak plasmas with a "shoulder"-like equilibrium current is investigated. It is found that the energetic particle branch of the mode is unstable under the driving of fast-particle precession drifts and corresponds to a precessional fishbone. The mode has a low stability threshold and is more easily excited than the conventional precessional fishbone. This differs from earlier studies of the same equilibrium, in which the magnetohydrodynamic (MHD) branch of the mode is stable. Furthermore, the stability and characteristic frequency of the mode are analyzed by solving the dispersion relation and comparing with the conventional fishbone. The results suggest that an equilibrium with a locally flattened q-profile, which may be modified by localized current drive (or bootstrap current, etc.), is prone to the onset of the precessional fishbone branch of the mode.

  11. Concept Innateness, Concept Continuity, and Bootstrapping

    PubMed Central

    Carey, Susan

    2011-01-01

    The commentators raised issues relevant to all three important theses of The Origin of Concepts (TOOC). Some questioned the very existence of innate representational primitives, and others questioned my claims about their richness and whether they should be thought of as concepts. Some questioned the existence of conceptual discontinuity in the course of knowledge acquisition and others argued that discontinuity is much more common than portrayed in TOOC. Some raised issues with my characterization of Quinian bootstrapping, and others questioned the dual factor theory of concepts motivated by my picture of conceptual development. PMID:23264705

  12. Crossing symmetry in alpha space

    NASA Astrophysics Data System (ADS)

    Hogervorst, Matthijs; van Rees, Balt C.

    2017-11-01

    We initiate the study of the conformal bootstrap using Sturm-Liouville theory, specializing to four-point functions in one-dimensional CFTs. We do so by decomposing conformal correlators using a basis of eigenfunctions of the Casimir which are labeled by a complex number α. This leads to a systematic method for computing conformal block decompositions. Analyzing bootstrap equations in alpha space turns crossing symmetry into an eigenvalue problem for an integral operator K. The operator K is closely related to the Wilson transform, and some of its eigenfunctions can be found in closed form.

  13. Direct measurement of fast transients by using boot-strapped waveform averaging

    NASA Astrophysics Data System (ADS)

    Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung

    2018-03-01

    An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and the signal to noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime from Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with the known values.

  14. Mapping and assessing variability in the Antarctic marginal ice zone, pack ice and coastal polynyas in two sea ice algorithms with implications on breeding success of snow petrels

    NASA Astrophysics Data System (ADS)

    Stroeve, Julienne C.; Jenouvrier, Stephanie; Campbell, G. Garrett; Barbraud, Christophe; Delord, Karine

    2016-08-01

    Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent as well as seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the distribution of MIZ, consolidated pack ice and coastal polynyas in the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent record for assessing the proportion of the sea ice cover that is covered by each of these ice categories. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, and applies the same thresholds to the sea ice concentrations to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the seasonal cycle in the MIZ and pack ice is generally similar between both algorithms, yet the NASA Team algorithm has on average twice the MIZ and half the consolidated pack ice area as the Bootstrap algorithm. Trends also differ, with the Bootstrap algorithm suggesting statistically significant trends towards increased pack ice area and no statistically significant trends in the MIZ. The NASA Team algorithm on the other hand indicates statistically significant positive trends in the MIZ during spring. Potential coastal polynya area and amount of broken ice within the consolidated ice pack are also larger in the NASA Team algorithm. The timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
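
    As a simple illustration of how such a partition can be derived from concentration fields, the sketch below classifies hypothetical sea ice concentration grids into MIZ and consolidated pack ice; the 15% and 80% cut-offs are commonly used values assumed here for illustration, not quoted from the paper.

```python
# Classify sea ice concentration fields into marginal ice zone (MIZ) and
# consolidated pack ice using fixed thresholds (hypothetical fields).
import numpy as np

def classify_ice(concentration, miz_lo=0.15, pack_lo=0.80):
    miz = (concentration >= miz_lo) & (concentration < pack_lo)
    pack = concentration >= pack_lo
    return miz, pack

conc_nasa_team = np.array([[0.05, 0.30, 0.90], [0.60, 0.85, 0.10]])   # hypothetical fields
conc_bootstrap = np.array([[0.10, 0.45, 0.95], [0.75, 0.90, 0.12]])

for name, conc in (("NASA Team", conc_nasa_team), ("Bootstrap", conc_bootstrap)):
    miz, pack = classify_ice(conc)
    print(f"{name}: MIZ fraction = {miz.mean():.2f}, pack fraction = {pack.mean():.2f}")
```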

  15. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to a situation in which calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. If it is checked at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indexes. To demonstrate the method-independence of the convergence testing approach, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluations are required, so that already processed sensitivity results can also be checked. This is one step towards reliable and transferable, published sensitivity results.

  16. Neoclassical, semi-collisional tearing mode theory in an axisymmetric torus

    NASA Astrophysics Data System (ADS)

    Connor, J. W.; Hastie, R. J.; Helander, P.

    2017-12-01

    A set of layer equations for determining the stability of semi-collisional tearing modes in an axisymmetric torus, incorporating neoclassical physics, in the small ion Larmor radius limit, is provided. These can be used as an inner layer module for inclusion in numerical codes that asymptotically match the layer to toroidal calculations of the tearing mode stability index, $\Delta'$. They are more complete than in earlier work and comprise equations for the perturbed electron density and temperature, the ion temperature, Ampère's law and the vorticity equation, amounting to a twelfth-order set of radial differential equations. While the toroidal geometry is kept quite general when treating the classical and Pfirsch-Schlüter transport, parallel bootstrap current and semi-collisional physics, it is assumed that the fraction of trapped particles is small for the banana regime contribution. This is to justify the use of a model collision term when acting on the localised (in velocity space) solutions that remain after the Spitzer solutions have been exploited to account for the bulk of the passing distributions. In this respect, unlike standard neoclassical transport theory, the calculation involves the second Spitzer solution connected with a parallel temperature gradient, because this stability problem involves parallel temperature gradients that cannot occur in equilibrium toroidal transport theory. Furthermore, a calculation of the linearised neoclassical radial transport of toroidal momentum for general geometry is required to complete the vorticity equation. The solutions of the resulting set of equations do not match properly to the ideal magnetohydrodynamic (MHD) equations at large distances from the layer, and a further, intermediate layer involving ion corrections to the electrical conductivity and ion parallel thermal transport is invoked to achieve this matching and allow one to correctly calculate the layer $\Delta'$.

  17. Trachyspermum ammi (L.) sprague: chemical composition of essential oil and antimicrobial activities of respective fractions.

    PubMed

    Moein, Mahmoodreza R; Zomorodian, Kamiar; Pakshir, Keyvan; Yavari, Farnoosh; Motamedi, Marjan; Zarshenas, Mohammad M

    2015-01-01

    Resistance to antibacterial agents has become a serious problem for global health. The current study evaluated the antimicrobial activities of the essential oil and respective fractions of Trachyspermum ammi (L.) Sprague. The essential oil of the seeds was extracted and fractionated using column chromatography. All fractions were then analyzed by gas chromatography/mass spectrometry. Antifungal and antibacterial activities of the oil and its fractions were assessed using the microdilution method. The compounds γ-terpinene (48.07%), p-cymene (33.73%), and thymol (17.41%) were determined as major constituents. The effect of fraction II was better than that of the total essential oil, fraction I, and standard thymol. The greater effect of fraction II compared to standard thymol showed the synergistic effects of the ingredients in this fraction. As this fraction and also the total oil were effective against the studied microorganisms, the combination of these products with current antimicrobial agents could be considered as new antimicrobial compounds in further investigations. © The Author(s) 2014.

  18. A risk-adjusted financial model to estimate the cost of a video-assisted thoracoscopic surgery lobectomy programme.

    PubMed

    Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas

    2016-05-01

    To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using a multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in Euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative (including surgical and anaesthetic time, overhead, disposable materials) and postoperative costs [including ward stay, high dependency unit (HDU) or intensive care unit (ICU) and variable costs associated with management of complications] were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total costs after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) <60% predicted value (P = 0.02, bootstrap 63%) and chronic obstructive pulmonary disease (COPD; P = 0.035, bootstrap 57%). The following model was developed to estimate the total cost: total cost (€) = 10 523 + 1894 × COPD + 2376 × (DLCO < 60%), where COPD and (DLCO < 60%) are 0/1 indicator variables. The comparison between predicted and observed costs was repeated in 1000 bootstrapped samples to verify the stability of the model. The two values were not different (P > 0.05) in 86% of the samples. A hypothetical patient with COPD and DLCO less than 60% would cost €4270 more than a patient without COPD and with higher DLCO values (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
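
    For illustration only, the published cost equation can be applied as a one-line function in which the two predictors enter as 0/1 indicators; the figures below reproduce the €10 523 baseline and €14 793 high-risk estimates quoted in the abstract.

      def predicted_vats_cost(has_copd: bool, dlco_below_60: bool) -> float:
          """Risk-adjusted total cost (EUR) from the published regression model."""
          return 10_523 + 1_894 * int(has_copd) + 2_376 * int(dlco_below_60)

      print(predicted_vats_cost(False, False))  # 10523 EUR: no COPD, DLCO >= 60% predicted
      print(predicted_vats_cost(True, True))    # 14793 EUR: 4270 EUR above the baseline patient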

  19. Childhood trauma and problem behavior: Examining the mediating roles of experiential avoidance and mindfulness processes.

    PubMed

    Roche, Anne I; Kroska, Emily B; Miller, Michelle L; Kroska, Sydney K; O'Hara, Michael W

    2018-03-22

    Childhood trauma is associated with a variety of risky, unhealthy, or problem behaviors. The current study aimed to explore experiential avoidance and mindfulness processes as mechanisms through which childhood trauma and problem behavior are linked in a college sample. The sample consisted of college-aged young adults recruited November-December, 2016 (N = 414). Participants completed self-report measures of childhood trauma, current problem behavior, experiential avoidance, and mindfulness processes. Bootstrapped mediation analyses examined the mechanistic associations of interest. Mediation analyses indicated that experiential avoidance was a significant mediator of the association between childhood trauma and problem behavior. Additionally, multiple mediation analyses indicated that specific mindfulness facets (act with awareness and nonjudgment of inner experience) significantly mediated the same association. Interventions for college students who have experienced childhood trauma might profitably target mechanisms such as avoidance and mindfulness in order to minimize engagement in problem behavior.
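
    The abstract does not specify the software used, so the following is only a generic sketch of a percentile-bootstrap test of an indirect effect a*b, with simulated data standing in for trauma (x), experiential avoidance (m) and problem behavior (y).

      import numpy as np

      rng = np.random.default_rng(1)

      def indirect_effect(x, m, y):
          """OLS estimate of a*b in a simple mediation model (x -> m -> y)."""
          a = np.polyfit(x, m, 1)[0]                         # slope of m on x
          design = np.column_stack([np.ones_like(x), x, m])  # y ~ 1 + x + m
          b = np.linalg.lstsq(design, y, rcond=None)[0][2]   # slope of y on m, adjusting for x
          return a * b

      n = 414                                                # sample size reported in the abstract
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(size=n)
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)

      boot = np.empty(5000)
      for i in range(boot.size):
          idx = rng.integers(0, n, n)                        # resample cases with replacement
          boot[i] = indirect_effect(x[idx], m[idx], y[idx])
      print(np.quantile(boot, [0.025, 0.975]))               # CI excluding 0 -> evidence of mediation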

  20. Anomalous dimensions of spinning operators from conformal symmetry

    NASA Astrophysics Data System (ADS)

    Gliozzi, Ferdinando

    2018-01-01

    We compute, to the first non-trivial order in the ɛ-expansion of a perturbed scalar field theory, the anomalous dimensions of an infinite class of primary operators with arbitrary spin ℓ = 0, 1, . . . , including as a particular case the weakly broken higher-spin currents, using only constraints from conformal symmetry. Following the bootstrap philosophy, no reference is made to any Lagrangian, equations of motion or coupling constants. Even the space dimensions d are left free. The interaction is implicitly turned on through the local operators by letting them acquire anomalous dimensions. When matching certain four-point and five-point functions with the corresponding quantities of the free field theory in the ɛ → 0 limit, no free parameter remains. It turns out that only the expected discrete d values are permitted and the ensuing anomalous dimensions reproduce known results for the weakly broken higher-spin currents and provide new results for the other spinning operators.

  1. Bootstrapping language acquisition.

    PubMed

    Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark

    2017-07-01

    The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Effect of non-normality on test statistics for one-way independent groups designs.

    PubMed

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
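
    As a rough illustration of the idea (not the exact Krishnamoorthy, Lu, and Mathew statistic), the sketch below runs a parametric bootstrap test of equal means under unequal variances: an observed variance-weighted between-group statistic is compared with replicates generated from normal theory using the sample variances.

      import numpy as np

      rng = np.random.default_rng(2)

      def parametric_bootstrap_anova(groups, n_boot=10_000):
          """Parametric bootstrap test of equal means with heterogeneous variances.

          The observed statistic is a variance-weighted between-group sum of
          squares; its null distribution is simulated from normal theory.
          """
          ybar = np.array([g.mean() for g in groups])
          s2 = np.array([g.var(ddof=1) for g in groups])
          n = np.array([len(g) for g in groups])

          def stat(means, variances):
              w = n / variances
              grand = np.sum(w * means) / np.sum(w)
              return np.sum(w * (means - grand) ** 2)

          t_obs = stat(ybar, s2)
          t_boot = np.empty(n_boot)
          for b in range(n_boot):
              m_star = rng.normal(0.0, np.sqrt(s2 / n))       # simulated group means under H0
              s2_star = s2 * rng.chisquare(n - 1) / (n - 1)   # simulated sample variances
              t_boot[b] = stat(m_star, s2_star)
          return np.mean(t_boot >= t_obs)                     # bootstrap p-value

      groups = [rng.normal(0, 1, 20), rng.normal(0, 3, 15), rng.normal(1, 2, 25)]
      print("p =", parametric_bootstrap_anova(groups))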

  3. Quantification of variability and uncertainty for air toxic emission inventories with censored emission factor data.

    PubMed

    Frey, H Christopher; Zhao, Yuchao

    2004-11-15

    Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using MLE estimators for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges are as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.

  4. Bootstrapping non-commutative gauge theories from L∞ algebras

    NASA Astrophysics Data System (ADS)

    Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter

    2018-05-01

    Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, that governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on its appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to the second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is discussed, as well.

  5. A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon

    DOE PAGES

    Drummond, J. M.; Papathanasiou, G.; Spradlin, M.

    2015-03-16

    Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4, 7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7 ∥ 6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n > 6.

  6. Review of Orbital Propellant Transfer Techniques and the Feasibility of a Thermal Bootstrap Propellant Transfer Concepts

    NASA Technical Reports Server (NTRS)

    Yoshikawa, H. H.; Madison, I. B.

    1971-01-01

    This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.

  7. Parameterization of light absorption by components of seawater in optically complex coastal waters of the Crimea Peninsula (Black Sea).

    PubMed

    Dmitriev, Egor V; Khomenko, Georges; Chami, Malik; Sokolov, Anton A; Churilova, Tatyana Y; Korotaev, Gennady K

    2009-03-01

    The absorption of sunlight by oceanic constituents significantly contributes to the spectral distribution of the water-leaving radiance. Here it is shown that current parameterizations of absorption coefficients do not apply to the optically complex waters of the Crimea Peninsula. Based on in situ measurements, parameterizations of phytoplankton, nonalgal, and total particulate absorption coefficients are proposed. Their performance is evaluated using a log-log regression combined with a low-pass filter and the nonlinear least-square method. Statistical significance of the estimated parameters is verified using the bootstrap method. The parameterizations are relevant for chlorophyll a concentrations ranging from 0.45 up to 2 mg/m³.
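
    A minimal sketch of the kind of analysis described (a log-log regression of a power-law parameterization with a nonparametric bootstrap of the fitted coefficients) is given below; the numbers are synthetic and only illustrate the workflow, not the Crimean data set.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic stand-ins: chlorophyll a (mg/m^3) and phytoplankton absorption (1/m)
      chl = rng.uniform(0.45, 2.0, 60)
      a_ph = 0.05 * chl ** 0.7 * np.exp(rng.normal(0.0, 0.15, chl.size))

      def fit_power_law(x, y):
          """Log-log least-squares fit of y = A * x**B."""
          B, logA = np.polyfit(np.log(x), np.log(y), 1)
          return np.exp(logA), B

      A_hat, B_hat = fit_power_law(chl, a_ph)

      boot = np.empty((2000, 2))
      for i in range(boot.shape[0]):
          idx = rng.integers(0, chl.size, chl.size)       # resample stations with replacement
          boot[i] = fit_power_law(chl[idx], a_ph[idx])
      ci = np.quantile(boot, [0.025, 0.975], axis=0)      # bootstrap CIs for A and B
      print(A_hat, B_hat, ci)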

  8. Use of high order, periodic orbits in the PIES code

    NASA Astrophysics Data System (ADS)

    Monticello, Donald; Reiman, Allan

    2010-11-01

    We have implemented a version of the PIES code (Princeton Iterative Equilibrium Solver; A. Reiman et al 2007 Nucl. Fusion 47 572) that uses high order periodic orbits to select the surfaces on which straight magnetic field line coordinates will be calculated. The use of high order periodic orbits has increased the robustness and speed of the PIES code. We now have more uniform treatment of in-phase and out-of-phase islands. This new version has better convergence properties and works well with a full Newton scheme. We now have the ability to shrink islands using a bootstrap-like current, and this includes the m=1 island in tokamaks.

  9. Quantitative void fraction detection with an eddy current flowmeter for generation IV Sodium cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, M.; French Atomic Energy and Alternative Energies Commission; Tordjeman, Ph.

    2015-07-01

    This study was carried out to understand the response of an eddy current type flowmeter in two phase liquid-metal flow. We use the technique of ellipse fit and correlate the fluctuations in the angle of inclination of this ellipse with the void fraction. The effects of physical parameters such as coil excitation frequency and flow velocity have been studied. The results show the possibility of using an eddy current flowmeter as a gas detector for large void fractions. (authors)

  10. Quantitative void fraction measurement with an eddy current flowmeter for generation IV Sodium cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, M.; CEA, DEN, Nuclear Technology Department, F-13108 Saint-Paul-lez-Durance; Tordjeman, Ph.

    2015-07-01

    This study was carried out to understand the response of an eddy current type flowmeter in two phase liquid-metal flow. We use the technique of ellipse fit and correlate the fluctuations in the angle of inclination of this ellipse with the void fraction. The effects of physical parameters such as coil excitation frequency and flow velocity have been studied. The results show the possibility of using an eddy current flowmeter as a gas detector for large void fractions. (authors)

  11. Three ways to solve critical ϕ4 theory on 4 ‑ 𝜖 dimensional real projective space: Perturbation, bootstrap, and Schwinger-Dyson equation

    NASA Astrophysics Data System (ADS)

    Hasegawa, Chika; Nakayama, Yu

    2018-03-01

    In this paper, we solve the two-point function of the lowest dimensional scalar operator in the critical ϕ4 theory on 4 ‑ 𝜖 dimensional real projective space in three different methods. The first is to use the conventional perturbation theory, and the second is to impose the cross-cap bootstrap equation, and the third is to solve the Schwinger-Dyson equation under the assumption of conformal invariance. We find that the three methods lead to mutually consistent results but each has its own advantage.

  12. On critical exponents without Feynman diagrams

    NASA Astrophysics Data System (ADS)

    Sen, Kallol; Sinha, Aninda

    2016-11-01

    In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend the pioneering 1974 work of Polyakov’s, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4-ɛ dimensions up to O(ɛ²). AS dedicates this work to the loving memory of his mother.

  13. Bootstrapping 3D fermions

    DOE PAGES

    Iliesiu, Luca; Kos, Filip; Poland, David; ...

    2016-03-17

    We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. Finally, we also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.

  14. New Methods for Estimating Seasonal Potential Climate Predictability

    NASA Astrophysics Data System (ADS)

    Feng, Xia

    This study develops two new statistical approaches to assess the seasonal potential predictability of observed climate variables. One is the univariate analysis of covariance (ANOCOVA) model, a combination of an autoregressive (AR) model and analysis of variance (ANOVA). It has the advantage of taking into account the uncertainty of the estimated parameters due to sampling errors in the statistical test, which is often neglected in AR-based methods, and of accounting for daily autocorrelation, which is not considered in traditional ANOVA. In the ANOCOVA model, the seasonal signals arising from external forcing are tested for being identical across years in order to assess whether any interannual variability that exists is potentially predictable. The bootstrap is an attractive alternative method that requires no hypothesized model and is available no matter how mathematically complicated the parameter estimator. This method builds up the empirical distribution of the interannual variance from resamplings drawn with replacement from the given sample, in which the only predictability in seasonal means arises from the weather noise. These two methods are applied to temperature and water cycle components, including precipitation and evaporation, to measure the extent to which the interannual variance of seasonal means exceeds the unpredictable weather noise, and the results are compared with previous methods, including Leith-Shukla-Gutzler (LSG), Madden, and Katz. The potential predictability of temperature from the ANOCOVA model, bootstrap, LSG and Madden exhibits a pronounced tropical-extratropical contrast, with much larger predictability in the tropics, dominated by El Niño/Southern Oscillation (ENSO), than in higher latitudes, where strong internal variability lowers predictability. The bootstrap tends to display the highest predictability of the four methods, ANOCOVA lies in the middle, while LSG and Madden appear to generate lower predictability. Seasonal precipitation from ANOCOVA, bootstrap, and Katz, resembling that for temperature, is more predictable over the tropical regions and less predictable in the extratropics. Bootstrap and ANOCOVA are in good agreement with each other, both methods generating larger predictability than Katz. The seasonal predictability of evaporation over land bears considerable similarity to that of temperature using ANOCOVA, bootstrap, LSG and Madden. The remote SST forcing and soil moisture reveal substantial seasonality in their relations with the potentially predictable seasonal signals. For selected regions, either SST or soil moisture or both show significant relationships with predictable signals, hence providing indirect insight into the slowly varying boundary processes involved and enabling useful seasonal climate prediction. A multivariate analysis of covariance (MANOCOVA) model is established to identify distinctive predictable patterns, which are uncorrelated with each other. Generally speaking, the seasonal predictability from the multivariate model is consistent with that from ANOCOVA. Besides unveiling the spatial variability of predictability, the MANOCOVA model also reveals the temporal variability of each predictable pattern, which could be linked to periodic oscillations.
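
    The bootstrap estimate of potential predictability described above can be sketched as follows (a simplified illustration that ignores daily autocorrelation): seasonal means are recomputed from days resampled with replacement, so the resulting interannual variance reflects weather noise only and serves as the null distribution for the observed variance.

      import numpy as np

      rng = np.random.default_rng(4)

      def potential_predictability_pvalue(daily, n_boot=2000):
          """daily: array of shape (years, days_per_season) for one location.

          Null hypothesis: seasonal means differ between years only because of
          weather noise. Days are pooled and resampled with replacement to build
          the empirical null distribution of the interannual variance.
          """
          years, days = daily.shape
          obs_var = daily.mean(axis=1).var(ddof=1)      # observed interannual variance of seasonal means
          pooled = daily.ravel()
          null = np.empty(n_boot)
          for b in range(n_boot):
              fake = rng.choice(pooled, size=(years, days), replace=True)
              null[b] = fake.mean(axis=1).var(ddof=1)
          return np.mean(null >= obs_var)               # small p -> potentially predictable signal

      # Hypothetical record: 30 seasons x 90 days with a weak slowly varying signal
      data = rng.normal(0.0, 0.5, (30, 1)) + rng.normal(0.0, 2.0, (30, 90))
      print("p =", potential_predictability_pvalue(data))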

  15. Membrane Capacitive Memory Alters Spiking in Neurons Described by the Fractional-Order Hodgkin-Huxley Model

    PubMed Central

    Weinberg, Seth H.

    2015-01-01

    Excitable cells and cell membranes are often modeled by the simple yet elegant parallel resistor-capacitor circuit. However, studies have shown that the passive properties of membranes may be more appropriately modeled with a non-ideal capacitor, in which the current-voltage relationship is given by a fractional-order derivative. Fractional-order membrane potential dynamics introduce capacitive memory effects, i.e., dynamics are influenced by a weighted sum of the membrane potential prior history. However, it is not clear to what extent fractional-order dynamics may alter the properties of active excitable cells. In this study, we investigate the spiking properties of the neuronal membrane patch, nerve axon, and neural networks described by the fractional-order Hodgkin-Huxley neuron model. We find that in the membrane patch model, as fractional-order decreases, i.e., a greater influence of membrane potential memory, peak sodium and potassium currents are altered, and spike frequency and amplitude are generally reduced. In the nerve axon, the velocity of spike propagation increases as fractional-order decreases, while in a neural network, electrical activity is more likely to cease for smaller fractional-order. Importantly, we demonstrate that the modulation of the peak ionic currents that occurs for reduced fractional-order alone fails to reproduce many of the key alterations in spiking properties, suggesting that membrane capacitive memory and fractional-order membrane potential dynamics are important and necessary to reproduce neuronal electrical activity. PMID:25970534
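
    For readers unfamiliar with how "a weighted sum of the membrane potential prior history" enters a fractional-order model, the sketch below (an illustration under stated assumptions, not the paper's code) prints the Grünwald-Letnikov weights that a discretized fractional derivative of order alpha places on past samples: for alpha = 1 only the previous step matters, while for alpha < 1 the weights decay slowly and the whole voltage history contributes.

      import numpy as np

      def gl_weights(alpha, n):
          """Grunwald-Letnikov coefficients w_k = (-1)**k * binom(alpha, k).

          A discretized fractional derivative of order alpha is approximately
          h**(-alpha) * sum_k w_k * v[t - k*h], so w_k is the weight the present
          dynamics place on the value k steps in the past.
          """
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
          return w

      for alpha in (1.0, 0.8, 0.5):
          print(alpha, np.round(gl_weights(alpha, 8), 4))
      # alpha = 1.0 -> [1, -1, 0, 0, ...]: only the previous sample matters (no memory).
      # alpha < 1.0 -> slowly decaying weights: the full history influences the dynamics.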

  16. Identification of cephalopod species from the North and Baltic Seas using morphology, COI and 18S rDNA sequences

    NASA Astrophysics Data System (ADS)

    Gebhardt, Katharina; Knebelsberger, Thomas

    2015-09-01

    We morphologically analyzed 79 cephalopod specimens from the North and Baltic Seas belonging to 13 separate species. Another 29 specimens showed morphological features of either Alloteuthis media or Alloteuthis subulata or were found to be in between. Reliable identification features to distinguish between A. media and A. subulata are currently not available. The analysis of the DNA barcoding region of the COI gene revealed intraspecific distances (uncorrected p) ranging from 0 to 2.13 % (average 0.1 %) and interspecific distances between 3.31 and 22 % (average 15.52 %). All species formed monophyletic clusters in a neighbor-joining analysis and were supported by bootstrap values of ≥99 %. All COI haplotypes belonging to the 29 Alloteuthis specimens were grouped in one cluster. Neither COI nor 18S rDNA sequences helped to distinguish between the different Alloteuthis morphotypes. For species identification purposes, we recommend the use of COI, as it showed higher bootstrap support of species clusters and less amplification and sequencing failure compared to 18S. Our data strongly support the assumption that the genus Alloteuthis is only represented by a single species, at least in the North Sea. It remained unclear whether this species is A. subulata or A. media. All COI sequences including important metadata were uploaded to the Barcode of Life Data Systems and can be used as reference library for the molecular identification of more than 50 % of the cephalopod fauna known from the North and Baltic Seas.

  17. Scoring system to guide decision making for the use of gentamicin-impregnated collagen sponge to prevent deep sternal wound infection.

    PubMed

    Benedetto, Umberto; Raja, Shahzad G

    2014-11-01

    The effectiveness of the routine retrosternal placement of a gentamicin-impregnated collagen sponge (GICS) implant before sternotomy closure is currently a matter of some controversy. We aimed to develop a scoring system to guide decision making for the use of GICS to prevent deep sternal wound infection. Fast backward elimination on predictors, including GICS, was performed using the Lawless and Singhal method. The scoring system was reported as a partial nomogram that can be used to manually obtain predicted individual risk of deep sternal wound infection from the regression model. Bootstrapping validation of the regression models was performed. The final populations consisted of 8750 adult patients undergoing cardiac surgery through full sternotomy during the study period. A total of 329 patients (3.8%) received GICS implant. The overall incidence of deep sternal wound infection was lower among patients who received GICS implant (0.6%) than patients who did not (2.01%) (P=.02). A nomogram to predict the individual risk for deep sternal wound infection was developed that included the use of GICS. Bootstrapping validation confirmed a good discriminative power of the models. The scoring system provides an impartial assessment of the decision-making process for clinicians to establish if GICS implant is effective in reducing the risk for deep sternal wound infection in individual patients undergoing cardiac surgery through full sternotomy. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  18. Mapping and Assessing Variability in the Antarctic Marginal Ice Zone, the Pack Ice and Coastal Polynyas

    NASA Astrophysics Data System (ADS)

    Stroeve, Julienne; Jenouvrier, Stephanie

    2016-04-01

    Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent as well as their seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the contribution of different ice types to the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent data record for assessing different ice types. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the NASA Team algorithm has on average twice the MIZ and half the consolidated pack ice area as the Bootstrap algorithm. Polynya area is also larger in the NASA Team algorithm, and the timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.

  19. The use of fractional order derivatives for eddy current non-destructive testing

    NASA Astrophysics Data System (ADS)

    Sikora, Ryszard; Grzywacz, Bogdan; Chady, Tomasz

    2018-04-01

    The paper presents the possibility of using the fractional derivatives for non-destructive testing when a multi-frequency method based on eddy current is applied. It is shown that frequency characteristics obtained during tests can be approximated by characteristics of a proposed model in the form of fractional order transfer function, and values of parameters of this model can be utilized for detection and identification of defects.

  20. Memory in a fractional-order cardiomyocyte model alters properties of alternans and spontaneous activity

    NASA Astrophysics Data System (ADS)

    Comlekoglu, T.; Weinberg, S. H.

    2017-09-01

    Cardiac memory is the dependence of electrical activity on the prior history of one or more system state variables, including transmembrane potential (Vm), ionic current gating, and ion concentrations. While prior work has represented memory either phenomenologically or with biophysical detail, in this study, we consider an intermediate approach of a minimal three-variable cardiomyocyte model, modified with fractional-order dynamics, i.e., a differential equation of order between 0 and 1, to account for history-dependence. Memory is represented via both capacitive memory, due to fractional-order Vm dynamics, that arises due to non-ideal behavior of membrane capacitance; and ionic current gating memory, due to fractional-order gating variable dynamics, that arises due to gating history-dependence. We perform simulations for varying Vm and gating variable fractional-orders and pacing cycle length and measure action potential duration (APD) and incidence of alternans, loss of capture, and spontaneous activity. In the absence of ionic current gating memory, we find that capacitive memory, i.e., decreased Vm fractional-order, typically shortens APD, suppresses alternans, and decreases the minimum cycle length (MCL) for loss of capture. However, in the presence of ionic current gating memory, capacitive memory can prolong APD, promote alternans, and increase MCL. Further, we find that reduced Vm fractional order (typically less than 0.75) can drive phase 4 depolarizations that promote spontaneous activity. Collectively, our results demonstrate that memory reproduced by a fractional-order model can play a role in alternans formation and pacemaking, and in general, can greatly increase the range of electrophysiological characteristics exhibited by a minimal model.

  1. Patient-reported urinary incontinence after radiotherapy for prostate cancer: Quantifying the dose-effect.

    PubMed

    Cozzarini, Cesare; Rancati, Tiziana; Palorini, Federica; Avuzzi, Barbara; Garibaldi, Elisabetta; Balestrini, Damiano; Cante, Domenico; Munoz, Fernando; Franco, Pierfrancesco; Girelli, Giuseppe; Sini, Carla; Vavassori, Vittorio; Valdagni, Riccardo; Fiorino, Claudio

    2017-10-01

    Urinary incontinence following radiotherapy (RT) for prostate cancer (PCa) has a relevant impact on patients' quality of life. The aim of the study was to assess the unknown dose-effect relationship for late patient-reported urinary incontinence (LPRUI). Patients were enrolled within the multi-centric study DUE01. Clinical and dosimetry data including the prescribed 2 Gy equivalent dose (EQD2) were prospectively collected. LPRUI was evaluated through the ICIQ-SF questionnaire filled in by the patients at the start and end of RT and thereafter every 6 months. Patients were treated with conventional (74-80 Gy, 1.8-2 Gy/fr) or moderately hypo-fractionated RT (65-75.2 Gy, 2.2-2.7 Gy/fr) in 5 fractions/week with intensity-modulated radiotherapy. Six different end-points for 3-year LPRUI, with or without inclusion of the patient's perception (subjective and objective end-points, respectively), were considered. Multivariable logistic models were developed for each end-point. Data of 298 patients were analyzed. The incidence of the most severe end-point (ICIQ-SF>12) was 5.1%. EQD2 calculated with alpha/beta = 0.8 Gy showed the best performance in fitting the data: the risk of LPRUI markedly increased for EQD2>80 Gy. Previous abdominal/pelvic surgery and previous TURP were the clinical factors most significantly predictive of LPRUI. Models showed excellent performances in terms of goodness-of-fit and calibration, confirmed by bootstrap-based internal validation. When included in the analyses, baseline symptoms were a major predictor for five out of six end-points. LPRUI after RT for PCa dramatically depends on EQD2 and a few clinical factors. Results are consistent with a larger than expected impact of moderate hypo-fractionation on the risk of LPRUI. As expected, baseline symptoms, as captured by ICIQ-SF, are associated with an increased risk of LPRUI. Copyright © 2017 Elsevier B.V. All rights reserved.
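
    For orientation, EQD2 in studies of this kind is normally computed from the linear-quadratic model; the sketch below uses the alpha/beta value of 0.8 Gy reported to fit the data best and shows how a moderately hypofractionated schedule consistent with the dose ranges above can exceed the roughly 80 Gy risk threshold.

      def eqd2(total_dose_gy: float, dose_per_fraction_gy: float, alpha_beta_gy: float = 0.8) -> float:
          """2-Gy equivalent dose from the linear-quadratic model.

          EQD2 = D * (d + alpha/beta) / (2 + alpha/beta); alpha/beta = 0.8 Gy is the
          value reported to give the best fit for LPRUI.
          """
          return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

      print(round(eqd2(74.0, 2.0), 1))   # 74.0 Gy: 2 Gy/fraction is its own reference
      print(round(eqd2(70.2, 2.7), 1))   # 87.8 Gy: a 2.7 Gy/fraction schedule exceeding the ~80 Gy threshold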

  2. Patient satisfaction after pulmonary resection for lung cancer: a multicenter comparative analysis.

    PubMed

    Pompili, Cecilia; Brunelli, Alessandro; Rocco, Gaetano; Salvi, Rosario; Xiumé, Francesco; La Rocca, Antonello; Sabbatini, Armando; Martucci, Nicola

    2013-01-01

    Patient satisfaction reflects the perception of the customer about the level of quality of care received during the episode of hospitalization. To compare the levels of satisfaction of patients submitted to lung resection in two different thoracic surgical units. Prospective analysis of 280 consecutive patients submitted to pulmonary resection for neoplastic disease in two centers (center A: 139 patients; center B: 141 patients; 2009-2010). Patients' satisfaction was assessed at discharge through the EORTC-InPatSat32 module, a 32-item, multi-scale self-administered anonymous questionnaire. Each scale (ranging from 0 to 100 in score) was compared between the two units. Multivariable regression and bootstrap were used to verify factors associated with the patients' general satisfaction (dependent variable). Patients from unit B reported a higher general satisfaction (91.5 vs. 88.3, p = 0.04), mainly due to a significantly higher satisfaction in the doctor-related scales (doctors' technical skill: p = 0.001; doctors' interpersonal skill: p = 0.008; doctors' availability: p = 0.005, and doctors information provision: p = 0.0006). Multivariable regression analysis and bootstrap confirmed that level of care in unit B (p = 0.006, bootstrap frequency 60%) along with lower level of education of the patient population (p = 0.02, bootstrap frequency 62%) were independent factors associated with a higher general patient satisfaction. We were able to show a different level of patient satisfaction in patients operated on in two different thoracic surgery units. A reduced level of patient satisfaction may trigger changes in the management policy of individual units in order to meet patients' expectations and improve organizational efficiency. Copyright © 2012 S. Karger AG, Basel.

  3. Phylogenetic relationships of the dwarf boas and a comparison of Bayesian and bootstrap measures of phylogenetic support.

    PubMed

    Wilcox, Thomas P; Zwickl, Derrick J; Heath, Tracy A; Hillis, David M

    2002-11-01

    Four New World genera of dwarf boas (Exiliboa, Trachyboa, Tropidophis, and Ungaliophis) have been placed by many systematists in a single group (traditionally called Tropidophiidae). However, the monophyly of this group has been questioned in several studies. Moreover, the overall relationships among basal snake lineages, including the placement of the dwarf boas, are poorly understood. We obtained mtDNA sequence data for 12S, 16S, and intervening tRNA-val genes from 23 species of snakes representing most major snake lineages, including all four genera of New World dwarf boas. We then examined the phylogenetic position of these species by estimating the phylogeny of the basal snakes. Our phylogenetic analysis suggests that New World dwarf boas are not monophyletic. Instead, we find Exiliboa and Ungaliophis to be most closely related to sand boas (Erycinae), boas (Boinae), and advanced snakes (Caenophidea), whereas Tropidophis and Trachyboa form an independent clade that separated relatively early in snake radiation. Our estimate of snake phylogeny differs significantly in other ways from some previous estimates of snake phylogeny. For instance, pythons do not cluster with boas and sand boas, but instead show a strong relationship with Loxocemus and Xenopeltis. Additionally, uropeltids cluster strongly with Cylindrophis, and together are embedded in what has previously been considered the macrostomatan radiation. These relationships are supported by both bootstrapping (parametric and nonparametric approaches) and Bayesian analysis, although Bayesian support values are consistently higher than those obtained from nonparametric bootstrapping. Simulations show that Bayesian support values represent much better estimates of phylogenetic accuracy than do nonparametric bootstrap support values, at least under the conditions of our study. Copyright 2002 Elsevier Science (USA)

  4. Explanation of Two Anomalous Results in Statistical Mediation Analysis.

    PubMed

    Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P

    2012-01-01

    Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M , a , increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y , b , was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a . Implications of these findings are discussed.
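
    For reference, the bias-corrected (BC) bootstrap interval discussed here differs from the plain percentile interval only through a bias-correction factor z0 that shifts the percentiles used; a minimal sketch (illustrative values, not the simulation code of the study) is given below.

      import numpy as np
      from scipy.stats import norm

      def bc_bootstrap_ci(theta_hat, theta_boot, alpha=0.05):
          """Bias-corrected (BC) bootstrap interval for a mediated effect a*b.

          theta_hat: estimate on the original sample; theta_boot: bootstrap replicates.
          """
          z0 = norm.ppf(np.mean(theta_boot < theta_hat))   # bias-correction factor
          lo = norm.cdf(2 * z0 + norm.ppf(alpha / 2))      # shifted lower percentile
          hi = norm.cdf(2 * z0 + norm.ppf(1 - alpha / 2))  # shifted upper percentile
          return np.quantile(theta_boot, [lo, hi])

      rng = np.random.default_rng(5)
      replicates = rng.normal(0.12, 0.05, 5000)            # hypothetical replicates of a*b
      print(bc_bootstrap_ci(0.12, replicates))             # compare with the plain 2.5/97.5 percentiles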

  5. Sample size determination for mediation analysis of longitudinal data.

    PubMed

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been published for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method have superior performance to Sobel's method, but the distribution of the product method is recommended for use in practice because of its lower computational load compared with the bootstrapping method. An R package has been developed for sample size determination with the product method in longitudinal mediation study designs.
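
    Among the three tests compared, Sobel's method has the simplest closed form; a small sketch with hypothetical path estimates shows how it is computed (the normal-theory standard error of the product is what makes it conservative relative to the bootstrap and distribution-of-the-product approaches).

      import math

      def sobel_z(a: float, se_a: float, b: float, se_b: float) -> float:
          """Sobel's normal-theory test statistic for the indirect effect a*b."""
          return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

      # Hypothetical path estimates and standard errors from a multilevel mediation fit
      print(round(sobel_z(a=0.39, se_a=0.10, b=0.35, se_b=0.12), 2))  # 2.34 > 1.96, so significant at 5%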

  6. A Validated Prediction Model for Overall Survival From Stage III Non-Small Cell Lung Cancer: Toward Survival Prediction for Individual Patients.

    PubMed

    Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe

    2015-07-15

    Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Investigation of the plasma shaping effects on the H-mode pedestal structure using coupled kinetic neoclassical/MHD stability simulations

    NASA Astrophysics Data System (ADS)

    Pankin, A. Y.; Rafiq, T.; Kritz, A. H.; Park, G. Y.; Snyder, P. B.; Chang, C. S.

    2017-06-01

    The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. The neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] is used in carrying out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. This provides a new mechanism of controlling the pedestal bootstrap current and the pedestal stability.

  8. Investigation of the plasma shaping effects on the H-mode pedestal structure using coupled kinetic neoclassical/MHD stability simulations

    DOE PAGES

    Pankin, A. Y.; Rafiq, T.; Kritz, A. H.; ...

    2017-06-08

    The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. We use the neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] to carry out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. This provides a new mechanism of controlling the pedestal bootstrap current and the pedestal stability.

  9. Investigation of the plasma shaping effects on the H-mode pedestal structure using coupled kinetic neoclassical/MHD stability simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankin, A. Y.; Rafiq, T.; Kritz, A. H.

    The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. We use the neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] to carry out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. This provides a new mechanism of controlling the pedestal bootstrap current and the pedestal stability.

  10. Using Satellite-derived Ice Concentration to Represent Antarctic Coastal Polynyas in Ocean Climate Models

    NASA Technical Reports Server (NTRS)

    Stoessel, Achim; Markus, Thorsten

    2003-01-01

    The focus of this paper is on the representation of Antarctic coastal polynyas in global ice-ocean general circulation models (OGCMs), in particular their local, regional, and high-frequency behavior. This is verified with the aid of daily ice concentration derived from satellite passive microwave data using the NASA Team 2 (NT2) and the Bootstrap (BS) algorithms. Large systematic regional and temporal discrepancies arise, some of which are related to the type of convection parameterization used in the model. An attempt is made to improve the fresh-water flux associated with melting and freezing in Antarctic coastal polynyas by ingesting (assimilating) satellite ice concentration when it comes to determining the thermodynamics of the open-water fraction of a model grid cell. Since the NT2 coastal open-water fraction (polynyas) tends to be less extensive than the simulated one in the decisive season and region, assimilating NT2 coastal ice concentration yields overall reduced net freezing rates, smaller formation rates of Antarctic Bottom Water, and a stronger southward flow of North Atlantic Deep Water across 30°S. Enhanced net freezing rates occur regionally when NT2 coastal ice concentration is assimilated, concomitant with a more realistic ice thickness distribution and accumulation of High-Salinity Shelf Water. Assimilating BS rather than NT2 coastal ice concentration, the differences from the non-assimilated simulation are generally smaller and of opposite sign. This suggests that the model reproduces coastal ice concentration in closer agreement with the BS data than with the NT2 data, while more realistic features emerge when NT2 data are assimilated.

  11. Human salmonellosis: estimation of dose-illness from outbreak data.

    PubMed

    Bollaerts, Kaatje; Aerts, Marc; Faes, Christel; Grijspeerdt, Koen; Dewulf, Jeroen; Mintiens, Koen

    2008-04-01

    The quantification of the relationship between the amount of microbial organisms ingested and a specific outcome such as infection, illness, or mortality is a key aspect of quantitative risk assessment. A main problem in determining such dose-response models is the availability of appropriate data. Human feeding trials have been criticized because only young healthy volunteers are selected to participate and low doses, as often occurring in real life, are typically not considered. Epidemiological outbreak data are considered to be more valuable, but are more subject to data uncertainty. In this article, we model the dose-illness relationship based on data of 20 Salmonella outbreaks, as discussed by the World Health Organization. In particular, we model the dose-illness relationship using generalized linear mixed models and fractional polynomials of dose. The fractional polynomial models are modified to satisfy the properties of different types of dose-illness models as proposed by Teunis et al. Within these models, differences in host susceptibility (susceptible versus normal population) are modeled as fixed effects whereas differences in serovar type and food matrix are modeled as random effects. In addition, two bootstrap procedures are presented. A first procedure accounts for stochastic variability whereas a second procedure accounts for both stochastic variability and data uncertainty. The analyses indicate that the susceptible population has a higher probability of illness at low dose levels when the combination pathogen-food matrix is extremely virulent and at high dose levels when the combination is less virulent. Furthermore, the analyses suggest that immunity exists in the normal population but not in the susceptible population.

  12. Analytical model for describing ion guiding through capillaries in insulating polymers

    NASA Astrophysics Data System (ADS)

    Liu, Shi-Dong; Zhao, Yong-Tao; Wang, Yu-Yu; Stolterfoht, N.; Cheng, Rui; Zhou, Xian-Ming; Xu, Hu-Shan; Xiao, Guo-Qing

    2015-08-01

    An analytical description for guiding of ions through nanocapillaries is given on the basis of previous work. The current entering the capillary is assumed to be divided into a current fraction transmitted through the capillary, a current fraction flowing away via the capillary conductivity and a current fraction remaining within the capillary, which is responsible for its charge-up. The discharging current is assumed to be governed by the Frenkel-Poole process. At higher conductivities the analytical model shows a blocking of the ion transmission, which is in agreement with recent simulations. Also, it is shown that ion blocking observed in experiments is well reproduced by the analytical formula. Furthermore, the asymptotic fraction of transmitted ions is determined. Apart from the key controlling parameter (charge-to-energy ratio), the ratio of the capillary conductivity to the incident current is included in the model. Project supported by the Major State Basic Research Development Program of China (Grant No. 2010CB832902) and the National Natural Science Foundation of China (Grant Nos. 11275241, 11275238, 11105192, and 11375034).
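
    The three-way current split described above lends itself to a small numerical sketch. The script below integrates a toy charge balance with an assumed saturation law for the transmitted current and a Frenkel-Poole-like leakage term; all functional forms and parameter values are illustrative assumptions, not the authors' analytical formula.

      # Toy charge-balance sketch for capillary charge-up (illustrative only).
      import numpy as np

      def simulate(I_in=1.0, sigma0=0.01, beta=1.5, q_sat=1.0, dt=0.01, t_max=50.0):
          """Integrate dQ/dt = I_in - I_trans - I_leak for the deposited charge Q.
          I_trans rises with Q toward I_in (guiding sets in once the entrance region
          is charged); I_leak uses a Frenkel-Poole-like field-enhanced conductivity."""
          t = np.arange(0.0, t_max, dt)
          Q = np.zeros_like(t)
          f_trans = np.zeros_like(t)
          for k in range(1, t.size):
              q = Q[k - 1]
              i_trans = I_in * (1.0 - np.exp(-q / q_sat))       # assumed saturation law
              i_leak = sigma0 * q * np.exp(beta * np.sqrt(q))   # Frenkel-Poole-like term
              Q[k] = q + dt * (I_in - i_trans - i_leak)
              f_trans[k] = i_trans / I_in
          return t, Q, f_trans

      for sigma0 in (0.01, 0.5):   # low vs high capillary conductivity
          t, Q, f = simulate(sigma0=sigma0)
          print(f"sigma0={sigma0}: asymptotic transmitted fraction ~ {f[-1]:.2f}")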

  13. Why do workaholics experience depression? A study with Chinese University teachers.

    PubMed

    Nie, Yingzhi; Sun, Haitao

    2016-10-01

    This study focuses on the relationships of workaholism to job burnout and depression of university teachers. The direct and indirect (via job burnout) effects of workaholism on depression were investigated in 412 Chinese university teachers. Structural equation modeling and the bootstrap method were used. Results revealed that workaholism, job burnout, and depression significantly correlated with each other. Structural equation modeling and the bootstrap test indicated the partial mediation role of job burnout in the relationship between workaholism and depression. The findings shed some light on how workaholism influences depression and provide valuable evidence for the prevention of depression at work. © The Author(s) 2015.
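
    For readers unfamiliar with the bootstrap test of an indirect effect, the sketch below illustrates the idea on simulated data: fit the two regressions, form the product of the a and b paths, and build a percentile confidence interval by resampling cases with replacement. Variable names, effect sizes, and the plain-regression setup are illustrative assumptions, not the study's data or its exact SEM-based procedure.

      # Percentile-bootstrap test of an indirect (mediation) effect a*b on simulated data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 412                                    # sample size reported in the abstract
      work = rng.normal(size=n)                  # workaholism (X), simulated
      burnout = 0.5 * work + rng.normal(size=n)  # job burnout (M), simulated
      depress = 0.3 * burnout + 0.2 * work + rng.normal(size=n)  # depression (Y)

      def indirect_effect(x, m, y):
          """a*b from the two OLS regressions M ~ X and Y ~ X + M."""
          a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
          b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
          return a * b

      boot = np.empty(5000)
      for i in range(boot.size):
          idx = rng.integers(0, n, n)            # resample cases with replacement
          boot[i] = indirect_effect(work[idx], burnout[idx], depress[idx])

      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"indirect effect = {indirect_effect(work, burnout, depress):.3f}, "
            f"95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")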

  14. Governance and performance: the performance of Dutch hospitals explained by governance characteristics.

    PubMed

    Blank, Jos L T; van Hulst, Bart Laurents

    2011-10-01

    This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on explaining hospitals' cost inefficiency in terms of corporate governance characteristics. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that higher remuneration of the board, as well as higher remuneration of the supervisory board, does not imply better performance.

  15. Construction of prediction intervals for Palmer Drought Severity Index using bootstrap

    NASA Astrophysics Data System (ADS)

    Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan

    2018-04-01

    In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-month) and mid-term (six-month) drought observations. The effects of the North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained for the Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
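
    A minimal residual-bootstrap sketch of this kind of prediction interval, assuming a simple AR(1) model fitted to a synthetic monthly index series (the model order, data, and horizon are illustrative assumptions, not the paper's PDSI setup):

      # Residual-bootstrap prediction intervals for an AR(1) fit to a synthetic series.
      import numpy as np

      rng = np.random.default_rng(1)
      y = np.cumsum(rng.normal(size=240)) * 0.1          # stand-in monthly index series

      # Fit AR(1): y_t = c + phi * y_{t-1} + e_t (ordinary least squares)
      X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
      c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
      resid = y[1:] - (c + phi * y[:-1])

      h, B = 6, 2000                                     # six-month horizon, bootstrap reps
      paths = np.empty((B, h))
      for b in range(B):
          level = y[-1]
          for t in range(h):
              level = c + phi * level + rng.choice(resid)   # resample residuals
              paths[b, t] = level

      lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)
      for t in range(h):
          print(f"t+{t+1}: 95% prediction interval = [{lower[t]:.2f}, {upper[t]:.2f}]")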

  16. Differentiation of Trypanosoma cruzi I subgroups through characterization of cytochrome b gene sequences.

    PubMed

    Spotorno O, Angel E; Córdova, Luis; Solari I, Aldo

    2008-12-01

    To identify and characterize Chilean samples of Trypanosoma cruzi and their association with hosts, the first 516 bp of the mitochondrial cytochrome b gene were sequenced from eight biological samples and phylogenetically compared with 20 other known American sequences. The molecular characterization of these 28 sequences in a maximum likelihood phylogram (-lnL = 1255.12, tree length = 180, consistency index = 0.79) allowed the robust identification (bootstrap % > 99) of three previously known discrete typing units (DTU): DTU IIb, IIa, and I. An apparently undescribed new sequence found in four new Chilean samples was detected and designated DTU Ib; these sequences were separated by 24.7 differences, but robustly related (bootstrap % = 97 in 500 replicates) to those of DTU I by sharing 12 substitutions, four of which were nonsynonymous. This new DTU Ib was also robust (bootstrap % = 100), and characterized by 10 unambiguous substitutions, with a single nonsynonymous G to T change at site 409. The fact that two of these new sequences were found in parasites from a Chilean endemic caviomorph rodent, Octodon degus, and that they were closely related to the ancient DTU I suggests old origins and a long association with caviomorph hosts.

  17. A safe and easy method for building consensus HIV sequences from 454 massively parallel sequencing data.

    PubMed

    Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico

    2018-02-01

    To show how to generate, from the massively parallel sequencing data obtained in routine HIV antiretroviral resistance studies, a consensus sequence that may be suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GSJunior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, using 10%, 15%, and 20% thresholds. Molecular evolutionary genetics analysis (MEGA) was used for phylogenetic studies. At a 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap value of 88% (IQR 83.5-95.5). Association increased to 36/62 sequences, with a median bootstrap value of 94% (IQR 85.5-98), using a 15% threshold. Maximum association was at the 20% threshold, with 61/62 sequences associated and a median bootstrap value of 99% (IQR 98-100). A safe method is presented to generate consensus sequences from HIV NGS data at a 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
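
    A hedged sketch of threshold-based consensus calling from per-position base counts, one plausible way to implement the 20% rule described above (the IUPAC handling and the toy counts are illustrative; this is not the Mesquite workflow used in the study):

      # Threshold-based consensus with IUPAC ambiguity codes for minority variants.
      from collections import Counter

      IUPAC = {frozenset("AG"): "R", frozenset("CT"): "Y", frozenset("AC"): "M",
               frozenset("GT"): "K", frozenset("CG"): "S", frozenset("AT"): "W"}

      def consensus(base_counts, threshold=0.20):
          """base_counts: one dict {base: read count} per alignment position."""
          seq = []
          for counts in base_counts:
              total = sum(counts.values())
              kept = {b for b, n in counts.items() if n / total >= threshold}
              if len(kept) == 1:
                  seq.append(next(iter(kept)))
              else:
                  seq.append(IUPAC.get(frozenset(kept), "N"))   # ambiguity code
          return "".join(seq)

      # Example: three positions, the second with a 25% minority variant.
      positions = [Counter(A=95, G=5), Counter(C=75, T=25), Counter(G=100)]
      print(consensus(positions, threshold=0.20))   # -> "AYG"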

  18. The relationship between the number of loci and the statistical support for the topology of UPGMA trees obtained from genetic distance data.

    PubMed

    Highton, R

    1993-12-01

    An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets. These are Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. The statistical support for the topology at each node of the UPGMA trees was determined by both the bootstrap and jackknife methods as the number of loci was increased. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes that have groupings that reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
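
    The loci-resampling idea can be illustrated in a few lines: resample loci with replacement, rebuild the UPGMA tree from the averaged distances, and count how often a grouping of interest is recovered. The per-locus distance matrices below are simulated stand-ins for allozyme genetic distances, not the published data sets.

      # Bootstrap support for a UPGMA grouping by resampling loci with replacement.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, to_tree
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(2)
      n_taxa, n_loci = 6, 30
      base = np.ones((n_taxa, n_taxa)) - np.eye(n_taxa)   # all pairs distant ...
      base[0, 1] = base[1, 0] = 0.3                       # ... except taxa 0 and 1
      per_locus = np.array([base + rng.normal(0, 0.15, base.shape) for _ in range(n_loci)])
      per_locus = np.abs((per_locus + per_locus.transpose(0, 2, 1)) / 2)
      for d in per_locus:
          np.fill_diagonal(d, 0.0)

      def clades(dist):
          """Leaf sets of all internal nodes of the UPGMA tree for this matrix."""
          tree = to_tree(linkage(squareform(dist, checks=False), method="average"))
          out = []
          def walk(node):
              leaves = frozenset(node.pre_order(lambda leaf: leaf.id))
              if len(leaves) > 1:
                  out.append(leaves)
                  walk(node.left)
                  walk(node.right)
          walk(tree)
          return out

      target, hits, B = frozenset({0, 1}), 0, 500
      for _ in range(B):
          sample = per_locus[rng.integers(0, n_loci, n_loci)]   # resample loci
          if target in clades(sample.mean(axis=0)):
              hits += 1
      print(f"bootstrap support for grouping taxa 0+1: {100 * hits / B:.1f}%")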

  19. Comparison of mode estimation methods and application in molecular clock analysis

    NASA Technical Reports Server (NTRS)

    Hedges, S. Blair; Shah, Prachi

    2003-01-01

    BACKGROUND: Distributions of time estimates in molecular clock studies are sometimes skewed or contain outliers. In those cases, the mode is a better estimator of the overall time of divergence than the mean or median. However, different methods are available for estimating the mode. We compared these methods in simulations to determine their strengths and weaknesses and further assessed their performance when applied to real data sets from a molecular clock study. RESULTS: We found that the half-range mode and robust parametric mode methods have a lower bias than other mode methods under a diversity of conditions. However, the half-range mode suffers from a relatively high variance and the robust parametric mode is more susceptible to bias by outliers. We determined that bootstrapping reduces the variance of both mode estimators. Application of the different methods to real data sets yielded results that were concordant with the simulations. CONCLUSION: Because the half-range mode is a simple and fast method, and produced less bias overall in our simulations, we recommend the bootstrapped version of it as a general-purpose mode estimator and suggest a bootstrap method for obtaining the standard error and 95% confidence interval of the mode.
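
    A sketch of a half-range-style mode estimator with a bootstrap standard error and percentile interval, in the spirit of the recommendation above (the windowing details and the simulated skewed "divergence times" are simplifying assumptions, not the authors' exact implementation):

      # Half-range-style mode with bootstrap standard error and 95% interval.
      import numpy as np

      def half_range_mode(x):
          """Repeatedly keep the window of width half the current range that
          contains the most points (windows anchored at the observations)."""
          x = np.sort(np.asarray(x, dtype=float))
          while x.size > 2:
              width = (x[-1] - x[0]) / 2.0
              if width == 0:
                  break
              counts = np.searchsorted(x, x + width, side="right") - np.arange(x.size)
              start = np.argmax(counts)
              x = x[start:start + counts[start]]
          return float(x.mean())

      rng = np.random.default_rng(3)
      times = rng.lognormal(mean=4.6, sigma=0.25, size=50)   # skewed "time estimates"

      B = 2000
      boot = np.array([half_range_mode(rng.choice(times, times.size)) for _ in range(B)])
      print(f"mode ~ {half_range_mode(times):.1f}, bootstrap SE = {boot.std(ddof=1):.1f}, "
            f"95% CI = [{np.percentile(boot, 2.5):.1f}, {np.percentile(boot, 97.5):.1f}]")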

  20. Bootstrap percolation on spatial networks

    NASA Astrophysics Data System (ADS)

    Gao, Jian; Zhou, Tao; Hu, Yanqing

    2015-10-01

    Bootstrap percolation is a general representation of a networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on the spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links’ lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, mixed of a second-order phase transition and a hybrid phase transition with two varying critical points; otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value -1 is equal to or very close to the values found for many real online social networks, including LiveJournal, the HP Labs email network, the Belgian mobile phone network, etc. This work helps us better understand the self-organization of the spatial structure of online social networks in terms of its effectiveness for information spreading.
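
    The activation process itself is easy to state in code. The sketch below runs k-threshold bootstrap percolation on a random geometric graph and reports the giant active component; the graph model, seed fraction, and threshold are illustrative and do not reproduce the paper's tunable long-range-link construction.

      # k-threshold bootstrap percolation on a spatial (random geometric) graph.
      import random
      import networkx as nx

      def bootstrap_percolation(G, seeds, k=2):
          """Repeatedly activate any inactive node with at least k active neighbours."""
          active = set(seeds)
          changed = True
          while changed:
              changed = False
              for node in G.nodes:
                  if node not in active and sum(nb in active for nb in G[node]) >= k:
                      active.add(node)
                      changed = True
          giant = max(nx.connected_components(G.subgraph(active)), key=len)
          return active, giant

      rng = random.Random(4)
      G = nx.random_geometric_graph(2000, radius=0.05, seed=4)
      seeds = rng.sample(list(G.nodes), 200)          # 10% of nodes seeded as active
      active, giant = bootstrap_percolation(G, seeds, k=2)
      print(f"{len(active)} active nodes; giant active component has {len(giant)} nodes")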

  1. A resampling strategy based on bootstrap to reduce the effect of large blunders in GPS absolute positioning

    NASA Astrophysics Data System (ADS)

    Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore

    2018-01-01

    In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and the accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both a good redundancy and a low contamination to be effective. In this paper a resampling strategy based on bootstrap is proposed as an alternative to RAIM, in order to estimate position accurately in cases of low redundancy and multiple blunders: starting with the pseudorange measurement model, at each epoch the available measurements are bootstrapped, that is, randomly sampled with replacement, and the generated a posteriori empirical distribution is exploited to derive the final position. Compared to standard bootstrap, in this paper the sampling probabilities are not uniform, but vary according to an indicator of the measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
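
    A toy sketch of the weighted-bootstrap idea at a single epoch, using a 2-D trilateration stand-in for the pseudorange model (geometry, noise levels, quality weights, and the injected blunder are invented for illustration): measurements are resampled with probabilities tied to their assumed quality, a position is solved for each resample, and the centre of the resulting empirical distribution is taken as the final estimate.

      # Quality-weighted bootstrap of range measurements for robust positioning.
      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(5)
      sats = np.array([[0, 20], [15, 18], [-12, 17], [8, 22], [-5, 25]], float)  # 2-D beacons
      truth = np.array([1.0, 2.0])
      sigma = np.array([0.3, 0.3, 0.3, 5.0, 0.3])          # one low-quality channel
      ranges = np.linalg.norm(sats - truth, axis=1) + rng.normal(0, sigma)
      ranges[3] += 30.0                                    # large blunder on that channel

      def solve(idx):
          residual = lambda p: np.linalg.norm(sats[idx] - p, axis=1) - ranges[idx]
          return least_squares(residual, x0=np.zeros(2)).x

      prob = (1.0 / sigma**2) / np.sum(1.0 / sigma**2)     # quality-based sampling weights
      B = 500
      est = np.array([solve(rng.choice(len(sats), len(sats), p=prob)) for _ in range(B)])
      print("bootstrap position estimate:", np.round(np.median(est, axis=0), 2),
            " true position:", truth)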

  2. Reanalysis of cancer mortality in Japanese A-bomb survivors exposed to low doses of radiation: bootstrap and simulation methods

    PubMed Central

    2009-01-01

    Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and Bias-corrected accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to Confidence Intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence Intervals for ERR are comparable using Bootstrap and Likelihood Ratio Test methods and BCa 95% Confidence Intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and Likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0 - 20 mSv and 5 - 500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
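
    As an aside for readers unfamiliar with BCa intervals, the snippet below builds a BCa bootstrap confidence interval for a simple excess relative risk using SciPy's bootstrap routine. The simulated death indicators, the group size, and the treatment of the reference rate as a fixed constant are all illustrative assumptions; this is not the latency-adjusted Poisson-regression ERR of the study.

      # BCa bootstrap confidence interval for a simple excess relative risk.
      import numpy as np
      from scipy.stats import bootstrap

      rng = np.random.default_rng(6)
      deaths = rng.binomial(1, 0.012, size=4000)     # simulated death indicators, exposed group
      baseline_rate = 0.010                          # assumed fixed reference mortality rate

      def excess_relative_risk(d):
          return d.mean() / baseline_rate - 1.0

      res = bootstrap((deaths,), excess_relative_risk,
                      method="BCa", vectorized=False, n_resamples=2000)
      print("point estimate of ERR:", round(excess_relative_risk(deaths), 3))
      print("95% BCa confidence interval:", res.confidence_interval)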

  3. Emotion dysregulation as a mediator between childhood emotional abuse and current depression in a low-income African-American sample.

    PubMed

    Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh

    2014-10-01

    Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Emotion dysregulation as a mediator between childhood emotional abuse and current depression in a low-income African-American sample☆

    PubMed Central

    Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh

    2014-01-01

    Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. PMID:25035171

  5. Suppressing magnetic island growth by resonant magnetic perturbation

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Günter, S.; Lackner, K.

    2018-05-01

    The effect of externally applied resonant magnetic perturbations (RMPs) on the growth of magnetic islands is investigated based on two-fluid equations. It is found that if the local bi-normal electron fluid velocity at the resonant surface is sufficiently large, static RMPs of the same helicity and of moderate amplitude can suppress the growth of magnetic islands in high-temperature plasmas. These islands will otherwise grow, driven by an unfavorable plasma current density profile and bootstrap current perturbation. These results indicate that the error field can stabilize island growth, if the error field amplitude is not too large and the local bi-normal electron fluid velocity is not too low. They also indicate that applied rotating RMPs with an appropriate frequency can be utilized to suppress island growth in high-temperature plasmas, even for a low bi-normal electron fluid velocity. A significant change in the local equilibrium plasma current density gradient by small amplitude RMPs is found for realistic plasma parameters, which are important for the island stability and are expected to be more important for fusion reactors with low plasma resistivity.

  6. Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)

    NASA Astrophysics Data System (ADS)

    Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.

    2017-10-01

    Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking real-time Thomson Scattering (TS) and real-time Charge Exchange Recombination (CER, still in development) data into account, in addition to real-time Motional Stark Effect (MSE) and magnetics data. Electron density and temperature are determined from TS, while ion density and pressure are determined using CER. Together with the temperature and density of neutrals, these form the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostics. The pedestal current density is estimated using Sauter's formula for the bootstrap current density. By comparing the behaviour of the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') from CAKE with those from magnetics-only reconstruction, it can be seen that the use of these diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.

  7. Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas

    NASA Astrophysics Data System (ADS)

    Callen, J. D.; Hegna, C. C.; Beidler, M. T.

    2017-10-01

    Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.
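
    A reduced modified-Rutherford-equation sketch can convey the competition between the drives mentioned above. The form and the coefficients below are a commonly quoted simplified version (classical Δ' drive, bootstrap drive with a small-island cutoff, and a stabilising polarization term), with made-up normalised values; it is not the comprehensive equation developed in this work.

      # Reduced modified Rutherford equation: dw/dt ~ Δ' + bootstrap drive - polarization term.
      def dw_dt(w, delta_prime=-1.5, c_bs=4.0, w_d=0.5, c_pol=0.5):
          """Normalised island-width growth rate: classical tearing drive delta_prime,
          bootstrap-current drive with small-island cutoff w_d, and a stabilising
          polarization-current term."""
          return delta_prime + c_bs * w / (w**2 + w_d**2) - c_pol / w**3

      # Seeds below a critical width decay (toward the numerical floor); larger seeds
      # grow to a saturated width where dw/dt = 0.
      for w0 in (0.3, 0.8, 2.0):
          w, dt = w0, 1e-3
          for _ in range(20000):
              w = max(w + dt * dw_dt(w), 1e-3)      # 1e-3 acts as a small-island floor
          print(f"seed width {w0:4.1f}  ->  final width {w:6.3f}")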

  8. Improved Design of Stellarator Coils for Current Carrying Plasmas

    NASA Astrophysics Data System (ADS)

    Drevlak, M.; Strumberger, E.; Hirshman, S.; Boozer, A.; Brooks, A.; Valanju, P.

    1998-11-01

    The method of automatic optimization (P. Merkel, Nucl. Fusion 27 (1987) 867; P. Merkel, M. Drevlak, Proc. 25th EPS Conf. on Cont. Fus. and Plas. Phys., Prague, in print) for the design of stellarator coils consists essentially of determining filaments such that the average relative field error $\int \mathrm{d}S\, [(\mathbf{B}_{\mathrm{coil}} + \mathbf{B}_{j}) \cdot \mathbf{n}]^2 / B_{\mathrm{coil}}^2$ is minimized on the prescribed plasma boundary. $\mathbf{B}_{j}$ is the magnetic field produced by the plasma currents of the given finite β fixed boundary equilibrium. For equilibria of the W7-X type, $\mathbf{B}_{j}$ can be neglected because of the reduced parallel plasma currents. This is not true for quasi-axisymmetric stellarator (QAS) configurations (A. Reiman et al., to be published) with large equilibrium and net plasma (bootstrap) currents. Although the coils for QAS exhibit low values of the field error, free boundary calculations indicate that the shape of the plasma is usually not accurately reproduced, particularly when saddle coils are used. We investigate whether the surface reconstruction can be improved by introducing a modified measure of the field error based on a measure of the resonant components of the normal field.

  9. Current/Pressure Profile Effects on Tearing Mode Stability in DIII-D Hybrid Discharges

    NASA Astrophysics Data System (ADS)

    Kim, K.; Park, J. M.; Murakami, M.; La Haye, R. J.; Na, Yong-Su

    2015-11-01

    It is important to understand the onset threshold and the evolution of tearing modes (TMs) for developing a high-performance steady state fusion reactor. As initial and basic comparisons to determine TM onset, the measured plasma profiles (such as temperature, density, rotation) were compared with the calculated current profiles between a pair of discharges with/without an n=1 mode, based on the database of DIII-D hybrid plasmas. The profiles were not much different, but the details were analyzed to determine their characteristics, especially near the rational surface. The tearing stability index Δ' calculated from PEST3 tends to increase rapidly just before the n=1 mode onset in these cases. Modeled equilibria with parametrically varied pressure or current profiles, based on the reference discharge, are reconstructed to check how the onset depends on Δ' or on neoclassical effects such as the bootstrap current. Simulations of TMs with the modeled equilibria using resistive MHD codes will also be presented and compared with experiments to assess the sensitivity for predicting TM onset. Work supported by US DOE under DE-FC02-04ER54698 and DE-AC52-07NA27344.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaing, K. C.; Peng, Yueng Kay Martin

    Transport theory for potato orbits in the region near the magnetic axis in an axisymmetric torus such as tokamaks and spherical tori is extended to the situation where the toroidal flow speed is of the order of the sonic speed, as observed in the National Spherical Torus Experiment [E. J. Synakowski, M. G. Bell, R. E. Bell et al., Nucl. Fusion 43, 1653 (2003)]. It is found that transport fluxes such as the ion radial heat flux and the bootstrap current density are modified by a factor of the order of the square of the toroidal Mach number. The consequences of the orbit squeezing are also presented. The theory is developed for parabolic (in radius r) plasma profiles. A method to apply the results of the theory for transport modeling is discussed.

  11. Attachment and alcohol use amongst athletes: the mediating role of conscientiousness and alexithymia.

    PubMed

    Andres, Fanny; Castanier, Carole; Le Scanff, Christine

    2014-02-01

    The present study aims to explore the mediating effects of conscientiousness and alexithymia in the relationship between parental attachment style and alcohol use in a large sample of athletic young people. Participants included 434 French sport sciences students. Alcohol use, parental attachment style, conscientiousness and alexithymia were assessed. The hypotheses were tested by using regression and bootstrapping mediation analyses. Maternal insecure attachment style is positively associated with alcohol use. The current study highlights a multiple pathway in this relationship. The results reveal the mediating effect of low conscientiousness and alexithymia between maternal insecure attachment and alcohol use. Athletes' alcohol use seems to be the result of a complex association of underlying psychological factors. © 2013.

  12. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Tolson, Bryan

    2017-04-01

    The increasing complexity and runtime of environmental models mean that calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. At most, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independence of the convergence test, we applied it to three widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al. 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010), and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence test is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indexes of the aforementioned three methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on the model independence by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations and therefore enables checking of already processed (and published) sensitivity results. This is one step towards reliable, transferable, published sensitivity results.

  13. Interaction between fractional Josephson vortices in multi-gap superconductor tunnel junctions

    NASA Astrophysics Data System (ADS)

    Kim, Ju H.

    In a long Josephson junction (LJJ) with two-band superconductors, fractionalization of Josephson vortices (fluxons) can occur in the broken time reversal symmetry state when spatial phase textures (i-solitons) are excited. Excitation of i-solitons in each superconductor layer of the junction, arising due to the presence of two condensates and the interband Josephson effect, leads to spatial variation of the critical current density between the superconductor layers. Similar to the situation in a YBa2Cu3O7-x superconductor film grain boundary, this spatial dependence of the critical current density can self-generate magnetic flux in the insulator layer, resulting in fractional fluxons carrying large and small fractions of the flux quantum. Similar to fluxons in one-band superconductor LJJs, these fractional fluxons are found to interact with each other. The interaction between large and small fractional fluxons determines the size of a fluxon which includes two (one large and one small) fractional fluxons. We discuss the nature of the interaction between fractional fluxons and suggest that i-soliton excitations in multi-gap superconductor LJJs may be probed by using magnetic flux measurements.

  14. Evaluation of dynamic row-action maximum likelihood algorithm reconstruction for quantitative 15O brain PET.

    PubMed

    Ibaraki, Masanobu; Sato, Kaoru; Mizuta, Tetsuro; Kitamura, Keishi; Miura, Shuichi; Sugawara, Shigeki; Shinohara, Yuki; Kinoshita, Toshibumi

    2009-09-01

    A modified version of row-action maximum likelihood algorithm (RAMLA) using a 'subset-dependent' relaxation parameter for noise suppression, or dynamic RAMLA (DRAMA), has been proposed. The aim of this study was to assess the capability of DRAMA reconstruction for quantitative (15)O brain positron emission tomography (PET). Seventeen healthy volunteers were studied using a 3D PET scanner. The PET study included 3 sequential PET scans for C(15)O, (15)O(2) and H2(15)O. First, the number of main iterations (N(it)) in DRAMA was optimized in relation to image convergence and statistical image noise. To estimate the statistical variance of reconstructed images on a pixel-by-pixel basis, a sinogram bootstrap method was applied using list-mode PET data. Once the optimal N(it) was determined, statistical image noise and quantitative parameters, i.e., cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolic rate of oxygen (CMRO(2)) and oxygen extraction fraction (OEF) were compared between DRAMA and conventional FBP. DRAMA images were post-filtered so that their spatial resolutions were matched with FBP images with a 6-mm FWHM Gaussian filter. Based on the count recovery data, N(it) = 3 was determined as an optimal parameter for (15)O PET data. The sinogram bootstrap analysis revealed that DRAMA reconstruction resulted in less statistical noise, especially in a low-activity region compared to FBP. Agreement of quantitative values between FBP and DRAMA was excellent. For DRAMA images, average gray matter values of CBF, CBV, CMRO(2) and OEF were 46.1 +/- 4.5 (mL/100 mL/min), 3.35 +/- 0.40 (mL/100 mL), 3.42 +/- 0.35 (mL/100 mL/min) and 42.1 +/- 3.8 (%), respectively. These values were comparable to corresponding values with FBP images: 46.6 +/- 4.6 (mL/100 mL/min), 3.34 +/- 0.39 (mL/100 mL), 3.48 +/- 0.34 (mL/100 mL/min) and 42.4 +/- 3.8 (%), respectively. DRAMA reconstruction is applicable to quantitative (15)O PET study and is superior to conventional FBP in terms of image quality.

  15. ClonEvol: clonal ordering and visualization in cancer sequencing.

    PubMed

    Dang, H X; White, B S; Foltz, S M; Miller, C A; Luo, J; Fields, R C; Maher, C A

    2017-12-01

    Reconstruction of clonal evolution is critical for understanding tumor progression and implementing personalized therapies. This is often done by clustering somatic variants based on their cellular prevalence estimated via bulk tumor sequencing of multiple samples. The clusters, consisting of the clonal marker variants, are then ordered based on their estimated cellular prevalence to reconstruct clonal evolution trees, a process referred to as 'clonal ordering'. However, cellular prevalence estimates are confounded by statistical variability and errors in sequencing/data analysis, which inhibits accurate reconstruction of the clonal evolution. This problem is further complicated by intra- and inter-tumor heterogeneity. Furthermore, the field lacks a comprehensive visualization tool to facilitate the interpretation of complex clonal relationships. To address these challenges, we developed ClonEvol, a unified software tool for clonal ordering, visualization, and interpretation. ClonEvol uses a bootstrap resampling technique to estimate the cellular fraction of the clones and probabilistically models the clonal ordering constraints to account for statistical variability. The bootstrapping allows identification of the sample founding clones and subclones, thus enabling interpretation of clonal seeding. ClonEvol automates the generation of multiple widely used visualizations for reconstructing and interpreting clonal evolution. ClonEvol outperformed three state-of-the-art tools (LICHeE, Canopy, and PhyloWGS) for clonal evolution inference, showing more robust error tolerance and producing more accurate trees in a simulation. Building upon multiple recent publications that utilized ClonEvol to study metastasis and drug resistance in solid cancers, here we show that ClonEvol rediscovered relapsed subclones in two published acute myeloid leukemia patients. Furthermore, we demonstrated that through noninvasive monitoring ClonEvol recapitulated the emerging subclones throughout metastatic progression observed in the tumors of a published breast cancer patient. ClonEvol has broad applicability for longitudinal monitoring of clonal populations in tumor biopsies, or noninvasively, to guide precision medicine. ClonEvol is written in R and is available at https://github.com/ChrisMaherLab/ClonEvol. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  16. Heating and current drive requirements towards steady state operation in ITER

    NASA Astrophysics Data System (ADS)

    Poli, F. M.; Bonoli, P. T.; Kessel, C. E.; Batchelor, D. B.; Gorelenkova, M.; Harvey, B.; Petrov, Y.

    2014-02-01

    Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H/CD sources that maintain weakly reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that, with a trade-off of the EC equatorial and upper launcher, the formation and sustainment of quasi-steady state ITBs could be demonstrated in ITER with the baseline heating configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current are below the ITER target. Upgrades of the heating and current drive system in ITER, like the use of Lower Hybrid current drive, could overcome these limitations, sustaining higher non-inductive current and confinement, more expanded ITBs which are ideal MHD stable.

  17. The role of sleep problems in the relationship between peer victimization and antisocial behavior: A five-year longitudinal study.

    PubMed

    Chang, Ling-Yin; Wu, Wen-Chi; Wu, Chi-Chen; Lin, Linen Nymphas; Yen, Lee-Lan; Chang, Hsing-Yi

    2017-01-01

    Peer victimization in children and adolescents is a serious public health concern. Growing evidence exists for negative consequences of peer victimization, but research has mostly been short term and little is known about the mechanisms that moderate and mediate the impacts of peer victimization on subsequent antisocial behavior. The current study intended to examine the longitudinal relationship between peer victimization in adolescence and antisocial behavior in young adulthood and to determine whether sleep problems influence this relationship. In total, 2006 adolescents participated in a prospective study from 2009 to 2013. The moderating role of sleep problems was examined by testing the significance of the interaction between peer victimization and sleep problems. The mediating role of sleep problems was tested by using bootstrapping mediational analyses. All analyses were conducted using SAS 9.3 software. We found that peer victimization during adolescence was positively and significantly associated with antisocial behavior in young adulthood (β = 0.10, p < 0.0001). This association was mediated, but not moderated by sleep problems. Specifically, peer victimization first increased levels of sleep problems, which in turn elevated the risk of antisocial behavior (indirect effect: 0.01, 95% bootstrap confidence interval: 0.004, 0.021). These findings imply that sleep problems may operate as a potential mechanism through which peer victimization during adolescence leads to increases in antisocial behavior in young adulthood. Prevention and intervention programs that target sleep problems may yield benefits for decreasing antisocial behavior in adolescents who have been victimized by peers. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. mBEEF-vdW: Robust fitting of error estimation density functionals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes

    Here, we propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.

  19. mBEEF-vdW: Robust fitting of error estimation density functionals

    DOE PAGES

    Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes; ...

    2016-06-15

    Here, we propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.

  20. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    PubMed

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    The Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both one- and two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. In addition, a paired-sample test on the Youden index is currently unavailable. This article develops efficient statistical inference procedures for one-sample, independent-sample, and paired-sample tests on the Youden index by accounting for contingency correlation, namely the associations between sensitivity and specificity and between paired samples typically represented in contingency tables. For the one- and two-independent-sample tests, the variances are estimated by the delta method, and the statistical inference is based on central limit theory; the results are then verified by bootstrap estimates. For the paired-sample test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than the original Youden approach. Therefore, the simple explicit large-sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
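
    For orientation, the snippet below computes the Youden index from a hypothetical 2x2 table, together with a large-sample standard error for the independent-group case and a bootstrap check; it does not reproduce the paper's correlation-adjusted paired-sample test.

      # Youden index J = Se + Sp - 1 with a large-sample SE and a bootstrap check.
      import numpy as np

      rng = np.random.default_rng(7)
      tp, fn, tn, fp = 80, 20, 180, 20          # hypothetical diagnostic 2x2 counts
      n_pos, n_neg = tp + fn, tn + fp

      se, sp = tp / n_pos, tn / n_neg
      J = se + sp - 1
      var_J = se * (1 - se) / n_pos + sp * (1 - sp) / n_neg   # independent groups
      print(f"J = {J:.3f}, analytic SE = {np.sqrt(var_J):.3f}")

      # Bootstrap check: resample the diseased and non-diseased groups separately.
      pos = np.r_[np.ones(tp), np.zeros(fn)]    # 1 = test positive among diseased
      neg = np.r_[np.ones(tn), np.zeros(fp)]    # 1 = test negative among non-diseased
      boot = [pos[rng.integers(0, n_pos, n_pos)].mean()
              + neg[rng.integers(0, n_neg, n_neg)].mean() - 1
              for _ in range(5000)]
      print(f"bootstrap SE = {np.std(boot, ddof=1):.3f}")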

  1. Understanding London's Water Supply Tradeoffs When Scheduling Interventions Under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2015-12-01

    Water supply planning in many major world cities faces several challenges associated with, but not limited to, climate change, population growth and insufficient land availability for infrastructure development. Long-term plans to maintain supply-demand balance and ecosystem services require careful consideration of uncertainties associated with future conditions. The current approach for London's water supply planning utilizes least cost optimization of future intervention schedules with limited uncertainty consideration. Recently, the focus of the long-term plans has shifted from solely least cost performance to robustness and resilience of the system. Identifying robust scheduling of interventions requires optimizing over a statistically representative sample of stochastic inputs, which may be computationally difficult to achieve. In this study we optimize schedules using an ensemble of plausible scenarios and assess how manipulating that ensemble influences the different Pareto-approximate intervention schedules. We investigate how a major stress event's location in time as well as the optimization problem formulation influence the Pareto-approximate schedules. A bootstrapping method that respects the non-stationary trend of climate change scenarios and ensures the even distribution of the major stress event in the scenario ensemble is proposed. Different bootstrapped hydrological scenario ensembles are assessed using many-objective scenario optimization of London's future water supply and demand intervention scheduling. However, such a "fixed" scheduling of interventions approach does not aim to embed flexibility or adapt effectively as the future unfolds. Alternatively, making decisions based on observations of conditions as they occur could help planners who prefer adaptive planning. We will show how rules to guide the implementation of interventions based on observations may result in more flexible strategies.

  2. BATEMANATER: a computer program to estimate and bootstrap mating system variables based on Bateman's principles.

    PubMed

    Jones, Adam G

    2015-11-01

    Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
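
    A plain-Python sketch of the three Bateman variables with percentile-bootstrap confidence intervals, computed on made-up mating and reproductive success counts (BATEMANATER's maximum-likelihood handling of incompletely sampled populations is not reproduced here):

      # Bateman variables (I_s, I, beta_ss) with percentile-bootstrap confidence intervals.
      import numpy as np

      rng = np.random.default_rng(8)
      mates = rng.poisson(1.5, size=60)                    # mating success per individual
      offspring = rng.poisson(3 * mates + 0.5)             # reproductive success

      def bateman(ms, rs):
          rel_ms, rel_rs = ms / ms.mean(), rs / rs.mean()
          I_s = rel_ms.var(ddof=1)                         # opportunity for sexual selection
          I = rel_rs.var(ddof=1)                           # opportunity for selection
          beta_ss = np.polyfit(rel_ms, rel_rs, 1)[0]       # Bateman gradient (LS slope)
          return I_s, I, beta_ss

      est = bateman(mates, offspring)
      boot = []
      for _ in range(2000):
          idx = rng.integers(0, mates.size, mates.size)    # resample individuals
          boot.append(bateman(mates[idx], offspring[idx]))
      boot = np.array(boot)
      for name, value, column in zip(("I_s", "I", "beta_ss"), est, boot.T):
          lo, hi = np.percentile(column, [2.5, 97.5])
          print(f"{name}: {value:.2f}  (95% CI {lo:.2f} to {hi:.2f})")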

  3. HSX as an example of a resilient non-resonant divertor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, A.; Boozer, A. H.; Hegna, C. C.

    This study provides an initial description of the resilient divertor properties of quasi-symmetric (QS) stellarators using the HSX (Helically Symmetric eXperiment) configuration as a test-case. For divertor design, divertors in high-performance QS stellarators will need to be resilient to changes in plasma configuration that arise due to the evolution of plasma pressure profiles and bootstrap currents. Resiliency is tested by examining the changes in strike point patterns, obtained from field-line following, which arise due to configurational changes. A low strike point variation with high configuration changes corresponds to high resiliency. The HSX edge displays resilient properties with configuration changes arising from (1) the wall position, (2) the plasma current, and (3) the external coils. The resilient behavior is lost if large edge islands intersect the wall structure. The resilient edge properties are corroborated by heat flux calculations from the fully 3-D plasma simulations using EMC3-EIRENE. Additionally, the strike point patterns are found to correspond to high curvature regions of magnetic flux surfaces.

  4. HSX as an example of a resilient non-resonant divertor

    DOE PAGES

    Bader, A.; Boozer, A. H.; Hegna, C. C.; ...

    2017-03-16

    This study provides an initial description of the resilient divertor properties of quasi-symmetric (QS) stellarators using the HSX (Helically Symmetric eXperiment) configuration as a test-case. For divertor design, divertors in high-performance QS stellarators will need to be resilient to changes in plasma configuration that arise due to the evolution of plasma pressure profiles and bootstrap currents. Resiliency is tested by examining the changes in strike point patterns, obtained from field-line following, which arise due to configurational changes. A low strike point variation with high configuration changes corresponds to high resiliency. The HSX edge displays resilient properties with configuration changes arising from (1) the wall position, (2) the plasma current, and (3) the external coils. The resilient behavior is lost if large edge islands intersect the wall structure. The resilient edge properties are corroborated by heat flux calculations from the fully 3-D plasma simulations using EMC3-EIRENE. Additionally, the strike point patterns are found to correspond to high curvature regions of magnetic flux surfaces.

  5. A high-efficiency low-voltage CMOS rectifier for harvesting energy in implantable devices.

    PubMed

    Hashemi, S Saeid; Sawan, Mohamad; Savaria, Yvon

    2012-08-01

    We present, in this paper, a new full-wave CMOS rectifier dedicated to wirelessly-powered low-voltage biomedical implants. It uses bootstrapped capacitors to reduce the effective threshold voltage of selected MOS switches. It achieves a significant increase in overall power efficiency and a low voltage drop, making it well suited to applications with low-voltage power supplies and large load currents. The rectifier topology does not require complex circuit design. The highest voltages available in the circuit are used to drive the gates of selected transistors in order to reduce leakage current and to lower their channel on-resistance, while having high transconductance. The proposed rectifier was fabricated using the standard TSMC 0.18 μm CMOS process. When connected to a sinusoidal source of 3.3 V peak amplitude, it improves the overall power efficiency by 11% compared with the best recently published results, obtained with a gate cross-coupled structure.

  6. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.

  7. SOCIAL COMPETENCE AND PSYCHOLOGICAL VULNERABILITY: THE MEDIATING ROLE OF FLOURISHING.

    PubMed

    Uysal, Recep

    2015-10-01

    This study examined whether flourishing mediated the relationship between social competence and psychological vulnerability. Participants were 259 university students (147 women, 112 men; M age = 21.3 yr., SD = 1.7) who completed the Turkish versions of the Perceived Social Competence Scale, the Flourishing Scale, and the Psychological Vulnerability Scale. Mediation models were tested using the bootstrapping method to examine indirect effects. Consistent with the hypotheses, the results indicated a positive relationship between social competence and flourishing, and a negative relationship between social competence and psychological vulnerability. Results of the bootstrapping method revealed that flourishing significantly mediated the relationship between social competence and psychological vulnerability. The significance and limitations of the results are discussed.

  8. Transformational leadership in the consumer service workgroup: competing models of job satisfaction, change commitment, and cooperative conflict resolution.

    PubMed

    Yang, Yi-Feng

    2014-02-01

    This paper discusses the effects of transformational leadership on cooperative conflict resolution (management) by evaluating several alternative models related to the mediating role of job satisfaction and change commitment. Samples of data from customer service personnel in Taiwan were analyzed. Based on the bootstrap sample technique, an empirical study was carried out to yield the best fitting model. The procedure of hierarchical nested model analysis was used, incorporating the methods of bootstrapping mediation, PRODCLIN2, and structural equation modeling (SEM) comparison. The analysis suggests that leadership that promotes integration (change commitment) and provides inspiration and motivation (job satisfaction), in the proper order, creates the means for cooperative conflict resolution.

  9. Examining Competing Models of Transformational Leadership, Leadership Trust, Change Commitment, and Job Satisfaction.

    PubMed

    Yang, Yi-Feng

    2016-08-01

    This study discusses the influence of transformational leadership on job satisfaction through assessing six alternative models related to the mediators of leadership trust and change commitment, utilizing a data sample (N = 341; M age = 32.5 yr., SD = 5.2) of service promotion personnel in Taiwan. The bootstrap sampling technique was used to select the better-fitting model. The tool of hierarchical nested model analysis was applied, along with the approaches of bootstrapping mediation, PRODCLIN2, and structural equation modeling comparison. The results overall demonstrate that leadership is important and that leadership role identification (trust) and workgroup cohesiveness (commitment) form an ordered serial relationship. © The Author(s) 2016.

  10. Modeling and character analyzing of current-controlled memristors with fractional kinetic transport

    NASA Astrophysics Data System (ADS)

    Si, Gangquan; Diao, Lijie; Zhu, Jianwei; Lei, Yuhang; Babajide, Oresanya; Zhang, Yanbin

    2017-07-01

    Memristors have come into the limelight again after they were realized by HP researchers. This paper proposes a more general and comprehensive memristor model, which can be called a fractional-order current-controlled memristor. We introduce the fractional integral/derivative into the current-controlled memristor model and model the memristor with fractional kinetics of charge transport. An interesting phenomenon is that the I-V characteristic becomes a triple-loop curve for 0 < α < 1 rather than the conventional double-loop I-V curve at α = 1. The memristance (RM) is analyzed versus the fractional order α and time t, and it reaches saturation faster when 0 < α < 1. The saturation (Rmin → Rmax) time is given and analyzed versus different orders α and frequencies ω; it increases as α increases and as ω decreases. More importantly, the memristor cannot reach Rmax in some cases. The energy loss of the model is analyzed, and the I-P curve is not origin-symmetric when 0 < α < 1, which is very different from the curves at α = 1.
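
    The fractional charge-transport idea can be made concrete with a worked equation. One common formulation (stated here as an assumption; the paper's exact definition may differ) replaces the ordinary charge integral of a current-controlled memristor with a Riemann-Liouville fractional integral of order α:

    $$ q_\alpha(t) = \frac{1}{\Gamma(\alpha)} \int_0^{t} (t-\tau)^{\alpha-1}\, i(\tau)\, d\tau, \qquad v(t) = R_M\!\big(q_\alpha(t)\big)\, i(t), $$

    so that α = 1 recovers the usual charge q(t) = ∫ i dτ and hence the conventional double-loop pinched hysteresis.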

  11. Method for analyzing E x B probe spectra from Hall thruster plumes.

    PubMed

    Shastry, Rohit; Hofer, Richard R; Reid, Bryan M; Gallimore, Alec D

    2009-06-01

    Various methods for accurately determining ion species' current fractions using E × B probes in Hall thruster plumes are investigated. The effects of peak broadening and charge exchange on the calculated values of current fractions are quantified in order to determine the importance of accounting for them in the analysis. It is shown that both peak broadening and charge exchange have a significant effect on the calculated current fractions over a variety of operating conditions, especially at operating pressures exceeding 10⁻⁵ torr. However, these effects can be accounted for using a simple approximation for the velocity distribution function and a one-dimensional charge exchange correction model. In order to keep plume attenuation from charge exchange below 30%, it is recommended that pz ≤ 2, where p is the measured facility pressure in units of 10⁻⁵ torr and z is the distance from the thruster exit plane to the probe inlet in meters. The spatial variation of the current fractions in the plume of a Hall thruster and the error induced from taking a single-point measurement are also briefly discussed.
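
    A simple way to picture the current-fraction calculation (a sketch only, with Gaussian peak shapes and a synthetic two-species spectrum as assumptions; the paper's peak-broadening and charge-exchange corrections are omitted) is to fit each charge-state peak and normalize the fitted peak areas:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(v, amp, v0, sigma):
    return amp * np.exp(-0.5 * ((v - v0) / sigma) ** 2)

def current_fractions(voltage, current, peak_guesses):
    """Fit one Gaussian per charge-state peak and normalize the fitted areas.

    peak_guesses: list of (amplitude, center, width) initial guesses, one per species.
    Returns the fraction of the collected current carried by each species.
    """
    areas = []
    for amp0, v0, s0 in peak_guesses:
        # fit within a window around each peak (a crude separation of the species)
        win = (voltage > v0 - 3 * s0) & (voltage < v0 + 3 * s0)
        popt, _ = curve_fit(gaussian, voltage[win], current[win], p0=(amp0, v0, s0))
        areas.append(abs(popt[0] * popt[2]) * np.sqrt(2 * np.pi))  # Gaussian area
    areas = np.array(areas)
    return areas / areas.sum()

# purely synthetic two-peak spectrum standing in for singly and doubly charged ions
v = np.linspace(0, 10, 500)
spec = gaussian(v, 1.0, 4.0, 0.3) + gaussian(v, 0.3, 5.7, 0.3)
print(current_fractions(v, spec, [(1.0, 4.0, 0.3), (0.3, 5.7, 0.3)]))
```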

  12. [Molecular variability in the commom shrew Sorex araneus L. from European Russia and Siberia inferred from the length polymorphism of DNA regions flanked by short interspersed elements (Inter-SINE PCR) and the relationships between the Moscow and Seliger chromosome races].

    PubMed

    Bannikova, A A; Bulatova, N Sh; Kramerov, D A

    2006-06-01

    Genetic exchange among chromosomal races of the common shrew Sorex araneus and the problem of reproductive barriers have been extensively studied by means of such molecular markers as mtDNA, microsatellites, and allozymes. In the present study, the interpopulation and interracial polymorphism in the common shrew was assessed using fingerprints generated from amplified DNA regions flanked by short interspersed repeats (SINEs), i.e., inter-SINE PCR (IS-PCR). We used primers complementary to consensus sequences of two short retroposons: the mammalian element MIR and the SOR element from the genome of Sorex araneus. Genetic differentiation among eleven populations of the common shrew from eight chromosome races was estimated. The NJ and MP analyses, as well as multidimensional scaling, showed that all samples examined grouped into two main clusters, corresponding to European Russia and Siberia. The bootstrap support of the European Russia cluster in the NJ and MP analyses was 76 and 61%, respectively. The bootstrap index for the Siberian cluster was 100% in both analyses; the Tomsk race, included in this cluster, was separated with NJ/MP bootstrap support of 92/95%.

  13. Wavelet-based time series bootstrap model for multidecadal streamflow simulation using climate indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth

    2016-05-01

    A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
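
    The conditional resampling step can be sketched as follows (a minimal illustration assuming Euclidean distance in the (PDO, AMO) state space and the common 1/rank kernel weights; the wavelet reconstruction and the block bootstrap of the indices themselves are omitted):

```python
import numpy as np

rng = np.random.default_rng(1)

def knn_resample_flow(sim_index, hist_index, hist_flow, k=10):
    """For each simulated climate-index state, resample one historical flow
    from its k nearest historical neighbours, weighted by 1/rank."""
    weights = 1.0 / np.arange(1, k + 1)
    weights /= weights.sum()
    out = np.empty(len(sim_index))
    for t, state in enumerate(sim_index):
        d = np.linalg.norm(hist_index - state, axis=1)   # distance to each historical year
        nearest = np.argsort(d)[:k]                      # k nearest neighbours, closest first
        out[t] = hist_flow[rng.choice(nearest, p=weights)]
    return out

# illustrative synthetic data: 300 historical years of (PDO, AMO) states and annual flow
hist_index = rng.normal(size=(300, 2))
hist_flow = 15000 + 2000 * hist_index[:, 0] + rng.normal(scale=1500, size=300)
sim_index = rng.normal(size=(50, 2))                     # simulated climate-index trajectory
print(knn_resample_flow(sim_index, hist_index, hist_flow)[:5])
```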

  14. Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis

    NASA Astrophysics Data System (ADS)

    Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles

    2018-03-01

    We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained (i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood and (ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database, covering a large region of 100,000 km² in southern France with contrasting rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that (i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, (ii) the posterior and the bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and (iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Therefore our recommendation goes towards the use of the Bayesian framework to compute uncertainty.
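
    For context, the bootstrap branch of the frequentist framework can be sketched as follows (synthetic annual maxima and a plain GEV fit; the authors' simple-scaling IDF model and the Bayesian posterior are not reproduced here):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# synthetic annual maxima (e.g. mm/h), purely illustrative
maxima = genextreme.rvs(c=-0.1, loc=30, scale=10, size=40, random_state=rng)

def return_level(sample, T=100):
    """T-year return level from a maximum-likelihood GEV fit."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

# nonparametric bootstrap density of the 100-year return level
boot = np.array([return_level(rng.choice(maxima, size=len(maxima), replace=True))
                 for _ in range(500)])
print("point estimate:", return_level(maxima))
print("bootstrap 95% CI:", np.quantile(boot, [0.025, 0.975]))
```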

  15. A bootstrapping method for development of Treebank

    NASA Astrophysics Data System (ADS)

    Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.

    2017-01-01

    Using statistical approaches alongside the traditional methods of natural language processing could significantly improve both the quality and performance of several natural language processing (NLP) tasks. The effective usage of these approaches is subject to the availability of informative, accurate, and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and rich linguistically motivated elementary structure called a supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method could automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated from the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].

  16. Confidence intervals for correlations when data are not normal.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2017-02-01

    With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, the rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval, for example leading to a nominal 95% confidence interval with actual coverage as low as 68%. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoidance of the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in supplementary materials.
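
    The paper's supplementary code is in R; a rough Python analogue of two of the compared methods, the percentile bootstrap interval and the rank-based inverse normal (RIN) transformation (using one common variant of the rank-to-normal mapping, an assumption), is sketched below:

```python
import numpy as np
from scipy.stats import norm, pearsonr, rankdata

rng = np.random.default_rng(3)

def rin(x):
    """Rank-based inverse normal transform (one common variant)."""
    r = rankdata(x)
    return norm.ppf((r - 0.5) / len(x))

def boot_ci_corr(x, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI for the Pearson correlation."""
    n = len(x)
    stats = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        stats[i] = pearsonr(x[idx], y[idx])[0]
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# skewed, clearly non-normal example data
x = rng.lognormal(size=200)
y = 0.4 * x + rng.lognormal(size=200)
print("percentile bootstrap CI:", boot_ci_corr(x, y))
print("Pearson r on RIN-transformed data:", pearsonr(rin(x), rin(y))[0])
```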

  17. Using Landsat Surface Reflectance Data as a Reference Target for Multiswath Hyperspectral Data Collected Over Mixed Agricultural Rangeland Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra

    Low-cost flight-based hyperspectral imaging systems have the potential to provide important information for ecosystem and environmental studies as well as aid in land management. To realize this potential, methods must be developed to provide large-area surface reflectance data allowing for temporal data sets at the mesoscale. This paper describes a bootstrap method of producing a large-area, radiometrically referenced hyperspectral data set using the Landsat surface reflectance (LaSRC) data product as a reference target. The bootstrap method uses standard hyperspectral processing techniques that are extended to remove uneven illumination conditions between flight passes, allowing for radiometrically self-consistent data after mosaicking. Through selective spectral and spatial resampling, LaSRC data are used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from two hyperspectral flights over the same managed agricultural and unmanaged range land covering approximately 5.8 km², acquired on June 21, 2014 and June 24, 2015, are presented. In addition, data from a flight over agricultural land collected on June 6, 2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation.

  18. Using Landsat Surface Reflectance Data as a Reference Target for Multiswath Hyperspectral Data Collected Over Mixed Agricultural Rangeland Areas

    DOE PAGES

    McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra; ...

    2017-07-25

    Low-cost flight-based hyperspectral imaging systems have the potential to provide important information for ecosystem and environmental studies as well as aid in land management. To realize this potential, methods must be developed to provide large-area surface reflectance data allowing for temporal data sets at the mesoscale. This paper describes a bootstrap method of producing a large-area, radiometrically referenced hyperspectral data set using the Landsat surface reflectance (LaSRC) data product as a reference target. The bootstrap method uses standard hyperspectral processing techniques that are extended to remove uneven illumination conditions between flight passes, allowing for radiometrically self-consistent data after mosaicking. Through selective spectral and spatial resampling, LaSRC data are used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from two hyperspectral flights over the same managed agricultural and unmanaged range land covering approximately 5.8 km², acquired on June 21, 2014 and June 24, 2015, are presented. In addition, data from a flight over agricultural land collected on June 6, 2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation.

  19. Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.

    PubMed

    Lee, Sunbok; Lei, Man-Kit; Brody, Gene H

    2015-06-01

    Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of the crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes of more than 500 to be able to provide sufficiently narrow confidence intervals to identify the location of the crossover point. (c) 2015 APA, all rights reserved.
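
    For a two-group interaction model y = b0 + b1*x + b2*g + b3*x*g, the two simple regression lines cross at x = -b2/b3; one of the six compared methods, the percentile bootstrap, could be sketched as follows (synthetic data and illustrative coefficients, not the authors' simulation design):

```python
import numpy as np

rng = np.random.default_rng(4)

def crossover(x, g, y):
    """Crossover point -b2/b3 from OLS on y ~ 1 + x + g + x*g."""
    X = np.column_stack([np.ones_like(x), x, g, x * g])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return -b[2] / b[3]

def percentile_ci(x, g, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI for the crossover point."""
    n = len(x)
    est = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        est[i] = crossover(x[idx], g[idx], y[idx])
    return np.quantile(est, [alpha / 2, 1 - alpha / 2])

# synthetic disordinal interaction: the true lines cross at x = 1.0
n = 600
x = rng.normal(size=n)
g = rng.integers(0, 2, n).astype(float)
y = 0.5 * x + 0.4 * g - 0.4 * g * x + rng.normal(size=n)
print(crossover(x, g, y), percentile_ci(x, g, y))
```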

  20. A Pilot Investigation of the Relationship between Climate Variability and Milk Compounds under the Bootstrap Technique

    PubMed Central

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2015-01-01

    This study analyzes the linear relationship between climate variables and milk components in Iran by applying bootstrapping to include and assess the uncertainty. The climate parameters, the Temperature Humidity Index (THI) and the Equivalent Temperature Index (ETI), are computed from the NASA-Modern Era Retrospective-Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique. This method is applied to the original time series, generating statistically equivalent surrogate samples. As a result, despite the short data record and the related uncertainties, an interesting behavior of the relationships between milk compounds and the climate parameters is visible. During spring only, a weak dependency of milk yield on climate variations is evident, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, as well as on environment and food, when only short-term data are available. PMID:28231215

  1. Detecting temporal trends in species assemblages with bootstrapping procedures and hierarchical models

    USGS Publications Warehouse

    Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.

    2010-01-01

    Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
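
    One way such a null model could be simulated (a sketch only; the heterogeneity statistic below is an illustrative stand-in for the authors' test statistic, and undetected species are ignored) is to redistribute each species' total count across sampling periods in proportion to the period totals, i.e. the temporally varying sampling probabilities:

```python
import numpy as np

rng = np.random.default_rng(5)

def heterogeneity(counts):
    """Chi-square-type statistic for a species (rows) x period (columns) count matrix."""
    expected = np.outer(counts.sum(1), counts.sum(0)) / counts.sum()
    return np.sum((counts - expected) ** 2 / np.where(expected > 0, expected, 1))

def null_test(counts, n_sim=999):
    """Bootstrap p-value for temporal heterogeneity under the stationary null model:
    each species' total is redistributed over periods in proportion to period totals."""
    p_period = counts.sum(0) / counts.sum()
    obs = heterogeneity(counts)
    sims = np.array([
        heterogeneity(np.array([rng.multinomial(row.sum(), p_period) for row in counts]))
        for _ in range(n_sim)
    ])
    return (1 + np.sum(sims >= obs)) / (1 + n_sim)

# illustrative matrix: 5 species counted over 10 annual surveys with rising effort
counts = rng.poisson(lam=np.linspace(2, 12, 10), size=(5, 10))
print(null_test(counts))
```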

  2. Modality specificity and integration in working memory: Insights from visuospatial bootstrapping.

    PubMed

    Allen, Richard J; Havelka, Jelena; Falcon, Thomas; Evans, Sally; Darling, Stephen

    2015-05-01

    The question of how meaningful associations between verbal and spatial information might be utilized to facilitate working memory performance is potentially highly instructive for models of memory function. The present study explored how separable processing capacities within specialized domains might each contribute to this, by examining the disruptive impacts of simple verbal and spatial concurrent tasks on young adults' recall of visually presented digit sequences encountered either in a single location or within a meaningful spatial "keypad" configuration. The previously observed advantage for recall in the latter condition (the "visuospatial bootstrapping effect") consistently emerged across 3 experiments, indicating use of familiar spatial information in boosting verbal memory. The magnitude of this effect interacted with concurrent activity; articulatory suppression during encoding disrupted recall to a greater extent when digits were presented in single locations (Experiment 1), while spatial tapping during encoding had a larger impact on the keypad condition and abolished the visuospatial bootstrapping advantage (Experiment 2). When spatial tapping was performed during recall (Experiment 3), no task by display interaction was observed. Outcomes are discussed within the context of the multicomponent model of working memory, with a particular emphasis on cross-domain storage in the episodic buffer (Baddeley, 2000). (c) 2015 APA, all rights reserved.

  3. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    PubMed

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.

  4. Fractional Modeling of the AC Large-Signal Frequency Response in Magnetoresistive Current Sensors

    PubMed Central

    Arias, Sergio Iván Ravello; Muñoz, Diego Ramírez; Moreno, Jaime Sánchez; Cardoso, Susana; Ferreira, Ricardo; de Freitas, Paulo Jorge Peixeiro

    2013-01-01

    Fractional calculus is concerned with derivatives and integrals of non-integer order applied to a given function. In the electrical and electronic domain, the dependence of a fractional filter's transfer function not only on the filter order n but also on the fractional order α is one example among a great number of systems whose input-output behavior can be modeled more exactly by fractional dynamics. Following this aim, the present work shows the experimental ac large-signal frequency response of a family of electrical current sensors based on different spintronic conduction mechanisms. Using an ac characterization set-up, the sensor transimpedance function Zt(jf) is obtained, defined as the relationship between the sensor output voltage and the input sensing current, Zt(jf) = Vo,sensor(jf)/Isensor(jf). The study covers various magnetoresistance sensors based on different technologies such as anisotropic magnetoresistance (AMR), giant magnetoresistance (GMR), spin-valve (GMR-SV) and tunnel magnetoresistance (TMR). The resulting modeling shows two predominant behaviors, the low-pass and the inverse low-pass, with a fractional index different from the classical integer response. The TMR technology with internal magnetization offers the best dynamic and sensitivity properties, opening the way to actual industrial applications. PMID:24351648
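
    The fractional low-pass behavior referred to above is often written in a form such as the following (an illustrative parameterization, with an assumed dc transimpedance Z_0 and characteristic frequency f_c; the paper's fitted model may differ):

    $$ Z_t(jf) = \frac{V_{o,\mathrm{sensor}}(jf)}{I_{\mathrm{sensor}}(jf)} \approx \frac{Z_0}{1+\left(jf/f_c\right)^{\alpha}}, \qquad 0 < \alpha \le 1, $$

    which reduces to the classical first-order (integer) response at α = 1; the inverse low-pass case corresponds to the reciprocal form.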

  5. Heating and current drive requirements towards steady state operation in ITER

    NASA Astrophysics Data System (ADS)

    Poli, Francesca; Kessel, Charles; Bonoli, Paul; Batchelor, Donald; Harvey, Bob

    2013-10-01

    Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) to reach adequate fusion gain at typical currents of 9 MA. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of external sources that maintain weakly reversed shear profiles and ρ(qmin) ≥ 0.5 are the focus of this work. Simulations indicate that, with a trade-off of the EC equatorial and upper launcher, the formation and sustainment of ITBs could be demonstrated with the baseline configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current (6.2 MA) are below the target. Upgrades of the heating and current drive system, like the use of Lower Hybrid current drive, could overcome these limitations. With 30 MW of coupled LH in the flattop and operating at the Greenwald density, plasmas can sustain ~9 MA and achieve Q ~ 4. Work supported by the US Department of Energy under DE-AC02-CH0911466.

  6. Apparatus for measuring the local void fraction in a flowing liquid containing a gas

    DOEpatents

    Dunn, P.F.

    1979-07-17

    The local void fraction in liquid containing a gas is measured by placing an impedance-variation probe in the liquid, applying a controlled voltage or current to the probe, and measuring the probe current or voltage. A circuit for applying the one electrical parameter and measuring the other includes a feedback amplifier that minimizes the effect of probe capacitance and a digitizer to provide a clean signal. Time integration of the signal provides a measure of the void fraction, and an oscilloscope display also shows bubble size and distribution.

  7. Apparatus for measuring the local void fraction in a flowing liquid containing a gas

    DOEpatents

    Dunn, Patrick F.

    1981-01-01

    The local void fraction in liquid containing a gas is measured by placing an impedance-variation probe in the liquid, applying a controlled voltage or current to the probe, and measuring the probe current or voltage. A circuit for applying the one electrical parameter and measuring the other includes a feedback amplifier that minimizes the effect of probe capacitance and a digitizer to provide a clean signal. Time integration of the signal provides a measure of the void fraction, and an oscilloscope display also shows bubble size and distribution.

  8. Modification of aqueous enzymatic oil extraction to increase the yield of corn oil from dry fractionated corn germ

    USDA-ARS?s Scientific Manuscript database

    In previous aqueous enzymatic extraction experiments we reported an oil yield of 67 grams from 800 grams of dry fractionated corn germ. In the current experiments, a dispersion of 10% cooked, dry-fractionated germ in water was treated with amylases and a cellulase complex. A foam fraction was s...

  9. Phylogenetic Analysis of Prevalent Tuberculosis and Non-Tuberculosis Mycobacteria in Isfahan, Iran, Based on a 360 bp Sequence of the rpoB Gene

    PubMed Central

    Nasr Esfahani, Bahram; Moghim, Sharareh; Ghasemian Safaei, Hajieh; Moghoofei, Mohsen; Sedighi, Mansour; Hadifar, Shima

    2016-01-01

    Background: Taxonomic and phylogenetic studies of Mycobacterium species have been based around the 16S rRNA gene for many years. However, due to the high sequence similarity between species in the Mycobacterium genus (94.3% - 100%), defining a valid phylogenetic tree is difficult; consequently, its use in estimating the boundaries between species is limited. The sequence of the rpoB gene makes it an appropriate gene for phylogenetic analysis, especially in bacteria with limited variation. Objectives: In the present study, a 360 bp sequence of rpoB was used for precise classification of Mycobacterium strains isolated in Isfahan, Iran. Materials and Methods: From February to October 2013, 57 clinical and environmental isolates were collected, subcultured, and identified by phenotypic methods. After DNA extraction, a 360 bp fragment was PCR-amplified and sequenced. The phylogenetic tree was constructed based on consensus sequence data, using MEGA5 software. Results: Slow- and fast-growing groups of the Mycobacterium strains were clearly differentiated based on the constructed tree of 56 common Mycobacterium isolates. Each species was identified with a unique label in the tree; in total, 13 nodes with a bootstrap value of over 50% were supported. Among the slow-growing group, Mycobacterium kansasii grouped with M. tuberculosis in a cluster with a bootstrap value of 98%, and M. gordonae fell in another cluster with a bootstrap value of 90%. In the fast-growing group, one cluster with a bootstrap value of 89% was defined, including all fast-growing members present in this study. Conclusions: The results suggest that the rpoB gene sequence alone is sufficient for taxonomic categorization and definition of a new Mycobacterium species, due to its high resolution power and proper variation in its sequence (85% - 100%); the resulting tree has high validity. PMID:27284397

  10. Emergence of Joint Attention through Bootstrap Learning based on the Mechanisms of Visual Attention and Learning with Self-evaluation

    NASA Astrophysics Data System (ADS)

    Nagai, Yukie; Hosoda, Koh; Morita, Akio; Asada, Minoru

    This study examines, from the viewpoint of cognitive developmental robotics, how human infants acquire the ability of joint attention through interactions with their caregivers. In this paper, a mechanism by which a robot acquires sensorimotor coordination for joint attention through bootstrap learning is described. Bootstrap learning is a process by which a learner acquires higher capabilities through interactions with its environment, based on embedded lower capabilities, even if the learner receives no external evaluation and the environment is not controlled. The proposed mechanism for bootstrap learning of joint attention consists of two mechanisms embedded in the robot: visual attention and learning with self-evaluation. The former finds and attends to a salient object in the robot's field of view, and the latter evaluates the success of visual attention (not joint attention) and then learns the sensorimotor coordination. Since the object the robot looks at based on visual attention does not always correspond to the object the caregiver is looking at in an environment containing multiple objects, the robot encounters incorrect learning situations for joint attention as well as correct ones. However, the robot is expected to statistically discard the learning data from the incorrect situations as outliers, because their correlation between sensor input and motor output is weaker than that of the correct ones, and consequently to acquire appropriate sensorimotor coordination for joint attention even if the caregiver provides no task evaluation to the robot. The experimental results show the validity of the proposed mechanism. It is suggested that the proposed mechanism could explain the developmental mechanism of infants' joint attention, because the learning process of the robot's joint attention can be regarded as equivalent to the developmental process of infants' joint attention.

  11. Uncertainty quantification of CO₂ saturation estimated from electrical resistance tomography data at the Cranfield site

    DOE PAGES

    Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; ...

    2014-06-03

    A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data, and the resulting resistivity tomograph was used as the prior information for nonlinear inversion of time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. Then the mean and standard deviation of CO₂ saturation were calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6% with a corresponding maximum saturation of 30% for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation, but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data and on inversion constraints such as temporal roughness. Five hundred realizations requiring 3.5 h on a single 12-core node were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances, while the Markov Chain Monte Carlo (MCMC) stochastic inverse approach may take days for a global search. This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.
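
    The three-step loop can be sketched schematically as follows; `invert` is a hypothetical placeholder for the nonlinear ERT inversion (not the code used at Cranfield), and only the 3% baseline perturbation follows the text:

```python
import numpy as np

rng = np.random.default_rng(6)

def parametric_bootstrap(data, data_std, baseline, invert, n_real=500):
    """Resample data and baseline model, re-invert, and collect CO2-saturation maps.

    data     : observed time-lapse ERT data vector
    data_std : per-datum noise level estimated from reciprocal measurements
    baseline : pre-injection resistivity model used as the prior
    invert   : callable(data, prior) -> CO2-saturation array (placeholder here)
    """
    samples = []
    for _ in range(n_real):
        d_star = data + rng.normal(scale=data_std, size=data.shape)                 # resampled data
        m_star = baseline * (1 + rng.normal(scale=0.03, size=baseline.shape))       # 3% model noise
        samples.append(invert(d_star, m_star))
    samples = np.array(samples)
    return samples.mean(axis=0), samples.std(axis=0)   # mean and std of saturation

# toy stand-in for the inversion, just to make the sketch executable
toy_invert = lambda d, m: np.clip(0.3 + 0.01 * d.mean() + 0.0 * m, 0, 1)
mean_sat, std_sat = parametric_bootstrap(
    data=rng.normal(size=200), data_std=0.05,
    baseline=np.ones((10, 10)), invert=toy_invert, n_real=50)
print(mean_sat.shape, std_sat.max())
```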

  12. Analyzing hospitalization data: potential limitations of Poisson regression.

    PubMed

    Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R

    2015-08-01

    Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
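
    As a small, self-contained illustration of why the model choice matters for overdispersed counts (synthetic data, not the dialysis cohort; the NB dispersion parameter is fixed here rather than estimated), Poisson and negative binomial GLMs can be compared with statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# synthetic hospitalization-like counts: many zeros and variance well above the mean
n = 300
group = rng.integers(0, 2, n)                          # 0 / 1 group indicator (labels illustrative)
mu = np.exp(0.2 + 0.05 * group)
y = rng.negative_binomial(n=0.5, p=0.5 / (0.5 + mu))   # NB draws with mean mu

X = sm.add_constant(group.astype(float))
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
nb_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=2.0)).fit()

# crude overdispersion check: Pearson chi-square / df >> 1 flags a poor Poisson fit
print("Poisson dispersion:", poisson_fit.pearson_chi2 / poisson_fit.df_resid)
print("Poisson RR:", np.exp(poisson_fit.params[1]),
      "NB RR:", np.exp(nb_fit.params[1]))
```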

  13. A Bootstrap Approach to an Affordable Exploration Program

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.

    2011-01-01

    This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that invests in technologies able to build a space infrastructure from very modest initial capabilities. Human exploration has a history of flight programs with high development and operational costs. Since Apollo, human exploration budgets have been tightly constrained and are expected to remain constrained in the future. Because of high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics costs. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of available resources. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon the initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows, becomes self-sustaining, and eventually begins producing the energy, products, and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and minimize the need for imported resources. They will provide the widest range of utility in a resource-scarce environment and pave the way to an affordable exploration program.

  14. Equilibrium drives of the low and high field side n = 2 plasma response and impact on global confinement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.

    The nature of the multi-modal n=2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L-versus H-mode conditions. Varying the relative phase (ΔΦ_UL) between upper and lower in-vessel coils demonstrates that different n=2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same ΔΦ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the ΔΦ_UL dependence of both the global confinement and the n=2 magnetic response. As edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same ΔΦ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and correlated global confinement impact is unchanged with plasma pressure, but is strongly reduced in high collisionality conditions in both H- and L-mode. This experimentally suggests the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Furthermore, holding truncation fixed, most HFS experimental trends are not captured, thus demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.

  15. Equilibrium drives of the low and high field side n = 2 plasma response and impact on global confinement

    NASA Astrophysics Data System (ADS)

    Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.; Nazikian, R.; Strait, E. J.; Chen, X.; Ferraro, N. M.; King, J. D.; Lyons, B. C.; Park, J.-K.

    2016-05-01

    The nature of the multi-modal n = 2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L-versus H-mode conditions. Varying the relative phase (Δφ_UL) between upper and lower in-vessel coils demonstrates that different n = 2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same Δφ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the Δφ_UL dependence of both the global confinement and the n = 2 magnetic response. As edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same Δφ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and correlated global confinement impact is unchanged with plasma pressure, but is strongly reduced in high collisionality conditions in both H- and L-mode. This experimentally suggests the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Holding truncation fixed, most HFS experimental trends are not captured, thus demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.

  16. Equilibrium drives of the low and high field side n = 2 plasma response and impact on global confinement

    DOE PAGES

    Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.; ...

    2016-03-31

    The nature of the multi-modal n=2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L-versus H-mode conditions. Varying the relative phase (ΔΦ_UL) between upper and lower in-vessel coils demonstrates that different n=2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same ΔΦ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the ΔΦ_UL dependence of both the global confinement and the n=2 magnetic response. As edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same ΔΦ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and correlated global confinement impact is unchanged with plasma pressure, but is strongly reduced in high collisionality conditions in both H- and L-mode. This experimentally suggests the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Furthermore, holding truncation fixed, most HFS experimental trends are not captured, thus demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.

  17. The value of value of information: best informing research design and prioritization using current methods.

    PubMed

    Eckermann, Simon; Karnon, Jon; Willan, Andrew R

    2010-01-01

    Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. More complex VOI methods such as bootstrapping of the expected value of partial EVPI may have potential value in refining overall research design. However, Occam's razor must be seriously considered in application of these VOI methods, given their increased complexity and current limitations in informing decision making, with restriction to EVPI rather than EVSI and not allowing for important decision-making contexts. Initial use of CLT methods to focus these more complex partial VOI methods towards where they may be useful in refining optimal overall trial design is suggested. Integrating CLT methods with such partial VOI methods to allow estimation of partial EVSI is suggested in future research to add value to the current VOI toolkit.
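
    As a worked illustration of the per-patient EVPI quantity discussed (population scaling, EVSI, and research costs are omitted; the net-benefit distributions below are assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(8)

# Monte Carlo samples of net benefit for two strategies under current information
n = 100_000
nb = np.column_stack([
    rng.normal(1000, 600, n),   # strategy A (assumed prior)
    rng.normal(1100, 800, n),   # strategy B (assumed prior)
])

# EVPI per patient = E[max over strategies] - max over strategies of E[net benefit]
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"per-patient EVPI: {evpi:.1f}")
```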

  18. Analysis of Drude model using fractional derivatives without singular kernels

    NASA Astrophysics Data System (ADS)

    Jiménez, Leonardo Martínez; García, J. Juan Rosales; Contreras, Abraham Ortega; Baleanu, Dumitru

    2017-11-01

    We report a study exploring the fractional Drude model in the time domain, using fractional derivatives without singular kernels: the Caputo-Fabrizio (CF) derivative and a fractional derivative with a stretched Mittag-Leffler function. It is shown that the velocity and current density of electrons moving through a metal depend on both the time and the fractional order 0 < γ ≤ 1. Due to the non-singular fractional kernels, it is possible to consider complete memory effects in the model, which appear neither in the ordinary model nor in the fractional Drude model with the Caputo fractional derivative. A comparison is also made between these two representations of the fractional derivatives, resulting in a considerable difference when γ < 0.8.
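
    For reference, the Caputo-Fabrizio derivative used in this class of models is conventionally written with a non-singular exponential kernel,

    $$ {}^{CF}D_t^{\gamma} f(t) = \frac{M(\gamma)}{1-\gamma} \int_0^{t} f'(\tau)\, \exp\!\left[-\frac{\gamma\,(t-\tau)}{1-\gamma}\right] d\tau, \qquad 0 < \gamma < 1, $$

    where M(γ) is a normalization function with M(0) = M(1) = 1; the exponential kernel remains finite at τ = t, in contrast to the (t − τ)^{-γ} kernel of the Caputo derivative.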

  19. Using methods from the data mining and machine learning literature for disease classification and prediction: A case study examining classification of heart failure sub-types

    PubMed Central

    Austin, Peter C.; Tu, Jack V.; Ho, Jennifer E.; Levy, Daniel; Lee, Douglas S.

    2014-01-01

    Objective: Physicians classify patients into those with or without a specific disease. Furthermore, there is often interest in classifying patients according to disease etiology or subtype. Classification trees are frequently used to classify patients according to the presence or absence of a disease. However, classification trees can suffer from limited accuracy. In the data-mining and machine learning literature, alternate classification schemes have been developed. These include bootstrap aggregation (bagging), boosting, random forests, and support vector machines. Study design and setting: We compared the performance of these classification methods with those of conventional classification trees to classify patients with heart failure according to the following sub-types: heart failure with preserved ejection fraction (HFPEF) vs. heart failure with reduced ejection fraction (HFREF). We also compared the ability of these methods to predict the probability of the presence of HFPEF with that of conventional logistic regression. Results: We found that modern, flexible tree-based methods from the data mining literature offer substantial improvement in prediction and classification of heart failure sub-type compared to conventional classification and regression trees. However, conventional logistic regression had superior performance for predicting the probability of the presence of HFPEF compared to the methods proposed in the data mining literature. Conclusion: The use of tree-based methods offers superior performance over conventional classification and regression trees for predicting and classifying heart failure subtypes in a population-based sample of patients from Ontario. However, these methods do not offer substantial improvements over logistic regression for predicting the presence of HFPEF. PMID:23384592
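
    A compact sketch of this kind of comparison with scikit-learn (a synthetic binary outcome stands in for HFPEF vs. HFREF; no clinical data or tuning are involved):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# synthetic binary classification problem standing in for the clinical data
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)

models = {
    "classification tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "bagging":             BaggingClassifier(random_state=0),
    "boosting":            GradientBoostingClassifier(random_state=0),
    "random forest":       RandomForestClassifier(random_state=0),
    "SVM":                 SVC(),
    "logistic regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:20s} cross-validated AUC = {auc:.3f}")
```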

  20. Recovery of butanol by counter-current carbon dioxide fractionation with its potential application to butanol fermentation

    USDA-ARS?s Scientific Manuscript database

    A counter-current CO2 fractionation method was studied as a means to recover butanol (also known as 1-butanol or n-butanol) and other compounds that are typically obtained from biobutanol fermentation broth from aqueous solutions. The influence of operating parameters, such as solvent-to-feed ratio,...

  1. Neoclassical transport in toroidal plasmas with nonaxisymmetric flux surfaces

    DOE PAGES

    Belli, Emily A.; Candy, Jefferey M.

    2015-04-15

    The capability to treat nonaxisymmetric flux surface geometry has been added to the drift-kinetic code NEO. Geometric quantities (i.e. metric elements) are supplied by a recently-developed local 3D equilibrium solver, allowing neoclassical transport coefficients to be systematically computed while varying the 3D plasma shape in a simple and intuitive manner. Code verification is accomplished via detailed comparison with 3D Pfirsch–Schlüter theory. A discussion of the various collisionality regimes associated with 3D transport is given, with an emphasis on non-ambipolar particle flux, neoclassical toroidal viscosity, energy flux and bootstrap current. As an application, we compute the transport in the presence of ripple-type perturbations in a DIII-D-like H-mode edge plasma.

  2. Refining search terms for nanotechnology

    NASA Astrophysics Data System (ADS)

    Porter, Alan L.; Youtie, Jan; Shapira, Philip; Schoeneck, David J.

    2008-05-01

    The ability to delineate the boundaries of an emerging technology is central to obtaining an understanding of the technology's research paths and commercialization prospects. Nowhere is this more relevant than in the case of nanotechnology (hereafter identified as "nano") given its current rapid growth and multidisciplinary nature. (Under the rubric of nanotechnology, we also include nanoscience and nanoengineering.) Past efforts have utilized several strategies, including simple term search for the prefix nano, complex lexical and citation-based approaches, and bootstrapping techniques. This research introduces a modularized Boolean approach to defining nanotechnology which has been applied to several research and patenting databases. We explain our approach to downloading and cleaning data, and report initial results. Comparisons of this approach with other nanotechnology search formulations are presented. Implications for search strategy development and profiling of the nanotechnology field are discussed.

  3. Changes in glycolytic enzyme activities in aging erythrocytes fractionated by counter-current distribution in aqueous polymer two-phase systems.

    PubMed Central

    Jimeno, P; Garcia-Perez, A I; Luque, J; Pinilla, M

    1991-01-01

    Human and rat erythrocytes were fractionated by counter-current distribution in charge-sensitive dextran/poly(ethylene glycol) two-phase systems. The specific activities of the key glycolytic enzymes (hexokinase, phosphofructokinase and pyruvate kinase) declined along the distribution profiles, although the relative positions of the activity profiles were reversed in the two species. These enzymes maintained their normal response to specific regulatory effectors in all cell fractions. No variations were observed for phosphoglycerate kinase and bisphosphoglycerate mutase activities. Some correlations between enzyme activities (pyruvate kinase/hexokinase, pyruvate kinase/phosphofructokinase, pyruvate kinase/pyruvate kinase plus phosphoglycerate kinase, pyruvate kinase/bisphosphoglycerate mutase and phosphoglycerate kinase/bisphosphoglycerate mutase ratios) were studied in whole erythrocyte populations as well as in cell fractions. These results strongly support the fractionation of human erythrocytes according to cell age, as occurs with rat erythrocytes. PMID:1656939

  4. New Fraction Time Annealing Method For Improving Organic Light Emitting Diode Current Stability of Hydrogenated Amorphous Silicon Thin-Film Transistor Based Active Matrix Organic Light Emitting Diode Backplane

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Hoon; Park, Sang-Geun; Jeon, Jae-Hong; Goh, Joon-chul; Huh, Jong-moo; Choi, Joonhoo; Chung, Kyuha; Han, Min-Koo

    2007-03-01

    We propose and fabricate a new hydrogenated amorphous silicon (a-Si:H) thin-film transistor (TFT) pixel employing fraction time annealing (FTA), which supplies a negative gate bias during a fraction of each frame rather than the entire frame, in order to improve the organic light emitting diode (OLED) current stability for an active matrix (AM) OLED. When an electrical bias for an initial reference current of 2 μA at 60 °C is applied to an FTA-driven pixel for more than 100 h and the temperature is increased up to 60 °C rather than room temperature, the OLED current is reduced by 22% in the FTA-driven pixel, whereas it is reduced by 53% in a conventional pixel. The current stability of the proposed pixel is improved, because the applied negative bias can suppress the threshold voltage degradation of the a-Si:H TFT itself, which may be attributed to hole trapping into SiNx. The proposed fraction time annealing method can successfully suppress the Vth shift of the a-Si:H TFT due to hole trapping into SiNx induced by negative gate bias annealing.

  5. [Plasma fractionation in the world: current status].

    PubMed

    Burnouf, T

    2007-05-01

    From 22 to 25 million liters of plasma are fractionated yearly in about 70 fractionation plants, either private or government-owned, mainly located in industrialized countries, and with a capacity ranging from 50,000 to three million liters. In an increasingly global environment, the plasma industry has recently gone through a major consolidation phase that has seen mergers and acquisitions, and has led to the closure of a number of small plants in Europe. Currently, some fifteen countries are involved in contract plasma fractionation programs to ensure a supply of plasma-derived medicinal products. The majority of the plasma for fractionation is obtained by automated plasmapheresis, the remainder (recovered plasma) being prepared from whole blood as a by-product of red cell production. Plasma for fractionation should be produced and controlled following well-established procedures to meet the strict quality requirements set by regulatory authorities and fractionators. The plasma fractionation technology still relies heavily on the cold ethanol fractionation process, but has been improved by the introduction of modern chromatographic purification methods, and efficient viral inactivation and removal treatments, ensuring the quality and safety of a large portfolio of fractionated plasma products. The safety of these products with regard to the risk of transmission of variant Creutzfeldt-Jakob disease appears to be ensured, based on current scientific data, by extensive removal of the infectious agent during certain fractionation steps. The leading plasma product is now intravenous immunoglobulin G, which has replaced factor VIII and albumin in this role. The supply of plasma products (most specifically coagulation products and immunoglobulin) at an affordable price and in sufficient quantity remains an issue; the problem is particularly acute in developing countries, as the switch to recombinant factor VIII in rich countries has not solved the supply issue and has even led to an increase in the mean price of plasma-derived factor VIII to the developing world. In the last few years, the plasma fractionation industry has improved greatly, and should remain essential in the years to come for the procurement of many essential medicines.

  6. Kappa statistic for the clustered dichotomous responses from physicians and patients

    PubMed Central

    Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L.; Cai, Jianwen

    2013-01-01

    The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that the proposed bootstrap method produces a better estimate of the standard error and better coverage performance than the asymptotic standard error estimate, which ignores the dependence among patients within physicians, provided there is at least a moderately large number of clusters. An example of an application to a coronary heart disease prevention study is presented. PMID:23533082
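
    The cluster bootstrap idea described here, resampling physicians (clusters) with replacement while keeping each physician's patients together, can be sketched as follows; the data-generating step and the kappa implementation are illustrative assumptions, not the simulation procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def cohen_kappa(a, b):
    """Cohen's kappa for two binary (0/1) rating vectors."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)
    pe = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)
    return (po - pe) / (1 - pe)

# Hypothetical data: each physician (cluster) and several of that physician's
# patients answer the same yes/no question; agreement is measured with kappa.
clusters = [(rng.integers(0, 2, size=n), rng.integers(0, 2, size=n))
            for n in rng.integers(3, 8, size=50)]      # 50 physicians

def pooled_kappa(cl):
    phys = np.concatenate([a for a, _ in cl])
    pats = np.concatenate([b for _, b in cl])
    return cohen_kappa(phys, pats)

# Cluster bootstrap: resample physicians with replacement, keeping each
# physician's patients together, then recompute kappa.
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(clusters), size=len(clusters))
    boot.append(pooled_kappa([clusters[i] for i in idx]))

print(f"kappa = {pooled_kappa(clusters):.3f}, "
      f"cluster-bootstrap SE = {np.std(boot, ddof=1):.3f}")
```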

  7. Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity

    PubMed Central

    Beasley, T. Mark

    2013-01-01

    Increasing the correlation between the independent variable and the mediator (the a coefficient) increases the effect size (ab) for mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard errors of product tests increase. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, thus explaining the power decline with increases in a. Implications for statistical analysis and applied researchers are discussed. PMID:24954952
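
    The variance-inflation effect can be illustrated numerically: the sketch below bootstraps the indirect effect ab for a simple simulated X -> M -> Y model at several values of the a path and prints the resulting standard error. The sample size, path values, and estimation code are illustrative assumptions, not the article's derivations.

```python
import numpy as np

rng = np.random.default_rng(1)
n, b_path, n_boot = 200, 0.3, 1000

def bootstrap_se_ab(a_path):
    """Bootstrap SE of the indirect effect ab for one simulated data set of a
    simple X -> M -> Y mediation model (illustrative coefficients only)."""
    x = rng.normal(size=n)
    m = a_path * x + rng.normal(size=n)
    y = b_path * m + rng.normal(size=n)
    ab = []
    for _ in range(n_boot):
        i = rng.integers(0, n, size=n)
        xi, mi, yi = x[i], m[i], y[i]
        a_hat = np.polyfit(xi, mi, 1)[0]                       # a path
        design = np.column_stack([np.ones(n), mi, xi])         # Y on M and X
        b_hat = np.linalg.lstsq(design, yi, rcond=None)[0][1]  # partial b path
        ab.append(a_hat * b_hat)
    return np.std(ab, ddof=1)

# As a grows, X and M become more collinear and the SE of ab inflates.
for a_path in (0.1, 0.5, 0.9, 1.3):
    print(f"a = {a_path:.1f}: bootstrap SE(ab) = {bootstrap_se_ab(a_path):.3f}")
```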

  8. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    PubMed

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
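
    A minimal sketch of the three resampling schemes applied to one statistic (the sample mean) is shown below; the data and the sign-flipping form of the randomization test are illustrative choices, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=0.3, size=25)     # one hypothetical sample
stat = np.mean                       # statistic of interest

# Bootstrap: redraw a sample of the SAME size WITH replacement.
boot = [stat(rng.choice(x, size=x.size, replace=True)) for _ in range(5000)]

# Jackknife: recompute on subsets formed by leaving one observation out
# (a smaller sample, drawn without replacement).
jack = np.array([stat(np.delete(x, i)) for i in range(x.size)])

# Randomization test for H0: the distribution is symmetric about 0
# (here realized by random sign flips rather than drawing new observations).
perm = [stat(x * rng.choice([-1, 1], size=x.size)) for _ in range(5000)]

print("bootstrap SE of the mean:", np.std(boot, ddof=1))
print("jackknife SE of the mean:",
      np.sqrt((x.size - 1) / x.size * np.sum((jack - jack.mean()) ** 2)))
print("randomization p-value:", np.mean(np.abs(perm) >= abs(stat(x))))
```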

  9. Modeling and analysis of fractional order DC-DC converter.

    PubMed

    Radwan, Ahmed G; Emira, Ahmed A; AbdelAty, Amr M; Azar, Ahmad Taher

    2017-07-11

    Due to the non-idealities of commercial inductors, the demand for a better model that accurately describes their dynamic response is elevated. Accordingly, fractional-order models of Buck, Boost, and Buck-Boost DC-DC converters are presented in this paper. A detailed analysis is made for the two most common modes of converter operation: Continuous Conduction Mode (CCM) and Discontinuous Conduction Mode (DCM). Closed-form time-domain expressions are derived for inductor currents, voltage gain, average current, conduction time, and power efficiency, where the effect of the fractional-order inductor is found to be strongly present. For example, the peak inductor current at steady state increases with decreasing inductor order. Advanced Design Systems (ADS) circuit simulations are used to verify the derived formulas, where the fractional-order inductor is simulated using the Valsa Constant Phase Element (CPE) approximation and a Generalized Impedance Converter (GIC). Simulation results show good agreement with the theoretical formulas for the three DC-DC converter topologies under different fractional orders. A comprehensive comparison with the recently published literature is presented to show the advantages and disadvantages of each approach. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
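
    For readers unfamiliar with fractional-order elements, the sketch below numerically approximates an order-alpha derivative with the Grunwald-Letnikov scheme and checks it against a known closed form; this is a generic illustration of how a fractional inductor law v = L_alpha * D^alpha i can be evaluated, not the closed-form converter analysis or the Valsa/GIC circuit realizations used in the paper.

```python
import numpy as np
from math import gamma

def gl_fractional_derivative(f, t, alpha):
    """Grunwald-Letnikov approximation of the order-alpha derivative of samples
    f on the uniform grid t (generic illustration, not the paper's analysis)."""
    h = t[1] - t[0]
    w = np.empty(len(f))
    w[0] = 1.0
    for k in range(1, len(f)):                 # w_k = (-1)^k * C(alpha, k)
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    d = np.array([np.dot(w[:j + 1], f[j::-1]) for j in range(len(f))])
    return d / h ** alpha

# Sanity check against the known result D^alpha t = t^(1-alpha) / Gamma(2-alpha).
alpha = 0.8
t = np.linspace(0.0, 1.0, 2001)
num = gl_fractional_derivative(t, t, alpha)
exact = t ** (1 - alpha) / gamma(2 - alpha)
mask = t >= 0.1
print("max error for t >= 0.1:", np.max(np.abs(num[mask] - exact[mask])))

# In a fractional-order inductor model the element law becomes
# v_L(t) = L_alpha * D^alpha i_L(t), which is where such operators enter the
# converter waveforms discussed above.
```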

  10. Fractions, Number Lines, Third Graders

    ERIC Educational Resources Information Center

    Cramer, Kathleen; Ahrendt, Sue; Monson, Debra; Wyberg, Terry; Colum, Karen

    2017-01-01

    The Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) outlines ambitious goals for fraction learning, starting in third grade, that include the use of the number line model. Understanding and constructing fractions on a number line are particularly complex tasks. The current work of the authors centers on ways to successfully…

  11. Blood flow problem in the presence of magnetic particles through a circular cylinder using Caputo-Fabrizio fractional derivative

    NASA Astrophysics Data System (ADS)

    Uddin, Salah; Mohamad, Mahathir; Khalid, Kamil; Abdulhammed, Mohammed; Saifullah Rusiman, Mohd; Che-Him, Norziha; Roslan, Rozaini

    2018-04-01

    In this paper, the flow of blood mixed with magnetic particles subjected to a uniform transverse magnetic field and a pressure gradient in an axisymmetric circular cylinder is studied using a recently introduced fractional derivative without a singular kernel. The governing equations are fractional partial differential equations based on the Caputo-Fabrizio time-fractional derivative (NFDt). The current results agree well with those obtained using the classical Caputo fractional derivative (UFDt).
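
    The non-singular (exponential) kernel that distinguishes the Caputo-Fabrizio derivative can be made concrete with a short quadrature sketch; the discretization, the normalization M(alpha) = 1, and the test function are illustrative assumptions, not the solution method of the paper.

```python
import numpy as np

def caputo_fabrizio(f, t, alpha, M=1.0):
    """Trapezoid-rule approximation of the Caputo-Fabrizio derivative
    D^alpha f(t) = M(alpha)/(1-alpha) * int_0^t f'(s) exp(-alpha(t-s)/(1-alpha)) ds.
    Its exponential kernel is non-singular, unlike the classical Caputo kernel."""
    h = t[1] - t[0]
    df = np.gradient(f, h)                    # f'(s) on the grid
    out = np.zeros_like(f)
    for j in range(1, len(t)):
        g = df[:j + 1] * np.exp(-alpha * (t[j] - t[:j + 1]) / (1.0 - alpha))
        out[j] = h * (g.sum() - 0.5 * (g[0] + g[-1]))   # trapezoid rule
    return M / (1.0 - alpha) * out

# Check on f(t) = t, whose closed form is (1/alpha)*(1 - exp(-alpha*t/(1-alpha))).
alpha = 0.7
t = np.linspace(0.0, 2.0, 801)
num = caputo_fabrizio(t, t, alpha)[-1]
exact = (1.0 / alpha) * (1.0 - np.exp(-alpha * t[-1] / (1.0 - alpha)))
print(num, exact)
```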

  12. Quantization and fractional quantization of currents in periodically driven stochastic systems. I. Average currents

    NASA Astrophysics Data System (ADS)

    Chernyak, Vladimir Y.; Klein, John R.; Sinitsyn, Nikolai A.

    2012-04-01

    This article studies Markovian stochastic motion of a particle on a graph with a finite number of nodes and periodically time-dependent transition rates that satisfy the detailed balance condition at any time. We show that under general conditions, the currents in the system on average become quantized or fractionally quantized for adiabatic driving at sufficiently low temperature. We develop the quantitative theory of this quantization and interpret it in terms of topological invariants. By applying the celebrated Kirchhoff theorem, we derive a general and explicit formula for the average generated current that serves as an efficient tool for treating current quantization effects.
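
    The average-current computation can be illustrated on a toy model: the sketch below integrates the master equation for a 3-node ring with periodically modulated site energies (so the rates satisfy detailed balance at every instant) and accumulates the net probability current through one link over a driving period. The rates and driving protocol are hypothetical and are not in the adiabatic, low-temperature regime where the quantization analyzed in the article emerges.

```python
import numpy as np
from scipy.integrate import solve_ivp

T = 1.0        # driving period
B = 1.0        # common barrier height (hypothetical)

def energies(t):
    """Periodically modulated site energies of the 3-node ring."""
    phase = 2 * np.pi * t / T
    return np.array([np.sin(phase), np.sin(phase + 2 * np.pi / 3), 0.0])

def rates(t):
    """k[j, i] is the rate i -> j; the Arrhenius form with time-independent
    barriers guarantees instantaneous detailed balance."""
    E = energies(t)
    k = np.zeros((3, 3))
    for i in range(3):
        j = (i + 1) % 3
        k[j, i] = np.exp(-(B - E[i]))     # clockwise hop i -> j
        k[i, j] = np.exp(-(B - E[j]))     # counter-clockwise hop j -> i
    return k

def rhs(t, y):
    p = y[:3]
    k = rates(t)
    dp = k @ p - k.sum(axis=0) * p                 # master equation
    j01 = k[1, 0] * p[0] - k[0, 1] * p[1]          # net current on link 0 -> 1
    return np.concatenate([dp, [j01]])             # y[3] accumulates the charge

y0 = np.array([1 / 3, 1 / 3, 1 / 3, 0.0])
sol = solve_ivp(rhs, (0.0, 50 * T), y0, rtol=1e-8, atol=1e-10, dense_output=True)
q = sol.sol(50 * T)[3] - sol.sol(49 * T)[3]        # charge pumped in the last period
print("charge per period through link 0 -> 1:", q)
```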

  13. Non-Solenoidal Startup Research Directions on the Pegasus Toroidal Experiment

    NASA Astrophysics Data System (ADS)

    Fonck, R. J.; Bongard, M. W.; Lewicki, B. T.; Reusch, J. A.; Winz, G. R.

    2017-10-01

    The Pegasus research program has been focused on developing a physical understanding and predictive models for non-solenoidal tokamak plasma startup using Local Helicity Injection (LHI). LHI employs strong localized electron currents injected along magnetic field lines in the plasma edge that relax through magnetic turbulence to form a tokamak-like plasma. Pending approval, the Pegasus program will address a broader, more comprehensive examination of non-solenoidal tokamak startup techniques. New capabilities may include: increasing the toroidal field to 0.6 T to support critical scaling tests to near-NSTX-U field levels; deploying internal plasma diagnostics; installing a coaxial helicity injection (CHI) capability in the upper divertor region; and deploying a modest (200-400 kW) electron cyclotron RF capability. These efforts will address scaling of relevant physics to higher BT, separate and comparative studies of helicity injection techniques, efficiency of handoff to subsequent current sustainment techniques, and the use of ECH to synergistically improve the target plasma for subsequent bootstrap and neutral beam current drive sustainment. This has an ultimate goal of validating techniques to produce a 1 MA target plasma in NSTX-U and beyond. Work supported by US DOE Grant DE-FG02-96ER54375.

  14. A new path to first light for the Magdalena Ridge Observatory interferometer

    NASA Astrophysics Data System (ADS)

    Creech-Eakman, M. J.; Romero, V.; Payne, I.; Haniff, C. A.; Buscher, D. F.; Young, J. S.; Cervantes, R.; Dahl, C.; Farris, A.; Fisher, M.; Johnston, P.; Klinglesmith, D.; Love, H.; Ochoa, D.; Olivares, A.; Pino, J.; Salcido, C.; Santoro, F.; Schmidt, L.; Seneta, E. B.; Sun, X.; Jenka, L.; Kelly, R.; Price, J.; Rea, A.; Riker, J.; Rochelle, S.

    2016-08-01

    The Magdalena Ridge Observatory Interferometer (MROI) was the most ambitious infrared interferometric facility conceived of in 2003 when funding began. Today, despite having suffered some financial shortfalls, it is still one of the most ambitious interferometric imaging facilities ever designed. With an innovative approach to attaining the original goal of fringe tracking to H = 14th magnitude via completely redesigned mobile telescopes, and a unique approach to the beam train and delay lines, the MROI will be able to image faint and complex objects with milliarcsecond resolutions for a fraction of the cost of giant telescopes or space-based facilities. The design goals of MROI have been optimized for studying stellar astrophysical processes such as mass loss and mass transfer, the formation and evolution of YSOs and their disks, and the environs of nearby AGN. The global needs for Space Situational Awareness (SSA) have moved to the forefront in many communities as Space becomes a more integral part of a national security portfolio. These needs drive imaging capabilities ultimately to a few tens of centimeter resolution at geosynchronous orbits. Any array capable of producing images on faint and complex geosynchronous objects in just a few hours will be outstanding not only as an astrophysical tool, but also for these types of SSA missions. With the recent infusion of new funding from the Air Force Research Lab (AFRL) in Albuquerque, NM, MROI will be able to attain first light, first fringes, and demonstrate bootstrapping with three telescopes by 2020. MROI's current status along with a sketch of our activities over the coming 5 years will be presented, as well as clear opportunities to collaborate on various aspects of the facility as it comes online. Further funding is actively being sought to accelerate the capability of the array for interferometric imaging on a short time-scale so as to achieve the original goals of this ambitious facility.

  15. Schema building profiles among elementary school students in solving problems related to operations of addition to fractions on the basis of mathematic abilities

    NASA Astrophysics Data System (ADS)

    Gembong, S.; Suwarsono, S. T.; Prabowo

    2018-03-01

    Schema in the current study refers to a set of action, process, object and other schemas already possessed to build an individual’s ways of thinking to solve a given problem. The current study aims to investigate the schemas built among elementary school students in solving problems related to operations of addition to fractions. The analyses of the schema building were done qualitatively on the basis of the analytical framework of the APOS theory (Action, Process, Object, and Schema). Findings show that the schemas built by students of high and middle ability indicate the following. In the action stage, students were able to add two fractions by drawing a picture or by a procedural method. In the process stage, they could add two and three fractions. In the object stage, they could explain the steps of adding two fractions and change a fraction into an addition of fractions. In the last stage, schema, they could add fractions by relating them to another schema they already possessed, i.e., the least common multiple. Those of high and middle mathematic abilities showed that their schema building in solving problems related to operations of addition to fractions worked in line with the framework of the APOS theory. Those of low mathematic ability, however, showed that their schemas at each stage did not work properly.
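
    The least-common-multiple schema referred to above amounts to the following small computation; the helper function is only an illustration of the arithmetic the students were performing.

```python
from math import gcd, lcm

def add_fractions(a, b, c, d):
    """Add a/b + c/d by rewriting both over the least common multiple of the
    denominators (the schema step described above), then reduce."""
    m = lcm(b, d)
    num = a * (m // b) + c * (m // d)
    g = gcd(num, m)
    return num // g, m // g

print(add_fractions(1, 4, 5, 6))   # (13, 12): 1/4 + 5/6 = 3/12 + 10/12 = 13/12
```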

  16. Rubiaceae in Brazilian Atlantic Forest remnants: floristic similarity and implications for conservation.

    PubMed

    de Paiva, Alessandra Marques; Barberena, Felipe Fajardo Villela Antolin; Lopes, Rosana Conrado

    2016-06-01

    Brazil holds most of the Atlantic Forest Domain and is also one of the Rubiaceae diversity centers in the Neotropics. Despite the urban expansion in the state of Rio de Janeiro, large areas of continuous vegetation with a high degree of connectivity can still be found. Recently, new Rubiaceae species have been described in the Rio de Janeiro flora, which present small populations and very particular distributions. The current paper analyzed the similarity in the floristic composition of the Rubiaceae in eight Atlantic Forest remnants of Rio de Janeiro state protected by Conservation Units. We also surveyed and set guidelines for conservation of microendemic species. The similarity analyses were based on previously published studies in Área de Proteção Ambiental de Grumari, Área de Proteção Ambiental Palmares, Parque Estadual da Serra da Tiririca, Parque Nacional do Itatiaia, Parque Nacional de Jurubatiba, Reserva Biológica de Poço das Antas, Reserva Biológica do Tinguá and Reserva Ecológica de Macaé de Cima, using the PAST (“Paleontological Statistics”) software with the Sørensen coefficient. The floristic similarity analysis revealed two groups with distinct physiographic characteristics and different vegetation types. Group A consisted of two Restinga areas, Área de Proteção Ambiental de Grumari and Parque Nacional de Jurubatiba, which showed strong bootstrap support (98%). Group B included forest remnants with distinct phytophysiognomies or altitudes, but with moderate bootstrap support. Low similarity levels among the eight areas were found due to the habitats’ heterogeneity. The current study pointed out 19 microendemic species from the Atlantic Forest; they present a single-site distribution or a distribution restricted to the Mountain and Metropolitan regions of Rio de Janeiro state. Concerning the conservation status of microendemic species, discrepancies between the Catalogue of Flora of Rio de Janeiro and the Red Book of Brazilian Flora (two of the main reference catalogs of Brazilian flora) have been identified. We have also highlighted the need for recollecting microendemic species from the Atlantic Forest, and for properly assessing the degree of threat faced by these taxa at an early stage.
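
    The Sørensen coefficient underlying the similarity analysis is simple to compute directly; in the sketch below the two species lists are placeholders, not the surveyed Rubiaceae flora of the remnants.

```python
def sorensen(a, b):
    """Sorensen similarity 2|A∩B| / (|A| + |B|) between two species lists."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b))

# Placeholder species lists for two remnants (not the surveyed flora).
site_1 = {"Psychotria carthagenensis", "Chiococca alba", "Tocoyena bullata"}
site_2 = {"Chiococca alba", "Tocoyena bullata", "Guettarda viburnoides"}
print(f"Sorensen similarity: {sorensen(site_1, site_2):.2f}")   # 0.67
```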

  17. Tests for informative cluster size using a novel balanced bootstrap scheme.

    PubMed

    Nevalainen, Jaakko; Oja, Hannu; Datta, Somnath

    2017-07-20

    Clustered data are often encountered in biomedical studies, and to date, a number of approaches have been proposed to analyze such data. However, the phenomenon of informative cluster size (ICS) is a challenging problem, and its presence has an impact on the choice of a correct analysis methodology. For example, Dutta and Datta (2015, Biometrics) presented a number of marginal distributions that could be tested. Depending on the nature and degree of informativeness of the cluster size, these marginal distributions may differ, as do the choices of the appropriate test. In particular, they applied their new test to a periodontal data set where the plausibility of the informativeness was mentioned, but no formal test for the same was conducted. We propose bootstrap tests for testing the presence of ICS. A balanced bootstrap method is developed to successfully estimate the null distribution by merging the re-sampled observations with closely matching counterparts. Relying on the assumption of exchangeability within clusters, the proposed procedure performs well in simulations even with a small number of clusters, under different distributions and against different alternative hypotheses, thus making it an omnibus test. We also explain how to extend the ICS test to a regression setting, thereby enhancing its practical utility. The methodologies are illustrated using the periodontal data set mentioned earlier. Copyright © 2017 John Wiley & Sons, Ltd.
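
    A much-simplified illustration of testing for informative cluster size is sketched below: it compares the observed correlation between cluster size and cluster mean with a null distribution built by resampling from the pooled observations. This stand-in ignores within-cluster dependence and is not the balanced bootstrap scheme proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical clustered data in which larger clusters have larger means,
# i.e. cluster size is informative.
sizes = rng.integers(2, 10, size=60)
clusters = [rng.normal(loc=0.1 * n, size=n) for n in sizes]

def ics_statistic(cl):
    """Correlation between cluster size and cluster mean (a crude ICS measure)."""
    return np.corrcoef([len(c) for c in cl], [c.mean() for c in cl])[0, 1]

obs = ics_statistic(clusters)

# Null distribution: rebuild clusters of the same sizes by resampling from the
# pooled observations, which breaks any size/response link. NOTE: this ignores
# within-cluster dependence and is NOT the balanced bootstrap of the paper.
pool = np.concatenate(clusters)
null = [ics_statistic([rng.choice(pool, size=n, replace=True) for n in sizes])
        for _ in range(2000)]

print("observed statistic:", round(obs, 3),
      " p-value:", np.mean(np.abs(null) >= abs(obs)))
```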

  18. Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.

    PubMed

    Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G

    2016-01-01

    Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
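
    A generic bootstrap-quantile selection scheme in the spirit of this abstract can be sketched with scikit-learn: refit a Lasso on bootstrap resamples and retain predictors whose bootstrap percentile interval excludes zero. The data, penalty, and thresholds are illustrative assumptions, not the authors' QNT procedure or their annotated R code.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n, p = 150, 50
X = StandardScaler().fit_transform(rng.normal(size=(n, p)))
beta = np.zeros(p)
beta[:5] = 1.0                      # five truly relevant (hypothetical) predictors
y = X @ beta + rng.normal(size=n)

# Refit the penalized model on bootstrap resamples and store the coefficients.
coefs = []
for _ in range(500):
    i = rng.integers(0, n, size=n)
    coefs.append(Lasso(alpha=0.1).fit(X[i], y[i]).coef_)
coefs = np.array(coefs)

# Keep predictors whose bootstrap 95% percentile interval excludes zero.
lo, hi = np.percentile(coefs, [2.5, 97.5], axis=0)
print("selected predictors:", np.where((lo > 0) | (hi < 0))[0])
```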

  19. More N = 4 superconformal bootstrap

    NASA Astrophysics Data System (ADS)

    Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C.

    2017-08-01

    In this long overdue second installment, we continue to develop the conformal bootstrap program for N = 4 superconformal field theories (SCFTs) in four dimensions via an analysis of the correlation function of four stress-tensor supermultiplets. We review analytic results for this correlator and make contact with the SCFT/chiral algebra correspondence of Beem et al. [Commun. Math. Phys. 336, 1359 (2015), 10.1007/s00220-014-2272-x]. We demonstrate that the constraints of unitarity and crossing symmetry require the central charge c to be greater than or equal to 3/4 in any interacting N = 4 SCFT. We apply numerical bootstrap methods to derive upper bounds on scaling dimensions and operator product expansion coefficients for several low-lying, unprotected operators as a function of the central charge. We interpret our bounds in the context of N = 4 super Yang-Mills theories, formulating a series of conjectures regarding the embedding of the conformal manifold, parametrized by the complexified gauge coupling, into the space of scaling dimensions and operator product expansion coefficients. Our conjectures assign a distinguished role to points on the conformal manifold that are self-dual under a subgroup of the S-duality group. This paper contains a more detailed exposition of a number of results previously reported in Beem et al. [Phys. Rev. Lett. 111, 071601 (2013), 10.1103/PhysRevLett.111.071601] in addition to new results.
