Test of bootstrap current models using high-βp EAST-demonstration plasmas on DIII-D
Ren, Qilong; Lao, Lang L.; Garofalo, Andrea M.; ...
2015-01-12
Magnetic measurements together with kinetic profile and motional Stark effect measurements are used in full kinetic equilibrium reconstructions to test the Sauter and NEO bootstrap current models in a DIII-D high-βp EAST-demonstration experiment. This aims at developing on DIII-D a high bootstrap current scenario to be extended to EAST for a demonstration of true steady state at high performance, and uses EAST-similar operational conditions: plasma shape, plasma current, toroidal magnetic field, total heating power and current ramp-up rate. It is found that the large edge bootstrap current in these high-βp plasmas allows the use of magnetic measurements to clearly distinguish the two bootstrap current models. In these high-collisionality, high-βp plasmas, the Sauter model overpredicts the peak of the edge current density by about 30%, while the first-principles kinetic NEO model is in close agreement with the edge current density of the reconstructed equilibrium. Furthermore, these results are consistent with recent work showing that the Sauter model largely overestimates the edge bootstrap current at high collisionality.
Effects of magnetic islands on bootstrap current in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, G.; Lin, Z.
2016-12-19
The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change of the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, a small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
Electron transport fluxes in potato plateau regime
NASA Astrophysics Data System (ADS)
Shaing, K. C.; Hazeltine, R. D.
1997-12-01
Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current.
Bootstrap and fast wave current drive for tokamak reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehst, D.A.
1991-09-01
Using the multi-species neoclassical treatment of Hirshman and Sigmar, we study steady-state bootstrap equilibria with seed currents provided by low-frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (<20 MW). However, for larger total currents considerable driving power is required (for ITER: I_o = 18 MA needs P_FW = 15 MW, P_LH = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs.
Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.
Wade, M R; Murakami, M; Politzer, P A
2004-06-11
Analysis of the parallel electric field E(parallel) evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E(parallel) evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E(parallel) evolution and that expected from neoclassical theory predictions of the bootstrap current.
Impact of bootstrap current and Landau-fluid closure on ELM crashes and transport
NASA Astrophysics Data System (ADS)
Chen, J. G.; Xu, X. Q.; Ma, C. H.; Lei, Y. A.
2018-05-01
Results presented here are from 6-field Landau-fluid simulations using shifted circular cross-section tokamak equilibria in the BOUT++ framework. Linear benchmark results imply that the collisional and collisionless Landau resonance closures make little difference in the linear growth rate spectra, which are quite close to the results with the flux-limited Spitzer-Härm parallel flux. Both linear and nonlinear simulations show that the plasma current profile plays dual roles in the peeling-ballooning modes: it can drive the low-n peeling modes and stabilize the high-n ballooning modes. For fixed total pressure and current, as the pedestal current decreases because the bootstrap current becomes smaller when the density (collisionality) increases, the operational point is shifted downwards vertically in the Jped-α diagram, resulting in threshold changes of the different modes. The bootstrap current can slightly increase the radial turbulence spreading range and enhance the energy and particle transports by increasing the perturbed amplitude and broadening the cross-phase frequency distribution.
Impact of Sampling Density on the Extent of HIV Clustering
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor
2014-01-01
Identifying and monitoring HIV clusters could be useful in tracking the leading edge of HIV transmission in epidemics. Currently, greater specificity in the definition of HIV clusters is needed to reduce confusion in the interpretation of HIV clustering results. We address sampling density as one of the key aspects of HIV cluster analysis. The proportion of viral sequences in clusters was estimated at sampling densities from 1.0% to 70%. A set of 1,248 HIV-1C env gp120 V1C5 sequences from a single community in Botswana was utilized in simulation studies. Matching numbers of HIV-1C V1C5 sequences from the LANL HIV Database were used as comparators. HIV clusters were identified by phylogenetic inference under bootstrapped maximum likelihood and pairwise distance cut-offs. Sampling density below 10% was associated with stochastic HIV clustering with broad confidence intervals. HIV clustering increased linearly at sampling density >10%, and was accompanied by narrowing confidence intervals. Patterns of HIV clustering were similar at bootstrap thresholds 0.7 to 1.0, but the extent of HIV clustering decreased with higher bootstrap thresholds. The origin of sampling (local concentrated vs. scattered global) had a substantial impact on HIV clustering at sampling densities ≥10%. Pairwise distances at 10% were estimated as a threshold for cluster analysis of HIV-1 V1C5 sequences. The node bootstrap support distribution provided additional evidence for 10% sampling density as the threshold for HIV cluster analysis. The detectability of HIV clusters is substantially affected by sampling density. A minimal genotyping density of 10% and sampling density of 50-70% are suggested for HIV-1 V1C5 cluster analysis. PMID:25275430
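The sampling-density effect described in this abstract can be illustrated with a toy subsampling experiment. The sketch below is not the authors' phylogenetic method: random 2-D points stand in for sequences, and a hypothetical pairwise-distance cutoff stands in for the genetic-distance threshold. It shows why low sampling density makes clustering stochastic: a point's near neighbours may simply not be sampled.

```python
import random

def clustered_fraction(coords, density, cutoff, seed=0):
    """Subsample 'sequences' (2-D points, a stand-in for genetic distances)
    at a given sampling density, then return the fraction of sampled points
    that fall in a cluster, i.e. have at least one other sampled point
    within the pairwise-distance cutoff."""
    rng = random.Random(seed)
    sample = [c for c in coords if rng.random() < density]

    def close(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= cutoff ** 2

    in_cluster = sum(
        any(close(a, b) for b in sample if b is not a) for a in sample
    )
    return in_cluster / len(sample) if sample else 0.0

# At full density the pair (0,0)-(1,0) clusters and (10,0) stays isolated;
# at low density the detected clustered fraction fluctuates run to run.
print(clustered_fraction([(0, 0), (1, 0), (10, 0)], density=1.0, cutoff=2.0))
```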
Non-inductive current generation in fusion plasmas with turbulence
NASA Astrophysics Data System (ADS)
Wang, Weixing; Ethier, S.; Startsev, E.; Chen, J.; Hahm, T. S.; Yoo, M. G.
2017-10-01
It is found that plasma turbulence may strongly influence non-inductive current generation. This may have radical impact on various aspects of tokamak physics. Our simulation study employs a global gyrokinetic model coupling self-consistent neoclassical and turbulent dynamics with a focus on the electron current. Distinct phases in electron current generation are illustrated in the initial value simulation. In the early phase before turbulence develops, the electron bootstrap current is established on a time scale of a few electron collision times, which closely agrees with the neoclassical prediction. The second phase follows when turbulence begins to saturate, during which turbulent fluctuations are found to strongly affect the electron current. The profile structure, amplitude and phase space structure of the electron current density are all significantly modified relative to the neoclassical bootstrap current by the presence of turbulence. Both electron parallel acceleration and the parallel residual stress drive are shown to play important roles in turbulence-induced current generation. The current density profile is modified in a way that correlates with the fluctuation intensity gradient through its effect on k//-symmetry breaking in the fluctuation spectrum. Turbulence is shown to reduce (enhance) the plasma self-generated current in the low (high) collisionality regime, and the reduction of the total electron current relative to the neoclassical bootstrap current increases as collisionality decreases. The implication of this result for fully non-inductive current operation in the steady-state burning plasma regime should be investigated. Finally, significant non-inductive current is observed in the flat pressure region, which is a nonlocal effect and results from turbulence-spreading-induced current diffusion. Work supported by U.S. DOE Contract DE-AC02-09-CH11466.
A condition for small bootstrap current in three-dimensional toroidal configurations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru; Nührenberg, J.; Zille, R.
2016-11-15
It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller compared to that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna-Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.
NASA Astrophysics Data System (ADS)
Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; Cfetr Physics Team
2018-04-01
Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyrokinetic simulation. This model is demonstrated to accurately model low-Ip discharges from the EAST tokamak. Time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from the equilibrium reconstruction with Motional Stark effect constraints. The predicted evolution of βN, li, and βp also agrees well with the experiments. For the ITER-like cases, the predicted electron and ion temperature profiles using TGLF SAT0 agree closely with the experimentally measured profiles, and are demonstrably better than other proposed transport models. For the high bootstrap current case, the electron and ion temperature profiles are better predicted by the VX model. It is found that the SAT0 model works well at high Ip (>0.76 MA) while the VX model covers a wider range of plasma current (Ip > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.
Core transport properties in JT-60U and JET identity plasmas
NASA Astrophysics Data System (ADS)
Litaudon, X.; Sakamoto, Y.; de Vries, P. C.; Salmi, A.; Tala, T.; Angioni, C.; Benkadda, S.; Beurskens, M. N. A.; Bourdelle, C.; Brix, M.; Crombé, K.; Fujita, T.; Futatani, S.; Garbet, X.; Giroud, C.; Hawkes, N. C.; Hayashi, N.; Hoang, G. T.; Hogeweij, G. M. D.; Matsunaga, G.; Nakano, T.; Oyama, N.; Parail, V.; Shinohara, K.; Suzuki, T.; Takechi, M.; Takenaga, H.; Takizuka, T.; Urano, H.; Voitsekhovitch, I.; Yoshida, M.; ITPA Transport Group; JT-60 Team; EFDA contributors, JET
2011-07-01
The paper compares the transport properties of a set of dimensionless identity experiments performed between JET and JT-60U in the advanced tokamak regime with an internal transport barrier, ITB. These International Tokamak Physics Activity, ITPA, joint experiments were carried out with the same plasma shape and toroidal magnetic field ripple, and with dimensionless profiles as close as possible during the ITB triggering phase in terms of safety factor, normalized Larmor radius, normalized collision frequency, thermal beta, and ratio of ion to electron temperatures. Similarities in the ITB triggering mechanisms and sustainment were observed when a good match was achieved of the most relevant normalized profiles except the toroidal Mach number. Similar thermal ion transport levels in the two devices have been measured in either monotonic or non-monotonic q-profiles. In contrast, differences between JET and JT-60U were observed in the electron thermal and particle confinement in reversed magnetic shear configurations. It was found that the larger shear reversal in the very centre (inside normalized radius of 0.2) of JT-60U plasmas allowed the sustainment of stronger electron density ITBs compared with JET. As a consequence of the peaked density profile, the core bootstrap current density is more than five times higher in JT-60U compared with JET. Thanks to the bootstrap effect and the slightly broader neutral beam deposition, reversed magnetic shear configurations are self-sustained in JT-60U scenarios. Analyses of the similarities and differences between the two devices address key questions on the validity of the usual assumptions made in ITER steady-state scenario modelling, e.g. a flat density profile in the core despite a thermal transport barrier. Such assumptions have consequences for the prediction of fusion performance, bootstrap current and the sustainment of the scenario.
Reduced ion bootstrap current drive on NTM instability
NASA Astrophysics Data System (ADS)
Qu, Hongpeng; Wang, Feng; Wang, Aike; Peng, Xiaodong; Li, Jiquan
2018-05-01
The loss of bootstrap current inside magnetic island plays a dominant role in driving the neoclassical tearing mode (NTM) instability in tokamak plasmas. In this work, we investigate the finite-banana-width (FBW) effect on the profile of ion bootstrap current in the island vicinity via an analytical approach. The results show that even if the pressure gradient vanishes inside the island, the ion bootstrap current can partly survive due to the FBW effect. The efficiency of the FBW effect is higher when the island width becomes smaller. Nevertheless, even when the island width is comparable to the ion FBW, the unperturbed ion bootstrap current inside the island cannot be largely recovered by the FBW effect, and thus the current loss still exists. This suggests that FBW effect alone cannot dramatically reduce the ion bootstrap current drive on NTMs.
NASA Astrophysics Data System (ADS)
Cesario, R. C.; Castaldo, C.; Fonseca, A.; De Angelis, R.; Parail, V.; Smeulders, P.; Beurskens, M.; Brix, M.; Calabrò, G.; De Vries, P.; Mailloux, J.; Pericoli, V.; Ravera, G.; Zagorski, R.
2007-09-01
LHCD has been used in JET experiments aimed at producing internal transport barriers (ITBs) in highly triangular plasmas (δ≈0.4) at high βN (up to 3) for steady-state application. The LHCD is a potentially valuable tool for (i) modifying the target q-profile, which can help avoid deleterious MHD modes and favour the formation of ITBs, and (ii) contributing to the non-inductive current drive required to prolong such plasma regimes. The q-profile evolution has been simulated during the current ramp-up phase for such a discharge (B0 = 2.3 T, IP = 1.5 MA) where 2 MW of LHCD has been coupled. The JETTO code was used, taking measured plasma profiles and the LHCD profile modeled by the LHstar code. The results are in agreement with MSE measurements and indicate the importance of the elevated electron temperature due to LHCD, as well as the driven current. During main heating with 18 MW of NBI and 3 MW of ICRH the bootstrap current density at the edge also becomes large, consistent with the observed reduction of the local turbulence and of the MHD activity. JETTO modelling suggests that the bootstrap current can reduce the magnetic shear (sh) at large radius, potentially affecting the MHD stability and turbulence behaviour in this region. Keywords: lower hybrid current drive (LHCD), bootstrap current, q (safety factor) and shear (sh) profile evolutions.
NASA Astrophysics Data System (ADS)
Monticello, D. A.; Reiman, A. H.; Watanabe, K. Y.; Nakajima, N.; Okamoto, M.
1997-11-01
The existence of bootstrap currents in both tokamaks and stellarators was confirmed, experimentally, more than ten years ago. Such currents can have significant effects on the equilibrium and stability of these MHD devices. In addition, stellarators, with the notable exception of W7-X, are predicted to have such large bootstrap currents that reliable equilibrium calculations require the self-consistent evaluation of bootstrap currents. Modeling of discharges which contain islands requires an algorithm that does not assume good surfaces. Only one of the two 3-D equilibrium codes that exist, PIES [Reiman, A.H. and Greenside, H.S., Comput. Phys. Commun. 43 (1986)], can easily be modified to handle bootstrap current. Here we report on the coupling of the PIES 3-D equilibrium code and the NIFS bootstrap code [Watanabe, K. et al., Nucl. Fusion 35 (1995) 335].
Bootstrap Percolation on Homogeneous Trees Has 2 Phase Transitions
NASA Astrophysics Data System (ADS)
Fontes, L. R. G.; Schonmann, R. H.
2008-09-01
We study the threshold-θ bootstrap percolation model on the homogeneous tree with degree b+1, 2 ≤ θ ≤ b, and initial density p. It is known that there exists a nontrivial critical value for p, which we call p_f, such that (a) for p > p_f, the final bootstrapped configuration is fully occupied for almost every initial configuration, and (b) if p < p_f, then for almost every initial configuration, the final bootstrapped configuration has density of occupied vertices less than 1. In this paper, we establish the existence of a distinct critical value for p, p_c, such that 0 < p_c < p_f, with the following properties: (1) if p ≤ p_c, then for almost every initial configuration there is no infinite cluster of occupied vertices in the final bootstrapped configuration; (2) if p > p_c, then for almost every initial configuration there are infinite clusters of occupied vertices in the final bootstrapped configuration. Moreover, we show that (3) for p < p_c, the distribution of the occupied cluster size in the final bootstrapped configuration has an exponential tail; (4) at p = p_c, the expected occupied cluster size in the final bootstrapped configuration is infinite; (5) the probability of percolation of occupied vertices in the final bootstrapped configuration is continuous on [0, p_f] and analytic on (p_c, p_f), admitting an analytic continuation from the right at p_c and, only in the case θ = b, also from the left at p_f.
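The growth rule in this model (an unoccupied vertex becomes occupied once at least θ of its neighbours are occupied) can be simulated directly. The following Python sketch uses a finite b-ary tree as a truncated stand-in for the infinite homogeneous tree of the paper; depth and parameter values are illustrative assumptions, not from the paper.

```python
import random

def bootstrap_percolation_tree(depth=8, b=2, theta=2, p=0.3, seed=0):
    """Threshold-theta bootstrap percolation on a rooted b-ary tree of the
    given depth. Each vertex starts occupied with probability p; the growth
    rule is iterated to a fixed point. Returns the final occupied density."""
    rng = random.Random(seed)
    # Heap-style indexing: children of v are b*v+1 .. b*v+b.
    n = (b ** (depth + 1) - 1) // (b - 1) if b > 1 else depth + 1
    occupied = [rng.random() < p for _ in range(n)]

    def neighbours(v):
        ns = [] if v == 0 else [(v - 1) // b]      # parent
        ns += [b * v + k for k in range(1, b + 1) if b * v + k < n]  # children
        return ns

    changed = True
    while changed:  # iterate the growth rule until nothing new occupies
        changed = False
        for v in range(n):
            if not occupied[v] and sum(occupied[u] for u in neighbours(v)) >= theta:
                occupied[v] = True
                changed = True
    return sum(occupied) / n

print(bootstrap_percolation_tree())  # final density for one random start
```

Sweeping p with this sketch shows the final density jumping as p crosses the critical region, the finite-size shadow of the transitions analysed in the paper.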
Joint DIII-D/EAST Experiments Toward Steady State AT Demonstration
NASA Astrophysics Data System (ADS)
Garofalo, A. M.; Meneghini, O.; Staebler, G. M.; van Zeeland, M. A.; Gong, X.; Ding, S.; Qian, J.; Ren, Q.; Xu, G.; Grierson, B. A.; Solomon, W. M.; Holcomb, C. T.
2015-11-01
Joint DIII-D/EAST experiments on fully noninductive operation at high poloidal beta have demonstrated several attractive features of this regime for a steady-state fusion reactor. Very large bootstrap fraction (>80 %) is desirable because it reduces the demands on external noninductive current drive. High bootstrap fraction with an H-mode edge results in a broad current profile and internal transport barriers (ITBs) at large minor radius, leading to high normalized energy confinement and high MHD stability limits. The ITB radius expands with higher normalized beta, further improving both stability and confinement. Electron density ITB and large Shafranov shift lead to low AE activity in the plasma core and low anomalous fast ion losses. Both the ITB and the current profile show remarkable robustness against perturbations, without external control. Supported by US DOE under DE-FC02-04ER54698, DE-AC02-09CH11466 & DE-AC52-07NA27344 & by NMCFSP under contracts 2015GB102000 and 2015GB110001.
Prospects for steady-state scenarios on JET
NASA Astrophysics Data System (ADS)
Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; DeLa Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the
2007-09-01
In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (nl ~ 4 × 10^19 m^-3), with ITER-relevant safety factor (q95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, and a new pellet injector for edge localized mode control, together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (nl > 5 × 10^19 m^-3), with high βN (βN > 3.0) and a bootstrap current fraction of 60-70% at high toroidal field (~3.5 T).
Bootstrap current in a tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessel, C.E.
1994-03-01
The bootstrap current in a tokamak is examined by implementing the Hirshman-Sigmar model and comparing the predicted current profiles with those from two popular approximations. The dependences of the bootstrap current profile on the plasma properties are illustrated. The implications for steady-state tokamaks are presented through two constraints: the pressure profile must be peaked, and βp must be kept below a critical value.
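The βp constraint above reflects the commonly quoted zero-dimensional scaling of the bootstrap current fraction, roughly f_bs ~ C √ε βp, where ε = a/R is the inverse aspect ratio. The coefficient C is profile dependent (values around 0.4-0.7 are often cited); the value below is an illustrative assumption, not taken from the paper.

```python
import math

def bootstrap_fraction(eps, beta_p, c_bs=0.5):
    """Rough zero-dimensional estimate of the bootstrap current fraction:
    f_bs ~ c_bs * sqrt(eps) * beta_p, with eps = a/R the inverse aspect
    ratio and beta_p the poloidal beta. The coefficient c_bs (here 0.5,
    an illustrative assumption) absorbs the pressure/current profile shapes."""
    return c_bs * math.sqrt(eps) * beta_p

# A DIII-D-like inverse aspect ratio with beta_p ~ 3 already gives a large
# bootstrap fraction, consistent with the high-beta_p scenarios above.
print(round(bootstrap_fraction(eps=0.36, beta_p=3.0), 2))  # → 0.9
```

This makes the trade-off in the abstract concrete: raising βp raises f_bs toward fully non-inductive operation, but βp itself is capped by the stability constraint.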
McClenaghan, Joseph; Garofalo, Andrea M.; Meneghini, Orso; ...
2017-08-03
In this study, transport modeling of a proposed ITER steady-state scenario based on DIII-D high poloidal-beta (βp) discharges finds that ITB formation can occur with either sufficient rotation or a negative central shear q-profile. The high-βp scenario is characterized by a large bootstrap current fraction (80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier which is associated with excellent normalized confinement. Modeling predictions of the electron transport in the high-βp scenario improve as q95 approaches levels similar to typical existing models of ITER steady-state, and the ion transport is turbulence dominated. Typical temperature and density profiles from the non-inductive high-βp scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving a Q = 5 steady-state fusion gain in ITER with 'day one' heating and current drive capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. A high bootstrap fraction, high-βp scenario is found to be near an ITB formation threshold, and either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to achieve Q = 5.
Control of bootstrap current in the pedestal region of tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796; Lai, A. L.
2013-12-15
The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_p,m that consists of the poloidal components of the E×B flow and the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit where ‖U_p,m‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_p,m and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_p,m‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join results in the banana and plateau/Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Three-dimensional magnetohydrodynamic equilibrium of quiescent H-modes in tokamak systems
NASA Astrophysics Data System (ADS)
Cooper, W. A.; Graves, J. P.; Duval, B. P.; Sauter, O.; Faustin, J. M.; Kleiner, A.; Lanthaler, S.; Patten, H.; Raghunathan, M.; Tran, T.-M.; Chapman, I. T.; Ham, C. J.
2016-06-01
Three-dimensional free-boundary magnetohydrodynamic equilibria that recover saturated ideal kink/peeling structures are obtained numerically. Simulations that model the JET tokamak at fixed ⟨β⟩ = 1.7% with a large edge bootstrap current that flattens the q-profile near the plasma boundary demonstrate that a radial parallel current density ribbon with a dominant m/n = 5/1 Fourier component at I_t = 2.2 MA develops into a broadband spectrum when the toroidal current I_t is increased to 2.5 MA.
From current-driven to neoclassically driven tearing modes.
Reimerdes, H; Sauter, O; Goodman, T; Pochelon, A
2002-03-11
In the TCV tokamak, the m/n = 2/1 island is observed in low-density discharges with central electron-cyclotron current drive. The evolution of its width has two distinct growth phases, one of which can be linked to a "conventional" tearing mode driven unstable by the current profile and the other to a neoclassical tearing mode driven by a perturbation of the bootstrap current. The TCV results provide the first clear observation of such a destabilization mechanism and reconcile the theory of conventional and neoclassical tearing modes, which differ only in the dominant driving term.
Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)
NASA Astrophysics Data System (ADS)
Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.
2017-10-01
Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking into account, in addition to real-time Motional Stark Effect (MSE) and magnetics data, real-time Thomson Scattering (TS) and real-time Charge Exchange Recombination (CER, still in development) data. Electron densities and temperatures are determined by TS, while ion densities and pressures are determined using CER. These form, together with the temperature and density of neutrals, the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostic. The pedestal current density is estimated using Sauter's formula for the bootstrap current density. By comparing the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') from CAKE with those from magnetics-only reconstructions, it can be seen that using these diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.
Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data
Bakun, W.H.; Gomez, Capera A.; Stucchi, M.
2011-01-01
Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. 
The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
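The core bootstrap step described above, resampling the intensity-derived magnitudes and reading the preferred value and 68% range off the resampled distribution, can be sketched as follows. The magnitude values and resample count below are invented for illustration; the paper's actual procedure combines three techniques with Monte Carlo sampling of a weighted decision tree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-technique magnitude estimates derived from intensity data
# (illustrative values only, not from the paper's catalogue).
magnitudes = np.array([5.1, 5.3, 4.9, 5.4, 5.2, 5.0, 5.5, 5.1, 5.3, 5.2])

n_boot = 2000
boot_medians = np.empty(n_boot)
for i in range(n_boot):
    # Resample the dataset with replacement and take the median magnitude.
    resample = rng.choice(magnitudes, size=magnitudes.size, replace=True)
    boot_medians[i] = np.median(resample)

m_pref = np.median(boot_medians)                 # preferred magnitude
lo, hi = np.percentile(boot_medians, [16, 84])   # ~68% uncertainty range
print(f"M = {m_pref:.2f}  (68% range: {lo:.2f}-{hi:.2f})")
```

The same percentile logic extends to the 2D location case, where density contours of the bootstrap locations play the role of the percentiles here.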
Edge Currents and Stability in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, D M; Fenstermacher, M E; Finkenthal, D K
2004-12-01
Understanding the stability physics of the H-mode pedestal in tokamak devices requires an accurate measurement of plasma current in the pedestal region with good spatial resolution. Theoretically, the high pressure gradients achieved in the edge of H-mode plasmas should lead to generation of a significant edge current density peak through bootstrap and Pfirsch-Schlüter effects. This edge current is important for the achievement of second stability in the context of coupled magnetohydrodynamic (MHD) modes which are both pressure (ballooning) and current (peeling) driven. Many aspects of edge localized mode (ELM) behavior can be accounted for in terms of an edge current density peak, with the identification of Type I ELMs as intermediate-n toroidal mode number MHD modes being a natural feature of this model. The development of the edge localized instabilities in tokamak experiments code (ELITE) based on this model allows one to efficiently calculate the stability and growth of the relevant modes for a broad range of plasma parameters and thus provides a framework for understanding the limits on pedestal height. This, however, requires an accurate assessment of the edge current. While estimates of j_edge can be made based on specific bootstrap models, their validity may be limited in the edge (gradient scale lengths comparable to orbit size, large changes in collisionality, etc.). Therefore it is highly desirable to have an actual measurement. Such measurements have been made on the DIII-D tokamak using combined polarimetry and spectroscopy of an injected lithium beam. By analyzing one of the Zeeman-split 2S-2P lithium resonance line components, one can obtain direct information on the local magnetic field components. These values allow one to infer details of the edge current density.
Because of the negligible Stark mixing of the relevant atomic levels in lithium, this method of determining j(r) is insensitive to the large local electric fields typically found in enhanced-confinement (H-mode) edges, and thus avoids an ambiguity common to MSE measurements of B_pol.
Edge Currents and Stability in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, D M; Fenstermacher, M E; Finkenthal, D K
2005-05-05
Understanding the stability physics of the H-mode pedestal in tokamak devices requires an accurate measurement of plasma current in the pedestal region with good spatial resolution. Theoretically, the high pressure gradients achieved in the edge of H-mode plasmas should lead to generation of a significant edge current density peak through bootstrap and Pfirsch-Schlüter effects. This edge current is important for the achievement of second stability in the context of coupled magnetohydrodynamic (MHD) modes which are both pressure (ballooning) and current (peeling) driven [1]. Many aspects of edge localized mode (ELM) behavior can be accounted for in terms of an edge current density peak, with the identification of Type I ELMs as intermediate-n toroidal mode number MHD modes being a natural feature of this model [2]. The development of the edge localized instabilities in tokamak experiments code (ELITE) based on this model allows one to efficiently calculate the stability and growth of the relevant modes for a broad range of plasma parameters [3,4] and thus provides a framework for understanding the limits on pedestal height. This, however, requires an accurate assessment of the edge current. While estimates of j_edge can be made based on specific bootstrap models, their validity may be limited in the edge (gradient scale lengths comparable to orbit size, large changes in collisionality, etc.). Therefore it is highly desirable to have an actual measurement. Such measurements have been made on the DIII-D tokamak using combined polarimetry and spectroscopy of an injected lithium beam [5,6]. By analyzing one of the Zeeman-split 2S-2P lithium resonance line components, one can obtain direct information on the local magnetic field components. These values allow one to infer details of the edge current density.
Because of the negligible Stark mixing of the relevant atomic levels in lithium, this method of determining j(r) is insensitive to the large local electric fields typically found in enhanced-confinement (H-mode) edges, and thus avoids an ambiguity common to MSE measurements of B_pol.
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained (i) from the Gaussian density stemming from the asymptotic normality of the maximum likelihood estimator and (ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database, covering a large region of 100,000 km² in southern France with contrasting rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated over durations in the range 3 h-120 h. We show that (i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, (ii) the posterior and bootstrap densities adjust uncertainty estimation to the data better than the Gaussian density, and (iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Our recommendation therefore goes towards the use of the Bayesian framework to compute uncertainty.
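A minimal sketch of the frequentist bootstrap branch, reduced here to a plain Gumbel fit of annual maxima rather than the paper's full simple-scaling GEV models. All data and settings below are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic annual maxima of hourly rainfall (mm); illustrative only.
annual_max = stats.gumbel_r.rvs(loc=30.0, scale=8.0, size=40, random_state=rng)

def return_level(sample, T=100):
    """T-year return level from a Gumbel fit (simple-scaling details omitted)."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

point = return_level(annual_max)

# Percentile bootstrap: refit the model on resampled maxima.
boot = np.array([
    return_level(rng.choice(annual_max, size=annual_max.size, replace=True))
    for _ in range(1000)
])
lo, hi = np.percentile(boot, [5, 95])  # 90% bootstrap interval
print(f"100-year level: {point:.1f} mm  (90% CI: {lo:.1f}-{hi:.1f} mm)")
```

The bootstrap interval widens sharply with the return period, which is the regime where the paper finds percentile-bootstrap intervals become unreasonable compared with the Bayesian posterior.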
1984-09-28
[Garbled excerpt; recoverable fragments:] …variables before simulation of model… …express uncertainty as a probability density distribution… …the prior probability that the software contains errors is updated as test failure data are accumulated… …both parametric and nonparametric versions are presented; it is shown by the author that the bootstrap underlies the jackknife method and…
NASA Astrophysics Data System (ADS)
Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Sanchez, R.; Tribaldos, V.; Geiger, J.
2018-02-01
The recently developed free-plasma-boundary version of the SIESTA MHD equilibrium code (Hirshman et al 2011 Phys. Plasmas 18 062504; Peraza-Rodriguez et al 2017 Phys. Plasmas 24 082516) is used for the first time to study scenarios with considerable bootstrap currents for the Wendelstein 7-X (W7-X) stellarator. Bootstrap currents in the range of tens of kA can lead to the formation of unwanted magnetic island chains or stochastic regions within the plasma and can alter the boundary rotational transform due to the small shear in W7-X. The latter issue is of relevance since the island divertor operation of W7-X relies on a proper positioning of magnetic island chains at the plasma edge to control the particle and energy exhaust towards the divertor plates. Two scenarios are examined with the new free-plasma-boundary capabilities of SIESTA: one with a freely evolving bootstrap current, which illustrates the difficulties arising from the dislocation of the boundary islands, and a second in which off-axis electron cyclotron current drive (ECCD) is applied to compensate the effects of the bootstrap current and keep the island divertor configuration intact. SIESTA finds that off-axis ECCD is indeed able to keep the location and phase of the edge magnetic island chain unchanged, but it may also lead to an undesired stochastization of parts of the confined plasma if the EC deposition radial profile becomes too narrow.
ERIC Educational Resources Information Center
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.
Chung, SungWon; Lu, Ying; Henry, Roland G
2006-11-01
Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method used for DTI analysis (repetition bootstrap) performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, wild bootstrap was proposed, which can be applied without multiple acquisitions. In this paper, two new approaches are introduced, called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and therefore better estimates the standard errors. Like wild bootstrap, residual bootstrap is applicable to single-acquisition schemes; both are based on regression residuals (so-called model-based resampling). Residual bootstrap rests on the assumption that the non-constant variance of the measured diffusion-attenuated signals can be modeled, which is in fact the assumption behind the widely used weighted least squares solution of the diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of the bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that residual bootstrap has smaller biases and overall errors, enabling estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help in choosing the optimal approach for estimating uncertainties, benefiting hypothesis testing based on DTI parameters, probabilistic fiber tracking, and the optimization of DTI methods.
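The residual bootstrap idea can be sketched on a toy mono-exponential diffusion decay in place of the full tensor fit: fit the model, resample the regression residuals, add them back to the fitted values, and refit. All values below are invented; a real DTI analysis would use the weighted least squares tensor model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy mono-exponential decay: ln S = ln S0 - b*D (a stand-in for the
# full tensor fit; b-values and noise level are illustrative).
b = np.array([0.0, 250.0, 500.0, 750.0, 1000.0, 1250.0])  # s/mm^2
D_true, S0_true = 1.0e-3, 1000.0
log_S = np.log(S0_true) - b * D_true + rng.normal(0, 0.02, b.size)

X = np.column_stack([np.ones_like(b), -b])        # design matrix
beta, *_ = np.linalg.lstsq(X, log_S, rcond=None)  # beta = [ln S0, D]
fitted = X @ beta
residuals = log_S - fitted

# Residual bootstrap: resample residuals, add to fitted values, refit.
n_boot = 2000
D_boot = np.empty(n_boot)
for i in range(n_boot):
    y_star = fitted + rng.choice(residuals, size=residuals.size, replace=True)
    D_boot[i] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

print(f"D = {beta[1]:.2e}, bootstrap SE = {D_boot.std(ddof=1):.2e}")
```

The key point is that only one acquisition per gradient is needed, since the resampling acts on model residuals rather than on repeated measurements.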
NASA Astrophysics Data System (ADS)
Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin
2018-04-01
Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of driven current required for mode stabilization and on the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on several parameters, including the bootstrap current density, the radial width of the driven current, the radial deviation of the driven current from the resonant surface, and the island width when ECCD is applied, is studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out; the corresponding results show the same trend as the numerical ones, although a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.
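For context, a commonly quoted schematic form of the modified Rutherford equation with bootstrap drive and ECCD stabilization terms is the following; the coefficients and small-island cutoff below are generic placeholders, not necessarily the exact version used in this work:

```latex
\frac{\tau_R}{r_s}\frac{dw}{dt} \;=\; r_s\,\Delta'(w)
\;+\; a_{\mathrm{bs}}\,\frac{j_{\mathrm{bs}}}{j_\parallel}\,\frac{r_s\,w}{w^2 + w_d^2}
\;-\; a_{\mathrm{ec}}\,\eta_{\mathrm{ec}}\,\frac{j_{\mathrm{ec}}}{j_\parallel}\,\frac{r_s}{w}
```

Here $\tau_R$ is the resistive time, $w$ the island width, $r_s$ the resonant-surface radius, $\Delta'$ the classical stability index, $j_{\mathrm{bs}}$ and $j_{\mathrm{ec}}$ the bootstrap and ECCD current densities, $\eta_{\mathrm{ec}}$ an ECCD stabilization efficiency, and $w_d$ a small-island cutoff. The minimum driven current for stabilization follows from requiring $dw/dt < 0$ for all $w$.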
A neural network based reputation bootstrapping approach for service selection
NASA Astrophysics Data System (ADS)
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; with this kind of method, however, either newcomers or existing services are favoured. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and the performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new, unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping when available. The proposed approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented.
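A minimal sketch of the core idea: learn a feature-to-reputation mapping on existing services, then apply it to a newcomer. The paper's actual ANN architecture, feature set, and training data are not specified here; everything below (a hand-rolled one-hidden-layer network on synthetic data) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: each existing service has a feature vector (e.g. price, QoS
# claims) and an observed reputation score; all values are synthetic.
n = 200
X = rng.random((n, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * rng.random(n)

# One-hidden-layer network trained by plain gradient descent on MSE.
W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, 8), 0.0
lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / n
    gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h**2)     # backprop through tanh
    gW1 = X.T @ gh / n
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Bootstrap a tentative reputation for a brand-new service from its features.
new_service = np.array([0.9, 0.8, 0.1])
tentative = np.tanh(new_service @ W1 + b1) @ W2 + b2
print(f"tentative reputation: {tentative:.2f}")
```

The tentative score would then seed the service's entry in the reputation model until real feedback accumulates.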
High performance advanced tokamak regimes in DIII-D for next-step experiments
NASA Astrophysics Data System (ADS)
Greenfield, C. M.; Murakami, M.; Ferron, J. R.; Wade, M. R.; Luce, T. C.; Petty, C. C.; Menard, J. E.; Petrie, T. W.; Allen, S. L.; Burrell, K. H.; Casper, T. A.; DeBoo, J. C.; Doyle, E. J.; Garofalo, A. M.; Gorelov, I. A.; Groebner, R. J.; Hobirk, J.; Hyatt, A. W.; Jayakumar, R. J.; Kessel, C. E.; La Haye, R. J.; Jackson, G. L.; Lohr, J.; Makowski, M. A.; Pinsker, R. I.; Politzer, P. A.; Prater, R.; Strait, E. J.; Taylor, T. S.; West, W. P.; DIII-D Team
2004-05-01
Advanced Tokamak (AT) research in DIII-D [K. H. Burrell for the DIII-D Team, in Proceedings of the 19th Fusion Energy Conference, Lyon, France, 2002 (International Atomic Energy Agency, Vienna, 2002) published on CD-ROM] seeks to provide a scientific basis for steady-state high performance operation in future devices. These regimes require high toroidal beta to maximize fusion output and high poloidal beta to maximize the self-driven bootstrap current. Achieving these conditions requires integrated, simultaneous control of the current and pressure profiles, and active magnetohydrodynamic stability control. The building blocks for AT operation are in hand. Resistive wall mode stabilization via plasma rotation and active feedback with nonaxisymmetric coils allows routine operation above the no-wall beta limit. Neoclassical tearing modes are stabilized by active feedback control of localized electron cyclotron current drive (ECCD). Plasma shaping and profile control provide further improvements. Under these conditions, bootstrap supplies most of the current. Steady-state operation requires replacing the remaining Ohmic current, mostly located near the half radius, with noninductive external sources. In DIII-D this current is provided by ECCD, and nearly stationary AT discharges have been sustained with little remaining Ohmic current. Fast wave current drive is being developed to control the central magnetic shear. Density control, with divertor cryopumps, of AT discharges with ELMing H-mode edges facilitates high current drive efficiency at reactor-relevant collisionalities. A sophisticated plasma control system allows integrated control of these elements. Close coupling between modeling and experiment is key to understanding the separate elements, their complex nonlinear interactions, and their integration into self-consistent high performance scenarios.
Progress on this development, and its implications for next-step devices, will be illustrated by results of recent experiment and simulation efforts.
Kinetic effects on the currents determining the stability of a magnetic island in tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poli, E., E-mail: emanuele.poli@ipp.mpg.de; Bergmann, A.; Casson, F. J.
The role of the bootstrap and polarization currents for the stability of neoclassical tearing modes is investigated employing both a drift-kinetic and a gyrokinetic approach. The adiabatic response of the ions around the island separatrix implies, for island widths below or around the ion thermal banana width, density flattening for islands rotating at the ion diamagnetic frequency, while for islands rotating at the electron diamagnetic frequency the density is unperturbed and the only contribution to the neoclassical drive arises from electron temperature flattening. As for the polarization current, the full inclusion of finite orbit width effects in the calculation of the potential developing in a rotating island leads to a smoothing of the discontinuous derivatives exhibited by the analytic potential on which the polarization term used in the modeling is based. This leads to a reduction of the polarization-current contribution with respect to the analytic estimate, in line with other studies. Other contributions to the perpendicular ion current, related to the response of the particles around the island separatrix, are found to compete with or even dominate the polarization-current term for realistic island rotation frequencies.
mBEEF-vdW: Robust fitting of error estimation density functionals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes
Here, we propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and a geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.
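The standard .632 bootstrap estimator, without the paper's hierarchical sampling and geometric-mean refinements, can be sketched as follows: it blends the optimistic training (apparent) error with the pessimistic out-of-bag error. The toy regression data below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear-regression data standing in for a training set of model errors.
X = rng.normal(size=(60, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true + rng.normal(0, 0.3, 60)

def fit(Xt, yt):
    return np.linalg.lstsq(Xt, yt, rcond=None)[0]

def mse(w, Xt, yt):
    return np.mean((Xt @ w - yt) ** 2)

# err_632 = 0.368 * apparent error + 0.632 * mean out-of-bag (OOB) error.
train_err = mse(fit(X, y), X, y)
oob_errs = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))            # bootstrap sample
    oob = np.setdiff1d(np.arange(len(y)), idx)       # left-out points
    if oob.size == 0:
        continue
    w = fit(X[idx], y[idx])
    oob_errs.append(mse(w, X[oob], y[oob]))
err_632 = 0.368 * train_err + 0.632 * np.mean(oob_errs)
print(f"apparent error {train_err:.3f}, .632 estimate {err_632:.3f}")
```

The 0.632 weight is the asymptotic fraction of distinct points appearing in a bootstrap sample, which is why the OOB error dominates the blend.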
mBEEF-vdW: Robust fitting of error estimation density functionals
Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes; ...
2016-06-15
Here, we propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and a geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.
The Relationships Between ELM Suppression, Pedestal Profiles, and Lithium Wall Coatings in NSTX
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.P. Boyle, R. Maingi, P.B. Snyder, J. Manickam, T.H. Osborne, R.E. Bell, B.P. LeBlanc, and the NSTX Team
2012-08-17
Recently in the National Spherical Torus Experiment (NSTX), increasing lithium wall coatings suppressed edge localized modes (ELMs), gradually but not quite monotonically. This work details profile and stability analysis as ELMs disappeared throughout the lithium scan. While the quantity of lithium deposited between discharges did not uniquely determine the presence of ELMs, profile analysis demonstrated that lithium was correlated with wider density and pressure pedestals with peak gradients farther from the separatrix. Moreover, the ELMy and ELM-free discharges were cleanly separated by their density and pedestal widths and peak gradient locations. Ultimately, ELMs were only suppressed when lithium caused the density pedestal to widen and shift inward. These changes in the density gradient were directly reflected in the pressure gradient and calculated bootstrap current. This supports the theory that ELMs in NSTX are caused by peeling and/or ballooning modes, as kink/peeling modes are stabilized when the edge current and pressure gradient shift away from the separatrix. Edge stability analysis using ELITE corroborated this picture, as reconstructed equilibria from ELM-free discharges were generally farther from their kink/peeling stability boundaries than ELMy discharges. We conclude that density profile control provided by lithium is the key first step to ELM suppression in NSTX.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyle, D. P.; Maingi, R.; Snyder, P. B.
2011-01-01
Recently in the National Spherical Torus Experiment (NSTX), increasing lithium wall coatings suppressed edge localized modes (ELMs), gradually but not quite monotonically. This work details profile and stability analysis as ELMs disappeared throughout the lithium scan. While the quantity of lithium deposited between discharges did not uniquely determine the presence of ELMs, profile analysis demonstrated that lithium was correlated with wider density and pressure pedestals with peak gradients farther from the separatrix. Moreover, the ELMy and ELM-free discharges were cleanly separated by their density and pedestal widths and peak gradient locations. Ultimately, ELMs were only suppressed when lithium caused the density pedestal to widen and shift inward. These changes in the density gradient were directly reflected in the pressure gradient and calculated bootstrap current. This supports the theory that ELMs in NSTX are caused by peeling and/or ballooning modes, as kink/peeling modes are stabilized when the edge current and pressure gradient shift away from the separatrix. Edge stability analysis using ELITE corroborated this picture, as reconstructed equilibria from ELM-free discharges were generally farther from their kink/peeling stability boundaries than ELMy discharges. We conclude that density profile control provided by lithium is the key first step to ELM suppression in NSTX.
Multi-baseline bootstrapping at the Navy Precision Optical Interferometer
NASA Astrophysics Data System (ADS)
Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.
2014-07-01
The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally-spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
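The essence of baseline bootstrapping is that fringe phases add along a chain of array elements, so the phase on a long, poorly tracked baseline can be inferred from the tracked phases of the short baselines that compose it. The sketch below is a deliberately simplified idealization with invented numbers; real fringe tracking operates on noisy complex visibilities, not clean phases.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy tracked fringe phases (radians) on two short, high-SNR baselines
# A-B and B-C (values are illustrative).
phi_AB, phi_BC = 0.8, -0.3

# Per-baseline phase noise from atmosphere/instrument (illustrative sigma).
noise = rng.normal(0, 0.05, 2)

# Bootstrapped phase on the long baseline A-C, whose own fringe may be
# too weak to track directly: phases sum along the chain.
phi_AC = (phi_AB + noise[0]) + (phi_BC + noise[1])
print(f"bootstrapped phase on A-C: {phi_AC:.2f} rad")
```

Chaining more than two baselines works the same way, at the cost of the noise terms accumulating along the chain, which is why equally-spaced, high-SNR elements matter.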
Suppressing magnetic island growth by resonant magnetic perturbation
NASA Astrophysics Data System (ADS)
Yu, Q.; Günter, S.; Lackner, K.
2018-05-01
The effect of externally applied resonant magnetic perturbations (RMPs) on the growth of magnetic islands is investigated based on two-fluid equations. It is found that if the local bi-normal electron fluid velocity at the resonant surface is sufficiently large, static RMPs of the same helicity and of moderate amplitude can suppress the growth of magnetic islands in high-temperature plasmas. These islands would otherwise grow, driven by an unfavorable plasma current density profile and the bootstrap current perturbation. These results indicate that the error field can stabilize island growth if the error field amplitude is not too large and the local bi-normal electron fluid velocity is not too low. They also indicate that applied rotating RMPs with an appropriate frequency can be utilized to suppress island growth in high-temperature plasmas, even for a low bi-normal electron fluid velocity. A significant change in the local equilibrium plasma current density gradient by small-amplitude RMPs is found for realistic plasma parameters, which is important for the island stability and is expected to be even more important for fusion reactors with low plasma resistivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medvedev, S. Yu., E-mail: medvedev@a5.kiam.ru; Ivanov, A. A., E-mail: aai@a5.kiam.ru; Martynov, A. A., E-mail: martynov@a5.kiam.ru
The influence of current density and pressure gradient profiles in the pedestal on the access to regimes free from edge localized modes (ELMs), such as the quiescent H-mode in ITER, is investigated. Using the simulator of MHD modes localized near the plasma boundary based on the KINX code, ELM stability calculations were performed for the ITER plasma in scenarios 2 and 4 under variations of the density and temperature profiles with the self-consistent bootstrap current in the pedestal. Low pressure gradient values at the separatrix, the same position of the density and temperature pedestals, and high poloidal beta values facilitate reaching high current density in the pedestal and a potential transition into the regime with saturated large-scale kink modes. A new version of the localized MHD mode simulator allows one to compute the growth rates of ideal peeling-ballooning modes with different toroidal mode numbers and to determine the stability region taking into account diamagnetic stabilization. The edge stability diagram computations and sensitivity studies of the stability limits to the value of the diamagnetic frequency show that diamagnetic stabilization of the modes with high toroidal mode numbers can help to access the quiescent H-mode even at high plasma density, but only with low pressure gradient values at the separatrix. The limiting pressure at the top of the pedestal increases for higher plasma density. With a flat density profile, the access to the quiescent H-mode is closed even with diamagnetic stabilization taken into account, while the toroidal mode numbers of the most unstable peeling-ballooning mode decrease from n = 10−40 to n = 3−20.
Comparison of parametric and bootstrap method in bioequivalence test.
Ahn, Byung-Jin; Yim, Dong-Seok
2009-10-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
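A sketch of the comparison between a parametric 90% CI and a percentile-bootstrap 90% CI on the log-scale formulation effect; all data below are simulated, not the paper's archived datasets, and the paper additionally considers BCa intervals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical paired log(AUC) differences, log(Test) - log(Reference),
# for 24 subjects (values are invented for illustration).
diff_logAUC = rng.normal(loc=0.05, scale=0.15, size=24)
n = diff_logAUC.size

# Parametric 90% CI: t-interval on the mean log-difference, back-transformed.
m = diff_logAUC.mean()
se = diff_logAUC.std(ddof=1) / np.sqrt(n)
t = stats.t.ppf(0.95, n - 1)
param_ci = np.exp([m - t * se, m + t * se]) * 100   # as % of reference

# Nonparametric percentile-bootstrap 90% CI.
boot_means = np.array([
    rng.choice(diff_logAUC, size=n, replace=True).mean()
    for _ in range(2000)
])
boot_ci = np.exp(np.percentile(boot_means, [5, 95])) * 100

print(f"parametric 90% CI: {param_ci.round(1)}%, bootstrap: {boot_ci.round(1)}%")
# Bioequivalence under the 80-125% rule requires the CI to lie within 80-125%.
```

When the log-differences are genuinely normal the two intervals nearly coincide; the paper's point is that they can diverge when normality fails.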
Comparison of Parametric and Bootstrap Method in Bioequivalence Test
Ahn, Byung-Jin
2009-01-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption. PMID:19915699
Bootstrap Current for the Edge Pedestal Plasma in a Diverted Tokamak Geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, S.; Chang, C. S.; Ku, S.
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property compared with the core plasma. It has a narrow passing-particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code, XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with the ion contribution, and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in the plateau-collisional regime, there is significant deviation of the numerical results from the existing analytic formulas, mainly due to the large effective collisionality of the passing and boundary-layer trapped particles in the edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high.
On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, as a simple modification to the Sauter formula, is obtained to bring the analytic expression to a better agreement with the edge kinetic simulation results.
mBEEF-vdW: Robust fitting of error estimation density functionals
NASA Astrophysics Data System (ADS)
Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes; Jacobsen, Karsten W.; Bligaard, Thomas
2016-06-01
We propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework [J. Wellendorff et al., Phys. Rev. B 85, 235149 (2012), 10.1103/PhysRevB.85.235149; J. Wellendorff et al., J. Chem. Phys. 140, 144107 (2014), 10.1063/1.4870397]. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10 % improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.
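The bootstrap 0.632 estimator that the authors generalize blends the optimistic resubstitution (training) error with the pessimistic leave-one-out bootstrap error as err_.632 = 0.368·err_train + 0.632·err_oob. A minimal sketch on a toy least-squares problem with synthetic data; the hierarchical bootstrap sampling and geometric-mean refinements introduced in the paper are not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data (illustrative only)
x = rng.uniform(0, 1, 80)
y = 2.0 * x + rng.normal(0, 0.1, 80)

def fit_predict(xtr, ytr, xte):
    # Ordinary least-squares line fit
    a, b = np.polyfit(xtr, ytr, 1)
    return a * xte + b

# Resubstitution error: train and test on the same data (optimistic)
err_train = np.mean((y - fit_predict(x, y, x)) ** 2)

# Leave-one-out bootstrap error: test only on samples left out of each draw
n, errs = len(x), []
for _ in range(200):
    idx = rng.integers(0, n, n)                  # bootstrap sample (with replacement)
    oob = np.setdiff1d(np.arange(n), idx)        # out-of-bag indices
    if oob.size == 0:
        continue
    pred = fit_predict(x[idx], y[idx], x[oob])
    errs.append(np.mean((y[oob] - pred) ** 2))
err_oob = np.mean(errs)

# The 0.632 estimator interpolates between the two error estimates
err_632 = 0.368 * err_train + 0.632 * err_oob
print(err_632)
```

By construction the estimate lies between the two component errors, which is the property that makes it a less biased proxy for prediction error than either alone.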
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most frequently used statistical methods in the climate sciences. Various methods are used to estimate a confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), whose main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique that essentially performs a second bootstrap loop, resampling from the bootstrap resamples. It offers, like the non-calibrated bootstrap confidence intervals, robustness against the data distribution. A pairwise moving-block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with that of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20.
One form of climate time series is output from numerical models that simulate the climate system. The method is applied to model data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables at a 10-year lag, which is roughly the time it takes Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
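The pairwise moving-block bootstrap at the core of this approach can be sketched as below. The series are synthetic, the block length is fixed rather than tuned to the decorrelation time, and the calibration (second bootstrap loop) of PearsonT3 is omitted; only the basic resampling idea is shown.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical serially correlated series sharing a common signal
n = 200
common = np.cumsum(rng.normal(size=n)) * 0.1
x = common + rng.normal(size=n)
y = common + rng.normal(size=n)

def moving_block_indices(n, block_len, rng):
    """Index vector for a pairwise moving-block bootstrap resample.

    Blocks of consecutive indices are drawn with replacement and
    concatenated, preserving short-range serial correlation."""
    starts = rng.integers(0, n - block_len + 1, size=n // block_len + 1)
    idx = np.concatenate([np.arange(s, s + block_len) for s in starts])
    return idx[:n]

block_len = 10  # in practice, scale with the persistence time of the data
boot_r = []
for _ in range(2000):
    idx = moving_block_indices(n, block_len, rng)
    boot_r.append(np.corrcoef(x[idx], y[idx])[0, 1])  # same idx for both series

lo, hi = np.percentile(boot_r, [2.5, 97.5])
print(f"r = {np.corrcoef(x, y)[0, 1]:.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```

Resampling both series with the *same* block indices (the "pairwise" part) is what keeps the cross-correlation structure intact inside each resample.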
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. Most previous studies have applied conventional data envelopment analysis (DEA) models. However, conventional DEA is deterministic and does not allow identification of the environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results show that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
NASA Astrophysics Data System (ADS)
McClenaghan, J.; Garofalo, A. M.; Meneghini, O.; Smith, S. P.
2016-10-01
Transport modeling of a proposed ITER steady-state scenario based on DIII-D high-βP discharges finds that core confinement may be improved with either sufficient rotation or a negative central shear q-profile. The high poloidal beta scenario is characterized by a large bootstrap current fraction (~80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier, which is associated with improved normalized confinement. Typical temperature and density profiles from the non-inductive high poloidal beta scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving Q=5 steady-state performance in ITER with "day one" H&CD capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. Either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to maintain the temperature and density profiles. This work supported by the US Department of Energy under DE-FC02-04ER54698.
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, W P; Burrell, K H; Casper, T A
2004-12-03
The quiescent H (QH) mode, an edge localized mode (ELM)-free, high-confinement mode, combines well with an internal transport barrier to form quiescent double barrier (QDB) stationary-state, high-performance plasmas. The QH-mode edge pedestal pressure is similar to that seen in ELMing phases of the same discharge, with similar global energy confinement. The pedestal density in early ELMing phases of strongly pumped counter-injection discharges drops and a transition to QH-mode occurs, leading to lower calculated edge bootstrap current. Plasma current ramp experiments and ELITE code modeling of edge stability suggest that QH-modes lie near an edge current stability boundary. At high triangularity, QH-mode discharges operate at higher pedestal density and pressure, and have achieved ITER-level values of β_PED and ν*. The QDB achieves performance of β_N H_89 ~ 7 in quasi-stationary conditions for a duration of 10 τ_E, limited by hardware. Recently we demonstrated stationary-state QDB discharges with little change in kinetic and q profiles (q_0 > 1) for 2 s, comparable to ELMing "hybrid scenarios", yet without the debilitating effects of ELMs. Plasma profile control tools, including electron cyclotron heating and current drive and neutral beam heating, have been demonstrated to control simultaneously the q-profile development, density peaking, impurity accumulation, and plasma beta.
Hager, Robert; Chang, C. S.
2016-04-08
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics over the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator, which conserves mass, momentum, and energy, is used instead of Koh et al.'s linearized collision operator, in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
Investigation of the n = 1 resistive wall modes in the ITER high-mode confinement
NASA Astrophysics Data System (ADS)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2017-06-01
The n = 1 resistive wall mode (RWM) stability of ITER high-mode confinement is investigated with bootstrap current included for equilibrium, together with the rotation and diamagnetic drift effects for stability. Here, n is the toroidal mode number. We use the CORSICA code for computing the free boundary equilibrium and AEGIS code for stability. We find that the inclusion of bootstrap current for equilibrium is critical. It can reduce the local magnetic shear in the pedestal, so that the infernal mode branches can develop. Consequently, the n = 1 modes become unstable without a stabilizing wall at a considerably lower beta limit, driven by the steep pressure gradient in the pedestal. Typical values of the wall position stabilize the ideal mode, but give rise to the ‘pedestal’ resistive wall modes. We find that the rotation can contribute a stabilizing effect on RWMs and the diamagnetic drift effects can further improve the stability in the co-current rotation case. But, generally speaking, the rotation stabilization effects are not as effective as the case without including the bootstrap current effects on equilibrium. We also find that the diamagnetic drift effects are actually destabilizing when there is a counter-current rotation.
Carving out the end of the world or (superconformal bootstrap in six dimensions)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Chi-Ming; Lin, Ying-Hsuan
2017-08-29
We bootstrap N=(1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E_8 flavor group, we present universal bounds on the central charge C_T and the flavor central charge C_J. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on C_J, and numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS_7 × S^4/Z_2.
Innovation cascades: artefacts, organization and attributions
2016-01-01
Innovation cascades inextricably link the introduction of new artefacts, transformations in social organization, and the emergence of new functionalities and new needs. This paper describes a positive feedback dynamic, exaptive bootstrapping, through which these cascades proceed, and the characteristics of the relationships in which the new attributions that drive this dynamic are generated. It concludes by arguing that the exaptive bootstrapping dynamic is the principal driver of our current Innovation Society. PMID:26926284
NASA Astrophysics Data System (ADS)
Chatthong, B.; Onjun, T.
2016-01-01
A set of heat and particle transport equations with the inclusion of E × B flow and magnetic shear is used to understand the formation and behavior of edge transport barriers (ETBs) and internal transport barriers (ITBs) in tokamak plasmas, based on a two-field bifurcation concept. A simple model that can describe the E × B flow shear and magnetic shear effects in tokamak plasma is used for anomalous transport suppression, with the effect of bootstrap current included. Consequently, the conditions and formation of ETBs and ITBs can be visualized and studied. The ETB formation depends sensitively on the E × B flow shear suppression, with only a small dependence on the magnetic shear suppression. However, the ITB formation depends sensitively on the magnetic shear suppression, with a small dependence on the E × B flow shear suppression. Once the H-mode is achieved, the s-curve bifurcation diagram is modified by an increase of bootstrap current at the plasma edge, resulting in reductions of both the L-H and H-L transition thresholds with stronger hysteresis effects. It is also found that both the ITB and ETB widths appear to be governed by the heat or particle sources and the location of the current peaking. In addition, at a marginal flux just below the L-H threshold, a small perturbation in terms of a heat or density fluctuation can trigger a transition, which can persist after the perturbation is removed due to the hysteresis effect.
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
NASA Astrophysics Data System (ADS)
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady-state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long-pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, β_p ≥ 4, have been sustained at β_T ≥ 2% for long durations with excellent energy confinement quality (H_98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTf). A large bootstrap fraction (f_BS ~ 80%) has been obtained at high β_p. ITBs have been shown to be compatible with steady-state operation. Because of the unusually large ITB radius, normalized pressure is not limited to low β_N values by internal ITB-driven modes. β_N up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows that first-principles NEO is in good agreement with experiments for the bootstrap current calculation, and that ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport.
Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC02-09CH11466, and the NMCFP of China under 2015GB110000 and 2015GB102000.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Peng, Yueng Kay Martin
Transport theory for potato orbits in the region near the magnetic axis in an axisymmetric torus such as tokamaks and spherical tori is extended to the situation where the toroidal flow speed is of the order of the sonic speed, as observed in the National Spherical Torus Experiment [E. J. Synakowski, M. G. Bell, R. E. Bell et al., Nucl. Fusion 43, 1653 (2003)]. It is found that transport fluxes such as the ion radial heat flux and the bootstrap current density are modified by a factor of the order of the square of the toroidal Mach number. The consequences of the orbit squeezing are also presented. The theory is developed for parabolic (in radius r) plasma profiles. A method to apply the results of the theory to transport modeling is discussed.
Metastable Behavior for Bootstrap Percolation on Regular Trees
NASA Astrophysics Data System (ADS)
Biskup, Marek; Schonmann, Roberto H.
2009-08-01
We examine bootstrap percolation on a regular (b+1)-ary tree with initial law given by Bernoulli(p). The sites are updated according to the usual rule: a vacant site becomes occupied if it has at least θ occupied neighbors; occupied sites remain occupied forever. It is known that, when b > θ ≥ 2, the limiting density q = q(p) of occupied sites exhibits a jump at some p_T = p_T(b, θ) ∈ (0,1) from q_T := q(p_T) < 1 to q(p) = 1 when p > p_T. We investigate the metastable behavior associated with this transition. Explicitly, we pick p = p_T + h with h > 0 and show that, as h ↓ 0, the system lingers around the "critical" state for time of order h^{-1/2} and then passes to the fully occupied state in time O(1). The law of the entire configuration observed when the occupation density is q ∈ (q_T, 1) converges, as h ↓ 0, to a well-defined measure.
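The discontinuous jump in the limiting density can be seen numerically from the standard fixed-point recursion for bootstrap percolation on a rooted b-ary tree, a sketch of which follows. This is an illustration under simplified assumptions (rooted tree rather than the (b+1)-ary tree of the abstract, and no metastability analysis); the parameter values b=3, θ=2 are chosen for concreteness.

```python
from math import comb

def occupied_density(p, b=3, theta=2, iters=2000):
    """Fixed-point iteration for bootstrap percolation on a rooted b-ary tree:
    a vacant site becomes occupied once at least theta of its b children are
    occupied; x tracks the probability a subtree root ends up occupied."""
    x = p
    for _ in range(iters):
        # P(Binomial(b, x) >= theta): enough occupied children
        tail = sum(
            comb(b, k) * x**k * (1 - x)**(b - k)
            for k in range(theta, b + 1)
        )
        x = p + (1 - p) * tail
    return x

# Sweep p: the limiting density jumps to 1 at a critical p_T in (0, 1)
for p in (0.05, 0.10, 0.15, 0.20):
    print(p, round(occupied_density(p), 3))
```

Below the critical density the iteration settles on a small fixed point; above it, the only fixed point is full occupation, which is the jump described in the abstract. For b=3, θ=2 the tangency condition of this recursion puts the transition near p = 1/9.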
Transport simulation of EAST long-pulse H-mode discharge with integrated modeling
NASA Astrophysics Data System (ADS)
Wu, M. Q.; Li, G. Q.; Chen, J. L.; Du, H. F.; Gao, X.; Ren, Q. L.; Li, K.; Chan, Vincent; Pan, C. K.; Ding, S. Y.; Jian, X.; Zhu, X.; Lian, H.; Qian, J. P.; Gong, X. Z.; Zang, Q.; Duan, Y. M.; Liu, H. Q.; Lyu, B.
2018-04-01
In the 2017 EAST experimental campaign, a steady-state long-pulse H-mode discharge lasting longer than 100 s was obtained using only radio frequency heating and current drive, with confinement quality slightly better than the standard H-mode, H98y2 ~ 1.1, and stationary peaked electron temperature profiles. Integrated modeling of one long-pulse H-mode discharge in the 2016 EAST experimental campaign has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO under the integrated modeling framework OMFIT. The plasma current is fully noninductively driven with a combination of ~2.2 MW LHW, ~0.3 MW ECH and ~1.1 MW ICRF. The time evolution of the predicted electron and ion temperature profiles from integrated modeling agrees closely with that from measurements. The plasma current (I_p ~ 0.45 MA) and electron density are kept constant. A steady state is achieved in the integrated modeling, with a bootstrap current fraction of ~28% and an RF-driven current fraction of ~72%. The predicted current density profile matches the experimental one well. Analysis shows that electron cyclotron heating (ECH) makes a large contribution to the plasma confinement when heating in the core region, while heating at large radius yields a smaller improvement; a more peaked LHW-driven current profile is also obtained when heating in the core. Linear analysis shows that the high-k instability (electron temperature gradient driven modes) is suppressed in the core region, where weak electron internal transport barriers exist. Trapped electron modes dominate in the low-k region and are mainly responsible for driving the electron energy flux. It is found that the ECH heating effect is very local and is not the main cause of the sustained good confinement; the peaked current density profile has the most important effect on the plasma confinement improvement. Transport analysis of the long-pulse H-mode experiments on EAST will be helpful for planning future experiments.
Impurities in a non-axisymmetric plasma. Transport and effect on bootstrap current
Mollén, A.; Landreman, M.; Smith, H. M.; ...
2015-11-20
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21 (2014) 042503] which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, and which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z_eff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
Consumers limit the abundance and dynamics of a perennial shrub with a seed bank
Kauffman, M.J.; Maron, J.L.
2006-01-01
For nearly 30 years, ecologists have argued that predators of seeds and seedlings seldom have population-level effects on plants with persistent seed banks and density-dependent seedling survival. We parameterized stage-based population models that incorporated density dependence and seed dormancy with data from a 5.5-year experiment that quantified how granivorous mice and herbivorous voles influence bush lupine (Lupinus arboreus) demography. We asked how seed dormancy and density-dependent seedling survival mediate the impacts of these consumers in dune and grassland habitats. In dune habitat, mice reduced analytical λ (the intrinsic rate of population growth) by 39%, the equilibrium number of above-ground plants by 90%, and the seed bank by 98%; voles had minimal effects. In adjacent grasslands, mice had minimal effects, but seedling herbivory by voles reduced analytical λ by 15% and reduced both the equilibrium number of aboveground plants and dormant seeds by 63%. A bootstrap analysis demonstrated that these consumer effects were robust to parameter uncertainty. Our results demonstrate that the quantitative strengths of seed dormancy and density-dependent seedling survival, not their mere existence, critically mediate consumer effects. This study suggests that plant population dynamics and distribution may be more strongly influenced by consumers of seeds and seedlings than is currently recognized. © 2006 by The University of Chicago.
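The stage-based bootstrap analysis can be illustrated schematically. The three-stage matrix below (seed bank, seedling, adult) and all parameter values are hypothetical, not the fitted lupine model, and the density dependence central to the study is omitted; only the idea of bootstrapping parameter uncertainty through λ, the dominant eigenvalue, is shown.

```python
import numpy as np

rng = np.random.default_rng(3)

def growth_rate(seed_surv, germ, seedling_surv, fecund, adult_surv):
    """Analytical lambda: dominant eigenvalue of a hypothetical
    3-stage projection matrix (seed bank, seedling, adult)."""
    A = np.array([
        [seed_surv * (1 - germ), 0.0,           fecund],      # seeds persist; adults add seeds
        [seed_surv * germ,       0.0,           0.0],         # seeds germinate to seedlings
        [0.0,                    seedling_surv, adult_surv],  # seedlings mature; adults survive
    ])
    return np.max(np.abs(np.linalg.eigvals(A)))

# Bootstrap one uncertain parameter: resample hypothetical per-plot
# fecundity estimates and propagate through lambda
fecund_obs = rng.normal(40.0, 8.0, size=30)  # seeds per adult, by plot
lams = [
    growth_rate(0.5, 0.1, 0.3, np.mean(rng.choice(fecund_obs, 30)), 0.8)
    for _ in range(1000)
]
lo, hi = np.percentile(lams, [2.5, 97.5])
print(f"lambda 95% bootstrap CI: [{lo:.2f}, {hi:.2f}]")
```

A consumer effect on λ would be judged robust, in the spirit of the study, if the bootstrap intervals for the with-consumer and consumer-excluded models did not overlap.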
The prospects for magnetohydrodynamic stability in advanced tokamak regimes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manickam, J.; Chance, M.S.; Jardin, S.C.
1994-05-01
Stability analysis of advanced regime tokamaks is presented. Here advanced regimes are defined to include configurations where the ratio of the bootstrap current, I_BS, to the total plasma current, I_p, […]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wingen, Andreas; Ferraro, Nathaniel M.; Shafer, Morgan W.; ...
2015-09-03
Calculations of the plasma response to applied non-axisymmetric fields in several DIII-D discharges show that predicted displacements depend strongly on the edge current density. This result is found using both a linear two-fluid-MHD model (M3D-C1) and a nonlinear ideal-MHD model (VMEC). Furthermore, it is observed that the probability of a discharge being edge localized mode (ELM)-suppressed is most closely related to the edge current density, as opposed to the pressure gradient. It is found that discharges with a stronger kink response are closer to the peeling–ballooning stability limit in ELITE simulations and eventually cross into the unstable region, causing ELMs to reappear. Thus for effective ELM suppression, the RMP has to prevent the plasma from generating a large kink response, associated with ELM instability. Experimental observations are in agreement with the finding; discharges which have a strong kink response in the MHD simulations show ELMs or ELM mitigation during the RMP phase of the experiment, while discharges with a small kink response in the MHD simulations are fully ELM suppressed in the experiment by the applied resonant magnetic perturbation. The results are cross-checked against modeled 3D ideal MHD equilibria using the VMEC code. The procedure of constructing optimal 3D equilibria for diverted H-mode discharges using VMEC is presented. As a result, kink displacements in VMEC are found to scale with the edge current density, similar to M3D-C1, but the displacements are smaller. A direct correlation in the flux surface displacements to the bootstrap current is shown.
The prevalence of terraced treescapes in analyses of phylogenetic data sets.
Dobrin, Barbara H; Zwickl, Derrick J; Sanderson, Michael J
2018-04-04
The pattern of data availability in a phylogenetic data set may lead to the formation of terraces, collections of equally optimal trees. Terraces can arise in tree space if trees are scored with parsimony or with partitioned, edge-unlinked maximum likelihood. Theory predicts that terraces can be large, but their prevalence in contemporary data sets has never been surveyed. We selected 26 data sets and phylogenetic trees reported in recent literature and investigated the terraces to which the trees would belong, under a common set of inference assumptions. We examined terrace size as a function of the sampling properties of the data sets, including taxon coverage density (the proportion of taxon-by-gene positions with any data present) and a measure of gene sampling "sufficiency". We evaluated each data set in relation to the theoretical minimum gene sampling depth needed to reduce terrace size to a single tree, and explored the impact of terraces encountered among replicate trees in bootstrap resampling. Terraces were identified in nearly all data sets with taxon coverage densities < 0.90. They were not found, however, in high-coverage-density (i.e., ≥ 0.94) transcriptomic and genomic data sets. The terraces could be very large, and size varied inversely with taxon coverage density and with gene sampling sufficiency. Few data sets achieved a theoretical minimum gene sampling depth needed to reduce terrace size to a single tree. Terraces found during bootstrap resampling reduced overall support. If certain inference assumptions apply, trees estimated from empirical data sets often belong to large terraces of equally optimal trees. Terrace size correlates to data set sampling properties. Data sets seldom include enough genes to reduce terrace size to one tree. When bootstrap replicate trees lie on a terrace, statistical support for phylogenetic hypotheses may be reduced.
Although some of the published analyses surveyed were conducted with edge-linked inference models (which do not induce terraces), unlinked models have been used and advocated. The present study describes the potential impact of that inference assumption on phylogenetic inference in the context of the kinds of multigene data sets now widely assembled for large-scale tree construction.
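The taxon coverage density used above is simply the filled fraction of the taxon-by-gene matrix. A minimal illustration (the matrix values below are invented):

```python
import numpy as np

# Hypothetical taxon-by-gene presence matrix: rows are taxa, columns are
# genes; 1 = some sequence data present for that pair, 0 = missing.
M = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
])

# Coverage density = proportion of taxon-by-gene positions with any data.
coverage_density = M.mean()
print(round(float(coverage_density), 3))  # -> 0.75 (9 filled of 12 cells)
```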
Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling
2014-06-03
Background: Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods: We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results: In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions: The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
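The Bayesian interpretation of the bootstrap invoked above can be sketched with Rubin-style Dirichlet reweighting of per-patient cost and effect outcomes. The data, sample sizes, and willingness-to-pay threshold below are invented for illustration and are not from the trial; incorporating external evidence would add further reweighting on top of this baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (synthetic) per-patient outcomes for two trial arms.
cost_trt = rng.gamma(shape=2.0, scale=5000.0, size=120)
eff_trt = rng.normal(0.70, 0.10, size=120)
cost_ctl = rng.gamma(shape=2.0, scale=4000.0, size=120)
eff_ctl = rng.normal(0.62, 0.10, size=120)

def bayesian_bootstrap_mean(x, n_draws, rng):
    """Posterior draws of the mean of x under the Bayesian bootstrap:
    each draw reweights the observations with Dirichlet(1,...,1) weights."""
    w = rng.dirichlet(np.ones(len(x)), size=n_draws)  # shape (n_draws, n)
    return w @ x                                      # weighted means

n_draws = 2000
d_cost = (bayesian_bootstrap_mean(cost_trt, n_draws, rng)
          - bayesian_bootstrap_mean(cost_ctl, n_draws, rng))
d_eff = (bayesian_bootstrap_mean(eff_trt, n_draws, rng)
         - bayesian_bootstrap_mean(eff_ctl, n_draws, rng))

# Probability the treatment is cost-effective at a willingness-to-pay
# threshold (one point on the cost-effectiveness acceptability curve).
wtp = 50000.0
p_ce = np.mean(wtp * d_eff - d_cost > 0)
print(round(float(p_ce), 2))
```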
The Inverse Bagging Algorithm: Anomaly Detection by Inverse Bootstrap Aggregating
NASA Astrophysics Data System (ADS)
Vischia, Pietro; Dorigo, Tommaso
2017-03-01
For data sets populated by a very well modeled process and by another process of unknown probability density function (PDF), a desirable feature when manipulating the fraction of the unknown process (either enhancing or suppressing it) is to avoid modifying the kinematic distributions of the well modeled one. A bootstrap technique is used to identify sub-samples rich in the well modeled process, and to classify each event according to the frequency with which it appears in such sub-samples. Comparisons with general MVA algorithms are shown, as well as a study of the asymptotic properties of the method, using a public-domain data set that models a typical search for new physics as performed at hadronic colliders such as the Large Hadron Collider (LHC).
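The sub-sampling idea described above can be sketched on invented one-dimensional toy data. The quartile threshold for flagging "background-rich" sub-samples and the mean-based score are illustrative choices, not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D data: a well-modeled "background" (standard normal) contaminated
# by an unknown "signal" shifted to the right. Labels are for checking only.
n_bkg, n_sig = 800, 200
x = np.concatenate([rng.normal(0.0, 1.0, n_bkg), rng.normal(2.5, 1.0, n_sig)])
n = len(x)

# Draw bootstrap sub-samples and score each by its mean: sub-samples that
# happen to be rich in background have systematically lower means here.
n_boot, frac = 500, 0.3
samples = [rng.choice(n, size=int(frac * n), replace=True) for _ in range(n_boot)]
means = np.array([x[s].mean() for s in samples])
bkg_rich = means <= np.quantile(means, 0.25)  # lowest-mean quartile

# Classify each event by how often it appears in background-rich sub-samples.
member = np.zeros(n)
counts = np.zeros(n)
for s, rich in zip(samples, bkg_rich):
    u = np.unique(s)
    counts[u] += 1
    if rich:
        member[u] += 1
score = member / np.maximum(counts, 1)  # background-likeness per event

# Background events should, on average, score higher than signal events.
print(score[:n_bkg].mean() > score[n_bkg:].mean())
```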
PROGRESS IN THE PEELING-BALLOONING MODEL OF ELMS: TOROIDAL ROTATION AND 3D NONLINEAR DYNAMICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
SNYDER,P.B; WILSON,H.R; XU,X.Q
2004-06-01
Understanding the physics of the H-Mode pedestal and edge localized modes (ELMs) is very important to next-step fusion devices for two primary reasons: (1) The pressure at the top of the edge barrier (''pedestal height'') strongly impacts global confinement and fusion performance, and (2) large ELMs lead to localized transient heat loads on material surfaces that may constrain component lifetimes. The development of the peeling-ballooning model has shed light on these issues by positing a mechanism for ELM onset and constraints on the pedestal height. The mechanism involves instability of ideal coupled ''peeling-ballooning'' modes driven by the sharp pressure gradient and consequent large bootstrap current in the H-mode edge. It was first investigated in the local, high-n limit [1], and later quantified for non-local, finite-n modes in general toroidal geometry [2,3]. Important aspects are that a range of wavelengths may potentially be unstable, with intermediate n's (n ≈ 3-30) generally limiting in high performance regimes, and that stability bounds are strongly sensitive to shape [Fig. 1(a)], and to collisionality (i.e. temperature and density) [4] through the bootstrap current. The development of efficient MHD stability codes such as ELITE [3,2] and MISHKA [5] has allowed detailed quantification of peeling-ballooning stability bounds (e.g. [6]) and extensive and largely successful comparisons with observation (e.g. [2,6-9]). These previous calculations are ideal, static, and linear. Here we extend this work to incorporate the impact of sheared toroidal rotation, and the non-ideal, nonlinear dynamics which must be studied to quantify ELM size and heat deposition on material surfaces.
Bootstrap percolation on spatial networks
NASA Astrophysics Data System (ADS)
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of some networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links' lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, a mix of a second-order phase transition and a hybrid phase transition with two varying critical points; otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value -1 is equal to or very close to the values measured in many real online social networks, including LiveJournal, the HP Labs email network, the Belgian mobile phone network, etc. This work helps us better understand the self-organization of spatial structure of online social networks, in terms of the effective function for information spreading.
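The activation rule underlying bootstrap percolation (a node activates once at least k of its neighbours are active, starting from a seed set) can be sketched as follows; the graph and threshold are a toy example, not the spatial networks studied above:

```python
def bootstrap_percolation(adj, seeds, k):
    """Activate the seed nodes, then repeatedly activate any node with at
    least k active neighbours, until no further node can activate."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neigh in adj.items():
            if node not in active and sum(n in active for n in neigh) >= k:
                active.add(node)
                changed = True
    return active

# Toy example: a 6-cycle with one chord (0-3), activation threshold k = 2.
adj = {0: [1, 5, 3], 1: [0, 2], 2: [1, 3], 3: [2, 4, 0], 4: [3, 5], 5: [4, 0]}
print(sorted(bootstrap_percolation(adj, seeds={0, 2}, k=2)))  # -> [0, 1, 2, 3]
```

Node 4 and node 5 each end up with only one active neighbour, so the cascade stalls; whether activation spans the network depends on the seed set, threshold, and link structure, which is the object of study in the abstract above.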
Causality constraints in conformal field theory
Hartman, Thomas; Jain, Sachin; Kundu, Sandipan
2016-05-17
Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d-dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well known sign constraint on the φ⁴ coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.
Conformal Bootstrap in Mellin Space
NASA Astrophysics Data System (ADS)
Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda
2017-02-01
We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.
Current/Pressure Profile Effects on Tearing Mode Stability in DIII-D Hybrid Discharges
NASA Astrophysics Data System (ADS)
Kim, K.; Park, J. M.; Murakami, M.; La Haye, R. J.; Na, Yong-Su
2015-11-01
It is important to understand the onset threshold and the evolution of tearing modes (TMs) for developing a high-performance steady state fusion reactor. As initial and basic comparisons to determine TM onset, the measured plasma profiles (such as temperature, density, rotation) were compared with the calculated current profiles between a pair of discharges with/without an n=1 mode, based on the database for DIII-D hybrid plasmas. The profiles were not much different, but the details were analyzed to determine their characteristics, especially near the rational surface. The tearing stability index Δ′, calculated from PEST3, tends to increase rapidly just before the n=1 mode onset for these cases. Modeled equilibria with parametrically varied pressure or current profiles, based on the reference discharge, are reconstructed to check the dependence of the onset on Δ′ or on neoclassical effects such as the bootstrap current. Simulations of TMs with the modeled equilibria using resistive MHD codes will also be presented and compared with experiments to assess the sensitivity for predicting TM onset. Work supported by US DOE under DE-FC02-04ER54698 and DE-AC52-07NA27344.
High-beta, steady-state hybrid scenario on DIII-D
Petty, C. C.; Kinsey, J. E.; Holcomb, C. T.; ...
2015-12-17
Here, the potential of the hybrid scenario (first developed as an advanced inductive scenario for high fluence) as a regime for high-beta, steady-state plasmas is demonstrated on the DIII-D tokamak. These experiments show that the beneficial characteristics of hybrids, namely safety factor ≥1 with low central magnetic shear, high stability limits and excellent confinement, are maintained when strong central current drive (electron cyclotron and neutral beam) is applied to increase the calculated non-inductive fraction to ≈100% (≈50% bootstrap current). The best discharges achieve normalized beta of 3.4, IPB98(y,2) confinement factor of 1.4, surface loop voltage of 0.01 V, and nearly equal electron and ion temperatures at low collisionality. A zero-dimensional physics model shows that steady-state hybrid operation with Q_fus ~ 5 is feasible in FDF and ITER. The advantage of the hybrid scenario as an Advanced Tokamak regime is that the external current drive can be deposited near the plasma axis where the efficiency is high; additionally, good alignment between the current drive and plasma current profiles is not necessary as the poloidal magnetic flux pumping self-organizes the current density profile in hybrids with an m/n=3/2 tearing mode.
Richard-Davis, Gloria; Whittemore, Brianna; Disher, Anthony; Rice, Valerie Montgomery; Lenin, Rathinasamy B; Dollins, Camille; Siegel, Eric R; Eswaran, Hari
2018-01-01
Objective: Increased mammographic breast density is a well-established risk factor for breast cancer development, regardless of age or ethnic background. The current gold standard for categorizing breast density consists of a radiologist estimation of percent density according to the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) criteria. This study compares paired qualitative interpretations of breast density on digital mammograms with quantitative measurement of density using Hologic’s Food and Drug Administration–approved R2 Quantra volumetric breast density assessment tool. Our goal was to find the best cutoff value of Quantra-calculated breast density for stratifying patients accurately into high-risk and low-risk breast density categories. Methods: Screening digital mammograms from 385 subjects, aged 18 to 64 years, were evaluated. These mammograms were interpreted by a radiologist using the ACR’s BI-RADS density method, and had quantitative density measured using the R2 Quantra breast density assessment tool. The appropriate cutoff for breast density–based risk stratification using Quantra software was calculated using manually determined BI-RADS scores as a gold standard, in which scores of D3/D4 denoted high-risk densities and D1/D2 denoted low-risk densities. Results: The best cutoff value for risk stratification using Quantra-calculated breast density was found to be 14.0%, yielding a sensitivity of 65%, specificity of 77%, and positive and negative predictive values of 75% and 69%, respectively. Under bootstrap analysis, the best cutoff value had a mean ± SD of 13.70% ± 0.89%. Conclusions: Our study is the first to publish on a North American population that assesses the accuracy of the R2 Quantra system at breast density stratification. 
Quantitative breast density measures will improve accuracy and reliability of density determination, assisting future researchers to accurately calculate breast cancer risks associated with density increase. PMID:29511356
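The bootstrap analysis of the optimal cutoff reported above can be sketched as follows, using invented density data and a Youden-index cutoff rule as a stand-in for the study's actual procedure (the real analysis used BI-RADS reads as the gold standard):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: quantitative breast density (%) and a binary
# high-risk label from a qualitative read. Not the study's real data.
density = np.concatenate([rng.normal(10, 3, 200), rng.normal(18, 4, 185)])
high_risk = np.concatenate([np.zeros(200), np.ones(185)]).astype(bool)

def youden_cutoff(x, y):
    """Cutoff maximizing sensitivity + specificity - 1 over observed values."""
    best_c, best_j = None, -1.0
    for c in np.unique(x):
        sens = np.mean(x[y] >= c)    # high-risk cases at or above cutoff
        spec = np.mean(x[~y] < c)    # low-risk cases below cutoff
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c

# Bootstrap: resample patients with replacement, recompute the cutoff each
# time, and summarize the cutoff's sampling distribution as mean +/- SD.
n = len(density)
cuts = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    cuts.append(youden_cutoff(density[idx], high_risk[idx]))
cuts = np.asarray(cuts)
print(f"cutoff = {cuts.mean():.2f}% +/- {cuts.std():.2f}%")
```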
Xu, X. Q.; Ma, J. F.; Li, G. Q.
2014-12-29
The latest BOUT++ studies show an emerging understanding of the dynamics of edge localized mode (ELM) crashes and a collisionality scaling of ELM energy losses consistent with the world multi-tokamak database. A series of BOUT++ simulations are conducted to investigate the scaling characteristics of the ELM energy losses vs collisionality via a density scan. Moreover, the linear results demonstrate that as the pedestal collisionality decreases, the growth rate of the peeling-ballooning modes decreases for high n but increases for low n (1 < n < 5); therefore the width of the growth rate spectrum γ(n) becomes narrower and the peak growth shifts to lower n. Nonlinear BOUT++ simulations show a two-stage process of ELM crash evolution: (i) initial bursts of pressure blob and void creation and (ii) inward void propagation. The inward void propagation stirs the top of pedestal plasma and yields an increasing ELM size with decreasing collisionality after a series of micro-bursts. The pedestal plasma density plays a major role in determining the ELM energy loss through its effect on the edge bootstrap current and ion diamagnetic stabilization. Finally, the critical trend emerges as a transition (1) linearly from ballooning-dominated states at high collisionality to peeling-dominated states at low collisionality with decreasing density and (2) nonlinearly from turbulence spreading dynamics at high collisionality into avalanche-like dynamics at low collisionality.
Transport Barriers in Bootstrap Driven Tokamaks
NASA Astrophysics Data System (ADS)
Staebler, Gary
2017-10-01
Maximizing the bootstrap current in a tokamak, so that it drives a high fraction of the total current, reduces the external power required to drive current by other means. Improved energy confinement, relative to empirical scaling laws, enables a reactor to more fully take advantage of the bootstrap driven tokamak. Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is due to the suppression of turbulence primarily due to the large Shafranov shift. ExB velocity shear does not play a significant role in the transport barrier due to the high safety factor. It will be shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift driven barrier formation. The ion energy transport is reduced to neoclassical, and electron energy and particle transport is reduced, but still turbulent, within the barrier. Deeper into the plasma, very large levels of electron transport are observed. The observed electron temperature profile is shown to be close to the threshold for the electron temperature gradient (ETG) mode. A large ETG driven energy transport is qualitatively consistent with recent multi-scale gyrokinetic simulations showing that reducing the ion scale turbulence can lead to a large increase in the electron scale transport. A new saturation model for the quasilinear TGLF transport code, that fits these multi-scale gyrokinetic simulations, can match the data if the impact of zonal flow mixing on the ETG modes is reduced at high safety factor. This work was supported by the U.S. Department of Energy under DE-FG02-95ER54309 and DE-FC02-04ER54698.
Garofalo, Andrea M.; Gong, Xianzu; Grierson, Brian A.; ...
2015-11-16
Recent EAST/DIII-D joint experiments on the high poloidal beta tokamak regime in DIII-D have demonstrated fully noninductive operation with an internal transport barrier (ITB) at large minor radius, at normalized fusion performance increased by ≥30% relative to earlier work. The advancement was enabled by improved understanding of the "relaxation oscillations", previously attributed to repetitive ITB collapses, and of the fast ion behavior in this regime. It was found that the "relaxation oscillations" are coupled core-edge modes amenable to wall-stabilization, and that fast ion losses, which previously dictated a large plasma-wall separation to avoid wall over-heating, can be reduced to classical levels with sufficient plasma density. By using optimized waveforms of the plasma-wall separation and plasma density, fully noninductive plasmas have been sustained for long durations with excellent energy confinement quality, bootstrap fraction ≥ 80%, βN ≤ 4, βP ≥ 3, and βT ≥ 2%. Finally, these results bolster the applicability of the high poloidal beta tokamak regime toward the realization of a steady-state fusion reactor.
Integrated Scenario Modeling of NSTX Advanced Plasma Configurations
NASA Astrophysics Data System (ADS)
Kessel, Charles; Synakowski, Edward
2003-10-01
The Spherical Torus will provide an attractive fusion energy source if it can demonstrate the following major features: high elongation and triangularity, 100% non-inductive current with a credible path to high bootstrap fractions, non-solenoidal startup and current rampup, high beta with stabilization of RWM instabilities, and sufficiently high energy confinement. NSTX has specific experimental milestones to examine these features, and integrated scenario modeling is helping to understand how these configurations might be produced and what tools are needed to access this operating space. Simulations with the Tokamak Simulation Code (TSC), CURRAY, and JSOLVER/BALMSC/PEST2 have identified fully non-inductively sustained, high beta plasmas that rely on strong plasma shaping accomplished with a PF coil modification, off-axis current drive from Electron Bernstein Waves (EBW), flexible on-axis heating and CD from High Harmonic Fast Wave (HHFW) and Neutral Beam Injection (NBI), and density control. Ideal MHD stability shows that with wall stabilization through plasma rotation and/or RWM feedback coils, a beta of 40% is achievable, with 100% non-inductive current sustained for 4 current diffusion times. Experimental data and theory are combined to produce a best extrapolation to these regimes, which is continuously improved as the discharges approach these parameters, and theoretical/computational methods expand. Further investigations and development for integrated scenario modeling on NSTX is discussed.
Limitations of bootstrap current models
Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...
2014-03-27
We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans as well as core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling, an approximation inherent to both analytic models, is quantified. Moreover, the implications of the corrections from kinetic NEO simulations on MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.
Transport barriers in bootstrap-driven tokamaks
NASA Astrophysics Data System (ADS)
Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.
2018-05-01
Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.
Heating and current drive requirements towards steady state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, Francesca; Kessel, Charles; Bonoli, Paul; Batchelor, Donald; Harvey, Bob
2013-10-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) to reach adequate fusion gain at typical currents of 9 MA. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of external sources that maintain weakly reversed shear profiles and ρ(qmin) ≥ 0.5 are the focus of this work. Simulations indicate that, with a trade-off of the EC equatorial and upper launcher, the formation and sustainment of ITBs could be demonstrated with the baseline configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current (6.2 MA) are below the target. Upgrades of the heating and current drive system, like the use of Lower Hybrid current drive, could overcome these limitations. With 30 MW of coupled LH in the flattop and operating at the Greenwald density, plasmas can sustain ~9 MA and achieve Q ~ 4. Work supported by the US Department of Energy under DE-AC02-CH0911466.
Jiang, Wenyu; Simon, Richard
2007-12-20
This paper first provides a critical review of some existing methods for estimating the prediction error in classifying microarray data where the number of genes greatly exceeds the number of specimens. Special attention is given to the bootstrap-related methods. When the sample size n is small, we find that all the reviewed methods suffer from either substantial bias or variability. We introduce a repeated leave-one-out bootstrap (RLOOB) method that predicts for each specimen in the sample using bootstrap learning sets of size ln. We then propose an adjusted bootstrap (ABS) method that fits a learning curve to the RLOOB estimates calculated with different bootstrap learning set sizes. The ABS method is robust across the situations we investigate and provides a slightly conservative estimate for the prediction error. Even with small samples, it does not suffer from the large upward bias of the leave-one-out bootstrap and the 0.632+ bootstrap, nor from the large variability of leave-one-out cross-validation in microarray applications. Copyright (c) 2007 John Wiley & Sons, Ltd.
Theodoratou, Evropi; Farrington, Susan M; Tenesa, Albert; McNeill, Geraldine; Cetnarskyj, Roseanne; Korakakis, Emmanouil; Din, Farhat V N; Porteous, Mary E; Dunlop, Malcolm G; Campbell, Harry
2014-01-01
Colorectal cancer (CRC) accounts for 9.7% of all cancer cases and for 8% of all cancer-related deaths. Established risk factors include personal or family history of CRC as well as lifestyle and dietary factors. We investigated the relationship between CRC and demographic, lifestyle, food and nutrient risk factors through a case-control study that included 2062 patients and 2776 controls from Scotland. Forward and backward stepwise regression was applied and the stability of the models was assessed in 1000 bootstrap samples. The variables that were automatically selected to be included by the forward or backward stepwise regression and whose selection was verified by bootstrap sampling in the current study were family history, dietary energy, 'high-energy snack foods', eggs, juice, sugar-sweetened beverages and white fish (associated with an increased CRC risk) and NSAIDs, coffee and magnesium (associated with a decreased CRC risk). Application of forward and backward stepwise regression in this CRC study identified some already established as well as some novel potential risk factors. Bootstrap findings suggest that examination of the stability of regression models by bootstrap sampling is useful in the interpretation of study findings. 'High-energy snack foods' and high-energy drinks (including sugar-sweetened beverages and fruit juices) as risk factors for CRC have not been reported previously and merit further investigation as such snacks and beverages are important contributors in European and North American diets.
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
2003-06-01
however, and the current "second generation" of microkernel implementations has resulted in significantly better performance. Of note is the L4 microkernel.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
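The subspace argument above is concrete enough to sketch. The following is a minimal NumPy illustration of the idea, not the authors' implementation, and the dimensions and data are made up: one thin SVD of the p × n data matrix is computed once, and every bootstrap decomposition is then carried out on the small n × n coordinate matrix, so no p-dimensional bootstrap component is ever formed.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, B = 500, 40, 200                   # many measurements, few subjects
X = rng.normal(size=(p, n))
X -= X.mean(axis=1, keepdims=True)       # centre each measurement

# One thin SVD of the full data: X = U @ diag(d) @ Vt, with U of shape (p, n).
U, d, Vt = np.linalg.svd(X, full_matrices=False)
S = np.diag(d) @ Vt                      # n x n coordinates of X in span(U)

lead_sv = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)     # resample subjects (columns)
    Sb = S[:, idx]
    Sb = Sb - Sb.mean(axis=1, keepdims=True)
    # SVD of the small matrix; the p-dimensional bootstrap components would
    # be U @ Ub, but only the low-dimensional pieces are ever stored.
    Ub, db, _ = np.linalg.svd(Sb, full_matrices=False)
    lead_sv[b] = db[0]

# Bootstrap standard error of the leading singular value.
se = lead_sv.std(ddof=1)
print(se)
```

Each bootstrap step costs an n × n SVD instead of a p × n one, which is the source of the laptop-scale running times reported in the abstract.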
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson’s sample correlation coefficient, and compared its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling. PMID:26125965
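The multinomial formulation described above can be sketched in a few lines of NumPy (a hedged illustration of the formulation, not the authors' R code; the data are simulated): each row of a multinomial count matrix plays the role of one resample, and all replications of a moment statistic follow from a single matrix product.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)                  # observed sample
n, B = x.size, 2000                      # sample size, bootstrap replications

# One multinomial draw per bootstrap replication: row b counts how many times
# each observation would appear in resample b.
counts = rng.multinomial(n, np.full(n, 1.0 / n), size=B)   # shape (B, n)

# All B bootstrap replications of the sample mean from one matrix product,
# with no explicit resampled data sets ever constructed.
boot_means = counts @ x / n
print(boot_means.std(ddof=1))            # bootstrap SE of the mean
```

Higher moments work the same way, e.g. `counts @ (x**2) / n` gives all bootstrap replications of the second sample moment from the same weight matrix.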
NASA Astrophysics Data System (ADS)
Pankin, A. Y.; Rafiq, T.; Kritz, A. H.; Park, G. Y.; Snyder, P. B.; Chang, C. S.
2017-06-01
The effects of plasma shaping on the H-mode pedestal structure are investigated. High fidelity kinetic simulations of the neoclassical pedestal dynamics are combined with the magnetohydrodynamic (MHD) stability conditions for triggering edge localized mode (ELM) instabilities that limit the pedestal width and height in H-mode plasmas. The neoclassical kinetic XGC0 code [Chang et al., Phys. Plasmas 11, 2649 (2004)] is used in carrying out a scan over plasma elongation and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD ELITE code [Snyder et al., Phys. Plasmas 9, 2037 (2002)]. Simulations with the XGC0 code, which includes coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. The differences in the predicted H-mode pedestal width and height for the DIII-D discharges with different elongation and triangularities are discussed. For the discharges with higher elongation, it is found that the gradients of the plasma profiles in the H-mode pedestal reach semi-steady states. In these simulations, the pedestal slowly continues to evolve to higher pedestal pressures and bootstrap currents until the peeling-ballooning stability conditions are satisfied. The discharges with lower elongation do not reach the semi-steady state, and ELM crashes are triggered at earlier times. The plasma elongation is found to have a stronger stabilizing effect than the plasma triangularity. For the discharges with lower elongation and lower triangularity, the ELM frequency is large, and the H-mode pedestal evolves rapidly. It is found that the temperature of neutrals in the scrape-off-layer (SOL) region can affect the dynamics of the H-mode pedestal buildup. However, the final pedestal profiles are nearly independent of the neutral temperature. The elongation and triangularity affect the pedestal widths of plasma density and electron temperature profiles differently. 
This provides a new mechanism of controlling the pedestal bootstrap current and the pedestal stability.
Bootstrapping the O(N) archipelago
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kos, Filip; Poland, David; Simmons-Duffin, David
2015-11-17
We study 3d CFTs with an O(N) global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension O(N) vector Φ_i and the lowest dimension O(N) singlet s, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions (Δ_Φ, Δ_s) to lie inside small islands. Here, we also make rigorous determinations of current two-point functions in the O(2) and O(3) models, with applications to transport in condensed matter systems.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
Tests of Independence for Ordinal Data Using Bootstrap.
ERIC Educational Resources Information Center
Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai
1998-01-01
Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)
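A generic bootstrap test of independence for a two-way table can be sketched as follows. This is a minimal illustration of the idea, not the specific ordinal-data tests proposed in the paper, and the table is hypothetical: resample whole tables from the product of the observed marginals (the independence null) and compare the Pearson statistic against its null bootstrap distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

def chi2_stat(table):
    """Pearson chi-square statistic for a two-way contingency table."""
    total = table.sum()
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / total
    return ((table - expected) ** 2 / expected).sum()

# Hypothetical 3x3 table of ordinal categories, concentrated on the diagonal.
obs = np.array([[20, 10, 5],
                [10, 15, 10],
                [5, 10, 20]])
n = obs.sum()
stat_obs = chi2_stat(obs)

# Bootstrap under the independence null: cell probabilities are the product
# of the observed row and column marginal proportions.
p_null = np.outer(obs.sum(axis=1), obs.sum(axis=0)).ravel() / n**2
B = 2000
boot = np.array([chi2_stat(rng.multinomial(n, p_null).reshape(obs.shape))
                 for _ in range(B)])
p_value = (boot >= stat_obs).mean()
print(p_value)                           # small: the diagonal pattern is real
```

Because the reference distribution is generated from the data rather than taken from the asymptotic chi-square, this kind of test can keep better control of Type I error in small tables, which is the property the Monte Carlo comparison above examines.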
NASA Astrophysics Data System (ADS)
Aschonitis, Vassilis; Diamantopoulou, Maria; Papamichail, Dimitris
2018-05-01
The aim of the study is to propose new modeling approaches for daily estimation of the crop coefficient Kc for flooded rice (Oryza sativa L., ssp. indica) under various plant densities. Non-linear regression (NLR) and artificial neural networks (ANN) were used to predict Kc based on leaf area index LAI, crop height, wind speed, water albedo, and ponding water depth. Two years of evapotranspiration ETc measurements from lysimeters located in a Mediterranean environment were used in this study. The NLR approach combines bootstrapping and Bayesian sensitivity analysis based on a semi-empirical formula. This approach provided significant information about the hidden role of the same predictor variables in the Levenberg-Marquardt ANN approach, which improved Kc predictions. Relationships of production versus ETc were also built and verified by data obtained from Australia. The results of the study showed that the daily Kc values, under extremely high plant densities (e.g., for LAImax > 10), can reach extremely high values (Kc > 3) during the reproductive stage. Justifications given in the discussion question both the Kc values given by FAO and the energy budget approaches, which assume that ETc cannot exceed a specific threshold defined by the net radiation. These approaches can no longer explain the continuous increase of global rice yields (currently more than double those of the 1960s) due to the improvement of cultivars and agricultural intensification. The study suggests that the safest method to verify predefined or modeled Kc values is through preconstructed relationships of production versus ETc using field measurements.
ERIC Educational Resources Information Center
Fan, Xitao
This paper empirically and systematically assessed the performance of bootstrap resampling procedure as it was applied to a regression model. Parameter estimates from Monte Carlo experiments (repeated sampling from population) and bootstrap experiments (repeated resampling from one original bootstrap sample) were generated and compared. Sample…
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
As result replicability is essential to science and difficult to achieve through external replicability, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
ERIC Educational Resources Information Center
Nevitt, Jonathan; Hancock, Gregory R.
2001-01-01
Evaluated the bootstrap method under varying conditions of nonnormality, sample size, model specification, and number of bootstrap samples drawn from the resampling space. Results for the bootstrap suggest the resampling-based method may be conservative in its control over model rejections, thus having an impact on the statistical power associated…
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
2017-02-01
scale blade servers (Dell PowerEdge) [20]. It must be recognized however, that the findings are distributed over this collection of architectures not...current operating system designs run into millions of lines of code. Moreover, they compound the opportunity for compromise by granting device drivers...properties (e.g. IP & MAC address) so as to invalidate an adversary’s surveillance data. The current running and bootstrapping instances of the micro
Quasi-Axially Symmetric Stellarators with 3 Field Periods
NASA Astrophysics Data System (ADS)
Garabedian, Paul; Ku, Long-Poe
1998-11-01
Compact hybrid configurations with 2 field periods have been studied recently as candidates for a proof of principle experiment at PPPL, cf. A. Reiman et al., Physics design of a high beta quasi-axially symmetric stellarator, J. Plas. Fus. Res. SERIES 1, 429 (1998). This enterprise has led us to the discovery of a family of quasi-axially symmetric stellarators with 3 field periods that seem to have significant advantages, although their aspect ratios are a little larger. They have reversed shear and perform better in a local analysis of ballooning modes. Nonlinear equilibrium and stability calculations predict that the average beta limit may be as high as 6% if the bootstrap current turns out to be as big as that expected in comparable tokamaks. The concept relies on a combination of helical fields and bootstrap current to achieve adequate rotational transform at low aspect ratio. A detailed manuscript describing some of this work will be published soon, cf. P.R. Garabedian, Quasi-axially symmetric stellarators, Proc. Natl. Acad. Sci. USA 95 (1998).
Advanced tokamak investigations in full-tungsten ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Bock, A.; Doerk, H.; Fischer, R.; Rittich, D.; Stober, J.; Burckhart, A.; Fable, E.; Geiger, B.; Mlynek, A.; Reich, M.; Zohm, H.; ASDEX Upgrade Team
2018-05-01
The appropriate tailoring of the q-profile is the key to accessing Advanced Tokamak (AT) scenarios, which are of great benefit to future all-metal fusion power plants. Such scenarios depend on low collisionality ν*, which permits efficient external current drive and high amounts of intrinsic bootstrap current. At constant pressure, lowering the electron density ne leads to a strong decrease in the collisionality with increasing electron temperature, ν* ~ Te^-3. Simultaneously, the conditions for low ne also favour impurity accumulation. This paper reports on how radiative collapses due to central W accumulation were overcome by improved understanding of the changes to recycling and pumping, substantially expanded ECRH capacities for both heating and current drive, and a new solid W divertor capable of withstanding the power loads at low ne. Furthermore, it reports on various improvements to the reliability of the q-profile reconstruction. A candidate steady state scenario for ITER/DEMO (q95 = 5.3, βN = 2.7, fbs > 40%) is presented. The ion temperature profiles are steeper than predicted by TGLF, but nonlinear electromagnetic gyro-kinetic analyses with GENE including fast particle effects matched the experimental heat fluxes. A fully non-inductive scenario at higher q95 = 7.1 for current drive model validation is also discussed. The results show that non-inductive operation is principally compatible with full-metal machines.
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Enhancement of Edge Stability with Lithium Wall Coatings in NSTX
NASA Astrophysics Data System (ADS)
Maingi, R.; Bell, R. E.; Leblanc, B. P.; Kaita, R.; Kaye, S. M.; Kugel, H. W.; Mansfield, D. K.; Osborne, T. H.
2008-11-01
ELM reduction or elimination while maintaining high confinement is essential for ITER, which has been designed for H-mode operation. Large ELMs are thought to be triggered by exceeding either edge current density and/or pressure gradient limits (peeling, ballooning modes). Stability calculations show that spherical tori should have access to higher pressure gradients and pedestal heights than higher R/a tokamaks, owing to access to second stability regimes [1]. An ELM-free regime was recently observed in NSTX following the application of lithium onto the graphite plasma facing components [2]. ELMs were eliminated in phases [3], with the resulting pressure gradients and pedestal widths increasing substantially. Calculations with TRANSP have shown that the edge bootstrap current increased substantially, consistent with second stability access. These ELM-free discharges have a substantial improvement in energy confinement, up to the global βN ~ 5.5 limit. * Supported by US DOE DE-FG02-04ER54520, DE-AC-76CH03073, and DE-FC02-04ER54698. [1] P. B. Snyder et al., Plasma Phys. Contr. Fusion 46 (2004) A131. [2] H. W. Kugel et al., Phys. Plasmas 15 (2008) 056118. [3] D. M. Mansfield et al., J. Nucl. Materials (2009) submitted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solomon, W. M., E-mail: solomon@fusion.gat.com; Bortolon, A.; Grierson, B. A.
A new high pedestal regime (“Super H-mode”) has been predicted and accessed on DIII-D. Super H-mode was first achieved on DIII-D using a quiescent H-mode edge, enabling a smooth trajectory through pedestal parameter space. By exploiting Super H-mode, it has been possible to access high pedestal pressures at high normalized densities. While elimination of Edge localized modes (ELMs) is beneficial for Super H-mode, it may not be a requirement, as recent experiments have maintained high pedestals with ELMs triggered by lithium granule injection. Simulations using TGLF for core transport and the EPED model for the pedestal find that ITER can benefit from the improved performance associated with Super H-mode, with increased values of fusion power and gain possible. Similar studies demonstrate that the Super H-mode pedestal can be advantageous for a steady-state power plant, by providing a path to increasing the bootstrap current while simultaneously reducing the demands on the core physics performance.
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
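The subset idea can be illustrated directly. The sketch below uses synthetic data, and the quantile-level rescaling is an assumed way to map the overall tail level onto the retained subset, not necessarily the authors' exact scheme: because an upper quantile depends only on the largest order statistics, resampling just the retained tail reproduces the full-sample bootstrap at a fraction of the cost.

```python
import numpy as np

rng = np.random.default_rng(3)
n, B = 100_000, 300
x = np.sort(rng.gumbel(size=n))          # synthetic long record
q = 0.999                                # tail quantile of interest

# Full-sample bootstrap of the q-quantile (expensive: each resample is size n).
full = np.array([np.quantile(rng.choice(x, n), q) for _ in range(B)])

# Subset bootstrap: keep only the top 1% of the sorted record and resample
# there, with the quantile level rescaled to the retained tail.
q0 = 0.99
top = x[int(q0 * n):]
qq = (q - q0) / (1 - q0)                 # 0.999 overall -> 0.9 within the tail
sub = np.array([np.quantile(rng.choice(top, top.size), qq) for _ in range(B)])

# The two bootstrap distributions of the tail quantile nearly coincide,
# while the subset version resamples 100x fewer values per replication.
print(full.mean(), sub.mean())
```

The 100-fold reduction in resample size is what makes the approach attractive for the decades-long gridded model records mentioned in the abstract.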
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
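The small-sample comparison drawn above can be reproduced in a few lines (a hedged sketch with made-up data): for a small, skewed sample, compute both the nonparametric bootstrap percentile interval and the classical t interval for the mean, and note how the two disagree.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.exponential(size=15)             # small, skewed sample
B = 4000

# Percentile interval: plug-in quantiles of the bootstrap distribution.
boot = np.array([rng.choice(x, x.size).mean() for _ in range(B)])
perc = np.quantile(boot, [0.025, 0.975])

# Classical t interval for the mean (t critical value for df = 14).
t14 = 2.1448
se = x.std(ddof=1) / np.sqrt(x.size)
t_int = (x.mean() - t14 * se, x.mean() + t14 * se)

print("percentile:", perc)
print("t interval:", t_int)
```

The percentile interval is typically narrower here because it omits the t correction for estimating the standard error, which is exactly the small-sample undercoverage the article warns about.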
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered in extending residual resampling to regression settings where residuals are not identically distributed (and thus not amenable to bootstrapping); common examples include logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of the data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that correlation in the data is preserved without the need for it to be modelled, a key point of difference compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties compared to competing resampling methods.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects that are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
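The cluster-resampling idea can be sketched for the simplest possible statistic, the mean (hypothetical data; the abstract's actual setting is Cox regression, which this does not attempt):

```python
import random
import statistics

def cluster_bootstrap_se(clusters, stat, n_boot=1000, seed=1):
    """SE of `stat` when whole clusters are resampled with replacement,
    so within-cluster correlation is preserved in every replicate."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resample = [rng.choice(clusters) for _ in range(len(clusters))]
        flat = [x for cluster in resample for x in cluster]
        reps.append(stat(flat))
    return statistics.stdev(reps)

# Toy data with strong cluster effects: the naive i.i.d. SE of the mean
# badly understates the uncertainty that cluster resampling reveals.
clusters = [[0.0] * 4, [10.0] * 4] * 3          # 6 clusters, 24 observations
flat = [x for c in clusters for x in c]
naive_se = statistics.stdev(flat) / len(flat) ** 0.5
cluster_se = cluster_bootstrap_se(clusters, statistics.mean)
```

With perfectly correlated observations inside each cluster, the cluster-bootstrap SE is roughly double the naive one here, mirroring the variance under-estimation the abstract describes.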
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and its results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in 2000.
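A heavily compressed sketch of the core idea, using a single cell's past counts as the reference distribution (function names and data are invented; the paper's model is richer than this):

```python
import random

def surveillance_pvalue(history, current, n_draws=999, seed=6):
    """One-sided p-value for whether today's count in a cell is unusually
    high, using bootstrap draws from that cell's past counts as the
    reference distribution (no population-at-risk data needed)."""
    rng = random.Random(seed)
    at_least = sum(rng.choice(history) >= current for _ in range(n_draws))
    return (at_least + 1) / (n_draws + 1)

past_counts = [0, 1, 2, 1, 0, 3, 1, 2, 0, 1]   # hypothetical cell history
p_hotspot = surveillance_pvalue(past_counts, current=9)   # clearly unusual
p_normal = surveillance_pvalue(past_counts, current=1)    # routine count
```

Because the reference distribution comes from the cell's own history, no administrative population denominators are needed, which is the point the abstract emphasizes.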
Espinar, J.L.
2006-01-01
Questions: What is the observed relationship between biomass and species richness across both spatial and temporal scales in communities of submerged annual macrophytes? Does the number of plots sampled affect detection of the hump-shaped pattern? Location: Doñana National Park, southwestern Spain. Methods: A total of 102 plots were sampled during four hydrological cycles. In each hydrological cycle, the plots were distributed randomly along an environmental flooding gradient in three contrasting microhabitats located in the transition zone just below the upper marsh. In each plot (0.5 m × 0.5 m), plant density and above- and below-ground biomass of submerged vegetation were measured. The hump-shaped model was tested by using a generalized linear model (GLM). A bootstrap procedure was used to test the effect of the number of plots on the ability to detect hump-shaped patterns. Results: The area exhibited low species density, with a range of 1-9 species, and low values of biomass, with a range of 0.2-87.6 g DW/0.25 m². When data from all years and all microhabitats were combined, the relationship between biomass and species richness showed a hump-shaped pattern. The number of plots was large enough to allow detection of the hump-shaped pattern across microhabitats, but it was too small to confirm the hump-shaped pattern within each individual microhabitat. Conclusion: This study provides evidence of hump-shaped patterns across microhabitats when GLM analysis is used. In communities of submerged annual macrophytes in Mediterranean wetlands, the highest species density occurs at intermediate values of biomass. The bootstrap procedure indicates that the number of plots affects the detection of hump-shaped patterns. © IAVS; Opulus Press.
Battaglia, Maurizio; Hill, D.P.
2009-01-01
Joint measurements of ground deformation and micro-gravity changes are an indispensable component for any volcano monitoring strategy. A number of analytical mathematical models are available in the literature that can be used to fit geodetic data and infer source location, depth and density. Bootstrap statistical methods allow estimations of the range of the inferred parameters. Although analytical models often assume that the crust is elastic, homogenous and isotropic, they can take into account different source geometries, the influence of topography, and gravity background noise. The careful use of analytical models, together with high quality data sets, can produce valuable insights into the nature of the deformation/gravity source. Here we present a review of various modeling methods, and use the historical unrest at Long Valley caldera (California) from 1982 to 1999 to illustrate the practical application of analytical modeling and bootstrap to constrain the source of unrest. A key question is whether the unrest at Long Valley since the late 1970s can be explained without calling upon an intrusion of magma. The answer, apparently, is no. Our modeling indicates that the inflation source is a slightly tilted prolate ellipsoid (dip angle between 91° and 105°) at a depth of 6.5 to 7.9 km beneath the caldera resurgent dome, with an aspect ratio between 0.44 and 0.60, a volume change from 0.161 to 0.173 km³ and a density of 1241 to 2093 kg/m³. The larger uncertainty of the density estimate reflects the higher noise of the gravity measurements. These results are consistent with the intrusion of silicic magma with a significant amount of volatiles beneath the caldera resurgent dome. © 2008 Elsevier B.V.
Augmenting Literacy: The Role of Expertise in Digital Writing
ERIC Educational Resources Information Center
Van Ittersum, Derek
2011-01-01
This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…
ERIC Educational Resources Information Center
Stapleton, Laura M.
2008-01-01
This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…
ERIC Educational Resources Information Center
Harrison, David
1979-01-01
The issue of observability and the relative roles of the senses and reason in understanding the world is reviewed. Eastern "mystical" philosophy serves as a focus in which interpretations of quantum mechanics, as well as the current bootstrap-quark controversy are seen in some slightly different contexts. (Author/GA)
Cosmological explosions from cold dark matter perturbations
NASA Technical Reports Server (NTRS)
Scherrer, Robert J.
1992-01-01
The cosmological-explosion model is examined for a universe dominated by cold dark matter in which explosion seeds are produced from the growth of initial density perturbations of a given form. Fragmentation of the exploding shells is dominated by the dark-matter potential wells rather than the self-gravity of the shells, and particular conditions are required for the explosions to bootstrap up to very large scales. The final distribution of dark matter is strongly correlated with the baryons on small scales, but uncorrelated on large scales.
ERIC Educational Resources Information Center
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial from a relatively small sample. In this paper, sample size estimates for comparing two parallel-design arms with continuous data by a bootstrap procedure are presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculations by mathematical formulas (under the normal distribution assumption) for the identical data are also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods on data that violate the normal distribution assumption. To accommodate the feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during the bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that by the bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation at the outset, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
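The bootstrap approach to power (and hence sample size) estimation can be sketched as follows, using a normal-approximation z-test in place of the formal test and invented pilot data:

```python
import math
import random
import statistics

def bootstrap_power(pilot_a, pilot_b, n_per_arm, n_sim=400, seed=2):
    """Estimate power of a two-arm comparison of means by resampling the
    pilot arms; uses a normal-approximation z-test (two-sided, alpha = 0.05)
    purely for illustration."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sim):
        a = [rng.choice(pilot_a) for _ in range(n_per_arm)]
        b = [rng.choice(pilot_b) for _ in range(n_per_arm)]
        se = math.sqrt(statistics.variance(a) / n_per_arm
                       + statistics.variance(b) / n_per_arm)
        z = abs(statistics.mean(a) - statistics.mean(b)) / se
        rejections += z > 1.96
    return rejections / n_sim

# Hypothetical pilot data: arm B shifted by one unit relative to arm A.
pilot_a = [i / 10 for i in range(-15, 16)]
pilot_b = [x + 1.0 for x in pilot_a]
power_alt = bootstrap_power(pilot_a, pilot_b, n_per_arm=30)
power_null = bootstrap_power(pilot_a, pilot_a, n_per_arm=30)
```

In practice one would sweep `n_per_arm` upward until the estimated power crosses the target (say 0.80), which is the bootstrap sample-size procedure the abstract describes.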
Application of the Bootstrap Methods in Factor Analysis.
ERIC Educational Resources Information Center
Ichikawa, Masanori; Konishi, Sadanori
1995-01-01
A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
External heating and current drive source requirements towards steady-state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, F. M.; Kessel, C. E.; Bonoli, P. T.; Batchelor, D. B.; Harvey, R. W.; Snyder, P. B.
2014-07-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with internal transport barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of heating and current drive (H/CD) sources that sustain reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that a combination of electron cyclotron (EC) and lower hybrid (LH) waves is a promising route towards steady state operation in ITER. The LH forms and sustains expanded barriers, and the EC deposition at mid-radius freezes the bootstrap current profile, stabilizing the barrier and leading to confinement levels 50% higher than typical H-mode energy confinement times. Using LH spectra centred on a parallel refractive index of 1.75-1.85, the performance of these plasma scenarios is close to the ITER target of 9 MA non-inductive current, global confinement gain H98 = 1.6 and fusion gain Q = 5.
Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.
Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng
2015-01-01
Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
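A sketch of the bootstrapped indirect-effect test for a simple OLS mediation model (synthetic data; the partial slope is computed via the Frisch-Waugh-Lovell residual trick, and nothing here reproduces the article's simulations):

```python
import math
import random

def ols_slope(x, y):
    """Slope of the simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx

def residuals(x, y):
    """Residuals of y after regressing out x (with intercept)."""
    b = ols_slope(x, y)
    n = len(x)
    a0 = sum(y) / n - b * sum(x) / n
    return [yi - (a0 + b * xi) for xi, yi in zip(x, y)]

def indirect_effect(x, m, y):
    """a*b: a = slope of M on X; b = partial slope of Y on M given X,
    obtained via the Frisch-Waugh-Lovell residual trick."""
    a = ols_slope(x, m)
    b = ols_slope(residuals(x, m), residuals(x, y))
    return a * b

def indirect_percentile_ci(x, m, y, n_boot=2000, alpha=0.05, seed=3):
    """Percentile bootstrap CI for the indirect effect (case resampling)."""
    rng = random.Random(seed)
    n = len(x)
    boots = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        boots.append(indirect_effect([x[i] for i in idx],
                                     [m[i] for i in idx],
                                     [y[i] for i in idx]))
    boots.sort()
    return (boots[int(alpha / 2 * n_boot)],
            boots[int((1 - alpha / 2) * n_boot) - 1])

# Synthetic example with a real indirect path: X -> M (a = 2), M -> Y (b = 3).
x = [i / 10 for i in range(40)]
m = [2 * xi + 0.3 * math.sin(7 * i) for i, xi in enumerate(x)]
y = [3 * mi + 0.3 * math.cos(5 * i) for i, mi in enumerate(m)]
est = indirect_effect(x, m, y)
lo, hi = indirect_percentile_ci(x, m, y)
```

Mediation is declared significant when the percentile interval excludes zero; the abstract's point is that with 20-80 cases this decision rule is far less trustworthy than its popularity suggests.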
Bootstrap confidence levels for phylogenetic trees.
Efron, B; Halloran, E; Holmes, S
1996-07-09
Evolutionary trees are often estimated from DNA or RNA sequence data. How much confidence should we have in the estimated trees? In 1985, Felsenstein [Felsenstein, J. (1985) Evolution 39, 783-791] suggested the use of the bootstrap to answer this question. Felsenstein's method, which in concept is a straightforward application of the bootstrap, is widely used, but has been criticized as biased in the genetics literature. This paper concerns the use of the bootstrap in the tree problem. We show that Felsenstein's method is not biased, but that it can be corrected to better agree with standard ideas of confidence levels and hypothesis testing. These corrections can be made by using the more elaborate bootstrap method presented here, at the expense of considerably more computation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Batchelor, D.B.; Carreras, B.A.; Hirshman, S.P.
Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are 3 and 4 field period low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low-aspect-ratio, and low bootstrap current.
Working Memory Deficits and Social Problems in Children with ADHD
ERIC Educational Resources Information Center
Kofler, Michael J.; Rapport, Mark D.; Bolden, Jennifer; Sarver, Dustin E.; Raiker, Joseph S.; Alderson, R. Matt
2011-01-01
Social problems are a prevalent feature of ADHD and reflect a major source of functional impairment for these children. The current study examined the impact of working memory deficits on parent- and teacher-reported social problems in a sample of children with ADHD and typically developing boys (N = 39). Bootstrapped, bias-corrected mediation…
ERIC Educational Resources Information Center
Pejovic, Jovana; Molnar, Monika
2017-01-01
Recently it has been proposed that sensitivity to nonarbitrary relationships between speech sounds and objects potentially bootstraps lexical acquisition. However, it is currently unclear whether preverbal infants (e.g., before 6 months of age) with different linguistic profiles are sensitive to such nonarbitrary relationships. Here, the authors…
An algebraic approach to the analytic bootstrap
Alday, Luis F.; Zhiboedov, Alexander
2017-04-27
We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. Here, we analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.
Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime
Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; ...
2016-06-20
Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady-state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long-pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of β_p and β_N despite strong ITBs. Good confinement has been achieved with reduced toroidal rotation. These high-β_p plasmas challenge the energy transport understanding, especially in the electron energy channel. A new turbulent transport model, named TGLF-SAT1, has been developed, which improves the transport prediction. Experiments extending these results to long pulse on EAST, based on the physics basis developed at DIII-D, have been conducted. Finally, more investigations will be carried out on EAST as additional auxiliary power comes online in the near term.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructs the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
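A compressed sketch of the idea for an exponential model (binning, names, and data are invented; the paper's construction is general): the expected counts use the MLE of a bootstrap resample rather than of the original data.

```python
import math
import random

def pearson_bootstrap_stat(data, edges, seed=7):
    """Pearson X^2 for an exponential fit, but with the expected bin counts
    computed from the MLE of a bootstrap resample (rate = 1/bootstrap mean),
    following the idea of plugging a bootstrap-sample MLE into the statistic."""
    rng = random.Random(seed)
    n = len(data)
    boot = [rng.choice(data) for _ in range(n)]
    rate = n / sum(boot)                      # exponential MLE from resample
    cdf = lambda t: 1.0 - math.exp(-rate * t)
    observed = [sum(lo <= x < hi for x in data)
                for lo, hi in zip(edges, edges[1:])]
    expected = [n * (cdf(hi) - cdf(lo)) for lo, hi in zip(edges, edges[1:])]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Deterministic "exponential-like" data via inverse-CDF of an even grid.
n = 200
data = [-math.log(1 - (i + 0.5) / n) for i in range(n)]
edges = [0.0, 0.35, 0.8, 1.4, 2.3, float("inf")]
x2 = pearson_bootstrap_stat(data, edges)
```

The claim being sketched is that, with the bootstrap-sample MLE, the resulting statistic is referred to an ordinary chi-squared distribution, avoiding the intractable null distribution of the original-data-MLE version.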
High-beta steady-state research with integrated modeling in the JT-60 Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozeki, T.
2007-05-15
Improvement of high-beta performance and its long sustainment was obtained with ferritic steel tiles in the JT-60 Upgrade (JT-60U) [T. Fujita et al., Phys. Plasmas 50, 104 (2005)], which were installed inside the vacuum vessel to reduce fast-ion loss by decreasing the toroidal field ripple. When the separation between the plasma surface and the wall was small, high-beta plasmas reached the ideal-wall stability limit, i.e., the ideal magnetohydrodynamics stability limit with wall stabilization. A small rotation velocity of 0.3% of the Alfvén velocity was found to be effective for suppressing the resistive wall mode. Sustainment of the high normalized beta value of β_N = 2.3 has been extended to 28.6 s (≈15 times the current diffusion time) by improvement of the confinement and an increase in the net heating power. Based on the research in JT-60U experiments and first-principles simulations, integrated models of the core, edge pedestal, and scrape-off-layer (SOL) divertor were developed, and they clarified complex features of reactor-relevant plasmas. The integrated core-plasma model indicated that an electron cyclotron (EC) current density as small as about half the bootstrap current density could effectively stabilize the neoclassical tearing mode, provided the localized EC current is accurately aligned to the magnetic island center. The integrated edge-pedestal model clarified that the collisionality dependence of the energy loss due to the edge-localized mode was caused by the change in the width of the unstable mode and in the SOL transport. The integrated SOL-divertor model clarified the effect of the exhaust slot on the pumping efficiency and the cause of enhanced radiation near the X-point (multifaceted asymmetric radiation from the edge, MARFE). Success in these consistent analyses using the integrated code indicates that it is an effective means to investigate complex plasmas and to control the integrated performance.
Bootstrap Estimates of Standard Errors in Generalizability Theory
ERIC Educational Resources Information Center
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?
ERIC Educational Resources Information Center
Thompson, Bruce
Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…
Investigation of geomagnetic induced current at high latitude during the storm-time variation
NASA Astrophysics Data System (ADS)
Falayi, E. O.; Ogunmodimu, O.; Bolaji, O. S.; Ayanda, J. D.; Ojoniyi, O. S.
2017-06-01
During geomagnetic disturbances, geomagnetically induced currents (GICs) are driven by the geoelectric field in the conductive Earth. In this paper, we studied the variability of GICs, the time derivative of the geomagnetic field (dB/dt), the geomagnetic indices SYM-H (symmetric disturbance field in H), AU (eastward electrojet) and AL (westward electrojet), and interplanetary parameters such as solar wind speed (v) and interplanetary magnetic field (Bz) during the geomagnetic storms of 31 March 2001, 21 October 2001, 6 November 2001, 29 October 2003, 31 October 2003 and 9 November 2004, all associated with high solar wind speed due to coronal mass ejections. A wavelet-spectrum approach was employed to analyze the GIC time series over time scales of one to twenty-four hours. Power was concentrated at 14-24 h on 31 March 2001, 17-24 h on 21 October 2001, and 1-7 h on 6 November 2001; two peaks were observed, at 5-8 h and 21-24 h, on 29 October 2003, with further concentrations at 1-3 h on 31 October 2003 and 18-22 h on 9 November 2004. The bootstrap method was used to obtain regression correlations between the time derivative of the geomagnetic field (dB/dt) and the observed GIC values for the six storms, yielding clusters of correlation coefficients at around r = -0.567, -0.717, -0.477, -0.419, -0.210 and -0.488, respectively. We observed that intervals with high wavelet energy coefficients correlate well under the bootstrap, while low-energy wavelet coefficients give low bootstrap correlations. Geomagnetic storms thus influence GICs and the geomagnetic field derivative (dB/dt), which might be ascribed to coronal mass ejections and fast solar wind arising from particle acceleration processes in the solar atmosphere.
Transport in the plateau regime in a tokamak pedestal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seol, J.; Shaing, K. C.
In a tokamak H-mode, a strong E × B flow shear is generated during the L-H transition. Turbulence in a pedestal is suppressed significantly by this E × B flow shear. In this case, neoclassical transport may become important. The neoclassical fluxes are calculated in the plateau regime with the parallel plasma flow using their kinetic definitions. In an axisymmetric tokamak, the neoclassical particle fluxes can be decomposed into the banana-plateau flux and the Pfirsch-Schlüter flux. The banana-plateau particle flux is driven by the parallel viscous force and the Pfirsch-Schlüter flux by the poloidal variation of the friction force. The combined quantity of the radial electric field and the parallel flow is determined by the flux-surface-averaged parallel momentum balance equation rather than by requiring ambipolarity of the total particle fluxes. In this process, the Pfirsch-Schlüter flux does not appear in the flux-surface-averaged parallel momentum equation. Only the banana-plateau flux is used to determine the parallel flow, in the form of the flux-surface-averaged parallel viscosity. The heat flux, obtained using the solution of the parallel momentum balance equation, decreases exponentially in the presence of sonic M_p without any enhancement over that in the standard neoclassical theory. Here, M_p is a combination of the poloidal E × B flow and the parallel mass flow. The neoclassical bootstrap current in the plateau regime is presented. It indicates that the neoclassical bootstrap current is also related only to the banana-plateau fluxes. Finally, transport fluxes are calculated when M_p is large enough to make the parallel electron viscosity comparable with the parallel ion viscosity. It is found that the bootstrap current has a finite value regardless of the magnitude of M_p.
Visceral sensitivity, anxiety, and smoking among treatment-seeking smokers.
Zvolensky, Michael J; Bakhshaie, Jafar; Norton, Peter J; Smits, Jasper A J; Buckner, Julia D; Garey, Lorra; Manning, Kara
2017-12-01
It is widely recognized that smoking is related to abdominal pain and discomfort, as well as gastrointestinal disorders. Research has shown that visceral sensitivity, i.e., experiencing anxiety around gastrointestinal sensations, is associated with poorer gastrointestinal health and related health outcomes. Visceral sensitivity also increases anxiety symptoms and mediates the relation with other risk factors, including gastrointestinal distress. No work to date, however, has evaluated visceral sensitivity in the context of smoking despite the strong association between smoking and poor physical and mental health. The current study sought to examine visceral sensitivity as a unique predictor of cigarette dependence, threat-related smoking abstinence expectancies (somatic symptoms and harmful consequences), and perceived barriers for cessation via anxiety symptoms. Eighty-four treatment-seeking adult daily smokers (M age = 45.1 years, SD = 10.4; 71.6% male) participated in this study. There was a statistically significant indirect effect of visceral sensitivity via general anxiety symptoms on cigarette dependence (b=0.02, SE=0.01, Bootstrapped 95% CI [0.006, 0.05]), smoking abstinence somatic expectancies (b=0.10, SE=0.03, Bootstrapped 95% CI [0.03, 0.19]), smoking abstinence harmful experiences (b=0.13, SE=0.05, Bootstrapped 95% CI [0.03, 0.25]), and barriers to cessation (b=0.05, SE=0.06, Bootstrapped 95% CI [0.01, 0.13]). Overall, the present study serves as an initial investigation into the nature of the associations between visceral sensitivity, anxiety symptoms, and clinically significant smoking processes among treatment-seeking smokers. Future work is needed to explore the extent to which anxiety accounts for relations between visceral sensitivity and other smoking processes (e.g., withdrawal, cessation outcome). Copyright © 2017 Elsevier Ltd. All rights reserved.
Unbiased Estimates of Variance Components with Bootstrap Procedures
ERIC Educational Resources Information Center
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
Explorations in Statistics: the Bootstrap
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
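In the spirit of that active exploration, the bootstrap's core idea, resampling the observed data with replacement and recomputing the statistic each time, can be sketched in a few lines of Python (illustrative data and parameters of our choosing, not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=50.0, scale=10.0, size=30)  # illustrative data

# Draw B resamples with replacement and record the mean of each.
B = 5000
boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                       for _ in range(B)])

# The spread of boot_means approximates the sampling variability of the mean.
se_boot = boot_means.std(ddof=1)
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```

The percentile interval used here is the simplest variant; bias-corrected refinements (BC, BCa) adjust its endpoints.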
Bootstrapping Confidence Intervals for Robust Measures of Association.
ERIC Educational Resources Information Center
King, Jason E.
A Monte Carlo simulation study was conducted to determine the bootstrap correction formula yielding the most accurate confidence intervals for robust measures of association. Confidence intervals were generated via the percentile, adjusted, BC, and BC(a) bootstrap procedures and applied to the Winsorized, percentage bend, and Pearson correlation…
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
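A minimal sketch of the residual-resampling idea behind such bootstrap prediction intervals, using a Nadaraya-Watson smoother as a stand-in for the paper's nonparametric regression model (the data, bandwidth, and function names here are illustrative assumptions, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)

def nw_smooth(x_train, y_train, x_eval, h=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

x = np.sort(rng.uniform(0, 3, 120))
y = np.sin(x) + rng.normal(0, 0.2, x.size)

fit = nw_smooth(x, y, x)
resid = y - fit

# Residual-resampling bootstrap: rebuild pseudo-data, refit, and add a
# resampled noise term to mimic a new observation at x0.
x0 = np.array([1.5])
B = 1000
preds = np.empty(B)
for b in range(B):
    y_star = fit + rng.choice(resid, size=resid.size, replace=True)
    preds[b] = nw_smooth(x, y_star, x0)[0] + rng.choice(resid)

lo, hi = np.percentile(preds, [2.5, 97.5])
# An observed output far outside (lo, hi) would be flagged as anomalous.
```

No distributional assumption on the noise is needed, which is the property the paper exploits for anomaly detection.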
Parameter exploration for a Compact Advanced Tokamak DEMO
NASA Astrophysics Data System (ADS)
Weisberg, D. B.; Buttery, R. J.; Ferron, J. R.; Garofalo, A. M.; Snyder, P. B.; Turnbull, A. D.; Holcomb, C. T.; McClenaghan, J.; Canik, J.; Park, J.-M.
2017-10-01
A new parameter study has explored a range of design points to assess the physics feasibility of a compact 200 MWe advanced tokamak DEMO that combines high beta (βN < 4) and high toroidal field (BT = 6-7 T). A unique aspect of this study is the use of the FASTRAN modeling suite, which combines integrated transport, pedestal, stability, and heating & current drive calculations to predict steady-state solutions with neutral beam and helicon current drive. This study has identified a range of design solutions in a compact (R0 = 4 m), high-field (BT = 6-7 T), strongly-shaped (κ = 2, δ = 0.6) device. Unlike previous proposals, C-AT DEMO takes advantage of high-beta operation as well as emerging advances in magnet technology to demonstrate net electric production in a moderately sized machine. We present results showing that the large bootstrap fraction and low recirculating power enabled by high normalized beta can achieve tolerable heat and neutron loads with good H-mode access. The prediction of operating points with simultaneously achieved high confinement (H98 < 1.3), high density (fGW < 1.3), and high beta warrants additional assessment of this approach towards a cost-attractive DEMO device. Work supported by the US DOE under DE-FC02-04ER54698.
Schneider, Kevin; Koblmüller, Stephan; Sefc, Kristina M
2015-11-11
The homoplasy excess test (HET) is a tree-based screen for hybrid taxa in multilocus nuclear phylogenies. Homoplasy between a hybrid taxon and the clades containing the parental taxa reduces bootstrap support in the tree. The HET is based on the expectation that excluding the hybrid taxon from the data set increases the bootstrap support for the parental clades, whereas excluding non-hybrid taxa has little effect on statistical node support. To carry out a HET, bootstrap trees are calculated with taxon-jackknife data sets, that is, excluding one taxon (species, population) at a time. An excess increase in bootstrap support for certain nodes upon exclusion of a particular taxon indicates the hybrid (the excluded taxon) and its parents (the clades with increased support). We introduce a new software program, hext, which generates the taxon-jackknife data sets, runs the bootstrap tree calculations, and identifies excess bootstrap increases as outlier values in boxplot graphs. hext is written in the R language and accepts binary data (0/1; e.g. AFLP) as well as co-dominant SNP and genotype data. We demonstrate the usefulness of hext on large SNP data sets containing putative hybrids and their parents. For instance, using published data for the genus Vitis (~6,000 SNP loci), hext output supports V. × champinii as a hybrid between V. rupestris and V. mustangensis. With simulated SNP and AFLP data sets, excess increases in bootstrap support were not always connected with the hybrid taxon (false positives), whereas the expected bootstrap signal failed to appear on several occasions (false negatives). Potential causes for both types of spurious results are discussed. With both empirical and simulated data sets, the taxon-jackknife output generated by hext provided additional signatures of hybrid taxa, including changes in tree topology across trees, consistent effects of exclusions of the hybrid and the parent taxa, and moderate (rather than excessive) increases in bootstrap support.
hext significantly facilitates the taxon-jackknife approach to hybrid taxon detection, even though the simple test for excess bootstrap increase may not reliably identify hybrid taxa in all applications.
Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.
ERIC Educational Resources Information Center
Thompson, Bruce; Fan, Xitao
This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
Unconventional Expressions: Productive Syntax in the L2 Acquisition of Formulaic Language
ERIC Educational Resources Information Center
Bardovi-Harlig, Kathleen; Stringer, David
2017-01-01
This article presents a generative analysis of the acquisition of formulaic language as an alternative to current usage-based proposals. One influential view of the role of formulaic expressions in second language (L2) development is that they are a bootstrapping mechanism into the L2 grammar; an initial repertoire of constructions allows for…
Exploration of the Super H-mode regime on DIII-D and potential advantages for burning plasma devices
Solomon, W. M.; Snyder, P. B.; Bortolon, A.; ...
2016-03-25
A new high pedestal regime ("Super H-mode") has been predicted and accessed on DIII-D. Super H-mode was first achieved on DIII-D using a quiescent H-mode edge, enabling a smooth trajectory through pedestal parameter space. By exploiting Super H-mode, it has been possible to access high pedestal pressures at high normalized densities. While elimination of edge-localized modes (ELMs) is beneficial for Super H-mode, it may not be a requirement, as recent experiments have maintained high pedestals with ELMs triggered by lithium granule injection. Simulations using TGLF for core transport and the EPED model for the pedestal find that ITER can benefit from the improved performance associated with Super H-mode, with increased values of fusion power and gain possible. Similar studies demonstrate that the Super H-mode pedestal can be advantageous for a steady-state power plant, by providing a path to increasing the bootstrap current while simultaneously reducing the demands on core physics performance.
Development of a repetitive compact torus injector
NASA Astrophysics Data System (ADS)
Onchi, Takumi; McColl, David; Dreval, Mykola; Rohollahi, Akbar; Xiao, Chijin; Hirose, Akira; Zushi, Hideki
2013-10-01
A system for Repetitive Compact Torus Injection (RCTI) has been developed at the University of Saskatchewan. CTI is a promising fuelling technology to directly fuel the core region of tokamak reactors. In addition to fuelling, CTI also has the potential for (a) optimization of the density profile, and thus the bootstrap current, and (b) momentum injection. For steady-state reactor operation, RCTI is necessary. The approach to RCTI is to charge a storage capacitor bank with a large capacitance and quickly charge the CT capacitor bank through a stack of insulated-gate bipolar transistors (IGBTs). When the CT bank is fully charged, the IGBT stack is turned off to isolate the banks, and the CT formation/acceleration sequence starts. After formation of each CT, the fast bank is replenished and a new CT is formed and accelerated. Circuits for formation and acceleration in the University of Saskatchewan CT Injector (USCTI) have been modified. Three CT shots at 10 Hz or eight shots at 1.7 Hz have been achieved. This work has been sponsored by the CRC and NSERC, Canada.
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
ERIC Educational Resources Information Center
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
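The comparison the students ran can be reproduced in flavor on symmetric data with a few lines (simulated normal data of our choosing; the study's own designs and sample sizes are not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.normal(100, 15, size=25)  # symmetric data, n = 25

# Student's t confidence interval for the mean.
m, se = data.mean(), stats.sem(data)
t_lo, t_hi = stats.t.interval(0.95, df=data.size - 1, loc=m, scale=se)

# Percentile bootstrap confidence interval for the mean.
boot = np.array([rng.choice(data, data.size, replace=True).mean()
                 for _ in range(4000)])
b_lo, b_hi = np.percentile(boot, [2.5, 97.5])
```

On symmetric data like this the two intervals are close, with the t interval slightly wider at small n, consistent with the simulation's conclusion.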
The Beginner's Guide to the Bootstrap Method of Resampling.
ERIC Educational Resources Information Center
Lane, Ginny G.
The bootstrap method of resampling can be useful in estimating the replicability of study results. The bootstrap procedure creates a mock population from a given sample of data from which multiple samples are then drawn. The method extends the usefulness of the jackknife procedure as it allows for computation of a given statistic across a maximal…
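The relationship between the jackknife and the bootstrap mentioned here can be illustrated with the sample mean, for which the jackknife standard error reduces exactly to s/√n (toy data of our choosing):

```python
import numpy as np

data = np.array([12.0, 15.0, 9.0, 18.0, 14.0, 11.0, 16.0, 13.0])
n = data.size

# Jackknife: recompute the statistic n times, each time leaving one value out.
jack = np.array([np.delete(data, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * ((jack - jack.mean()) ** 2).sum())

# The bootstrap generalizes this by drawing many resamples with replacement,
# which works for statistics where the jackknife breaks down (e.g. quantiles).
rng = np.random.default_rng(3)
boot = np.array([rng.choice(data, n, replace=True).mean() for _ in range(5000)])
se_boot = boot.std(ddof=1)
```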
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
ERIC Educational Resources Information Center
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
ERIC Educational Resources Information Center
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes, such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead, conditioned on the current reservoir levels and streamflows. The flow sequences are generated using a stochastic streamflow model with random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and that it allows incorporation of long-range weather forecasts into the analysis.
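A stripped-down sketch of the resampling step in bootstrap position analysis, with a lag-1 autoregressive log-flow model standing in for the stochastic streamflow model (synthetic flows and an arbitrary threshold; nothing here is from the actual system studied):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical monthly log-flow record (illustrative values, not real data).
log_q = np.log(rng.gamma(shape=4.0, scale=25.0, size=240))

# Fit a lag-1 autoregressive model and keep its residuals ("innovations").
x = log_q - log_q.mean()
phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
innov = x[1:] - phi * x[:-1]

# Generate many 12-month traces by resampling innovations with replacement,
# starting each trace from the current (most recent) flow state.
n_traces, horizon = 2000, 12
traces = np.empty((n_traces, horizon))
for i in range(n_traces):
    state = x[-1]
    for t in range(horizon):
        state = phi * state + rng.choice(innov)
        traces[i, t] = np.exp(state + log_q.mean())

threshold = 40.0  # hypothetical statutory passing flow
p_below = (traces.min(axis=1) < threshold).mean()
```

In a full position analysis each trace would be fed through the reservoir simulation; here `p_below` simply estimates the probability that flow drops below the threshold within a year.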
Bootstrap position analysis for forecasting low flow frequency
Tasker, Gary D.; Dunne, P.
1997-01-01
A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff to be used in a position analysis model for a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and accounting for parameter uncertainty is easily done. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules. © ASCE.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations at small sample sizes. We used a pooled method in the nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared nonparametric bootstrap tests with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining the type I error probability for all conditions except the Cauchy and extreme-variance lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
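The pooled-resampling idea can be sketched for a two-sample comparison as follows; pooling the groups before resampling imposes the null hypothesis of a common distribution (simulated lognormal data of our choosing, not the study's simulation design):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
a = rng.lognormal(0.0, 0.5, size=8)   # small non-normal samples
b = rng.lognormal(0.4, 0.5, size=8)

t_obs, _ = stats.ttest_ind(a, b, equal_var=False)

# Pooled bootstrap: resample both groups from the combined data, which
# enforces the null hypothesis that the groups share one distribution.
pooled = np.concatenate([a, b])
B = 5000
t_null = np.empty(B)
for i in range(B):
    a_star = rng.choice(pooled, a.size, replace=True)
    b_star = rng.choice(pooled, b.size, replace=True)
    t_null[i], _ = stats.ttest_ind(a_star, b_star, equal_var=False)

# Two-sided p-value: fraction of null t statistics at least as extreme.
p_value = (np.abs(t_null) >= abs(t_obs)).mean()
```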
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
Non-inductive current drive and transport in high βN plasmas in JET
NASA Astrophysics Data System (ADS)
Voitsekhovitch, I.; Alper, B.; Brix, M.; Budny, R. V.; Buratti, P.; Challis, C. D.; Ferron, J.; Giroud, C.; Joffrin, E.; Laborde, L.; Luce, T. C.; McCune, D.; Menard, J.; Murakami, M.; Park, J. M.; JET-EFDA contributors
2009-05-01
A route to stationary MHD-stable operation at high βN has been explored at the Joint European Torus (JET) by optimizing the current ramp-up, the heating start time and the waveform of the neutral beam injection (NBI) power. In these scenarios the current ramp-up has been accompanied by plasma pre-heat (or the NBI has been started before the current flat-top) and NBI power up to 22 MW has been applied during the current flat-top. In the discharges considered, transient total βN ≈ 3.3 and stationary (during the high power phase) βN ≈ 3 have been achieved by applying feedback control of βN with the NBI power in configurations with a monotonic or flat core safety factor profile and without an internal transport barrier (ITB). The transport and current drive in this scenario are analysed here using the TRANSP and ASTRA codes. The interpretative analysis performed with TRANSP shows that 50-70% of the current is driven non-inductively; half of this current is due to the bootstrap current, which has a broad profile since an ITB was deliberately avoided. The GLF23 transport model predicts the temperature profiles within a ±22% discrepancy with the measurements over the explored parameter space. Predictive simulations with this model show that the E × B rotational shear plays an important role for thermal ion transport in this scenario, producing up to a 40% increase of the ion temperature. By applying transport and current drive models validated in self-consistent simulations of given reference scenarios in a wider parameter space, the requirements for fully non-inductive stationary operation at JET are estimated. It is shown that the strong stiffness of the temperature profiles predicted by the GLF23 model restricts the bootstrap current at larger heating power. In this situation full non-inductive operation without an ITB can be rather expensive, relying strongly on the external non-inductive current drive sources.
NASA Astrophysics Data System (ADS)
Coupon, Jean; Leauthaud, Alexie; Kilbinger, Martin; Medezinski, Elinor
2017-07-01
SWOT (Super W Of Theta) computes two-point statistics for very large data sets, based on “divide and conquer” algorithms: mainly, but not limited to, data storage in binary trees, approximation at large scale, parallelization (Open MPI), and bootstrap and jackknife resampling methods “on the fly”. It currently supports projected and 3D galaxy auto- and cross-correlations, galaxy-galaxy lensing, and weighted histograms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Brooks; A.H. Reiman; G.H. Neilson
High-beta, low-aspect-ratio (compact) stellarators are promising solutions to the problem of developing a magnetic plasma configuration for magnetic fusion power plants that can be sustained in steady state without disrupting. These concepts combine features of stellarators and advanced tokamaks and have aspect ratios similar to those of tokamaks (2-4). They are based on computed plasma configurations that are shaped in three dimensions to provide desired stability and transport properties. Experiments are planned as part of a program to develop this concept. A beta = 4% quasi-axisymmetric plasma configuration has been evaluated for the National Compact Stellarator Experiment (NCSX). It has a substantial bootstrap current and is shaped to stabilize ballooning, external kink, vertical, and neoclassical tearing modes without feedback or close-fitting conductors. Quasi-omnigeneous plasma configurations stable to ballooning modes at beta = 4% have been evaluated for the Quasi-Omnigeneous Stellarator (QOS) experiment. These equilibria have relatively low bootstrap currents and are insensitive to changes in beta. Coil configurations have been calculated that reconstruct these plasma configurations, preserving their important physics properties. Theory- and experiment-based confinement analyses are used to evaluate the technical capabilities needed to reach target plasma conditions. The physics basis for these complementary experiments is described.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission in bivariate as well as multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences that are insensitive to the time span as well as the lag length used. The empirical results show that there is unidirectional causality running from energy consumption to carbon emission in both the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
L-H transition and pedestal studies on MAST
NASA Astrophysics Data System (ADS)
Meyer, H.; De Bock, M. F. M.; Conway, N. J.; Freethy, S. J.; Gibson, K.; Hiratsuka, J.; Kirk, A.; Michael, C. A.; Morgan, T.; Scannell, R.; Naylor, G.; Saarelma, S.; Saveliev, A. N.; Shevchenko, V. F.; Suttrop, W.; Temple, D.; Vann, R. G. L.; the MAST and NBI Teams
2011-11-01
On MAST, studies of the profile evolution of the electron temperature (Te), electron density (ne) and radial electric field (Er), as well as novel measurements of the ion temperature (Ti) and toroidal current density (jphi) in the pedestal region, allow further insight into the processes forming and defining the pedestal, such as the H-mode access conditions and MHD stability. This includes studies of the fast evolution of Te, ne and Er with Δt = 0.2 ms time resolution and the evolution of pe and jphi through an edge-localized mode (ELM) cycle. Measurements of the H-mode power threshold, PL-H, revealed that about 40% more power is required to access H-mode in 4He than in D and that a change in the Z-position of the X-point can change PL-H significantly in single and double null configurations. The profile measurements in the L-mode phase prior to H-mode suggest that neither the gradient nor the value of the mean Te or Er at the plasma edge play a major role in triggering the L-H transition. After the transition, first the fluctuations are suppressed, then the Er shear layer and the ne pedestal develop, followed by the Te pedestal. In the banana regime at low collisionality (ν*), ∇Ti ≈ 0, leading to Ti > Te in the pedestal region with Ti ~ 0.3 keV close to the separatrix. A clear correlation of ∇Ti with ν* is observed. The measured jphi (using the motional Stark effect), Te and ne are in broad agreement with the common peeling-ballooning stability picture for ELMs and neoclassical calculations of the bootstrap current. The jphi and ∇pe evolution (Δt ≈ 2 ms) as well as profiles in discharges with counter-current neutral beam injection raise questions with respect to this edge stability picture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessel, C. E.; Poli, F. M.
2014-03-04
The conservative physics and conservative technology tokamak power plant ARIES-ACT2 has a major radius of 9.75 m at an aspect ratio of 4.0, with strong shaping: elongation of 2.2 and triangularity of 0.63. The no-wall βN reaches ~2.4, limited by the n=1 external kink mode, and can be extended to 3.2 with a stabilizing shell behind the ring structure shield. The bootstrap current fraction is 77% with a q95 of 8.0, requiring about ~4.0 MA of external current drive. This current is supplied with 30 MW of ICRF/FW and 80 MW of negative ion NB. Up to 1.0 MA can be driven with LH with no wall, and 1.5 MA or more can be driven with a stabilizing shell. EC was examined and is most effective for safety factor control over ρ ~ 0.2-0.6 with 20 MW. The pedestal density is ~0.65×10^20 m^-3 and the temperature is ~9.0 keV. The H98 factor is 1.25, n/nGr = 1.3, and the ratio of net power to L-H threshold power is 1.3-1.4 in the flattop. Due to the high toroidal field and high central temperature, the cyclotron radiation loss was found to be high, depending on the first wall reflectivity.
NASA Astrophysics Data System (ADS)
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
Confidence limit calculation for antidotal potency ratio derived from lethal dose 50
Manage, Ananda; Petrikovics, Ilona
2013-01-01
AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method, invented by Efron, can easily be adapted to construct confidence intervals in situations like this. The bootstrap method is a resampling method in which bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require knowledge of the sampling distribution of the antidotal potency ratio. This can serve as substantial help for toxicologists, who are directed to employ the Dixon up-and-down method with lower numbers of animals to determine lethal dose 50 values for characterizing the investigated toxic molecules, and eventually for characterizing the antidotal protection provided by the test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes it easy to do the calculation with most programming software packages. PMID:25237618
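A minimal illustration of the described calculation, bootstrapping the ratio of mean LD50 estimates with and without antidote (the numbers are hypothetical placeholders, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical per-experiment LD50 estimates (mg/kg), not real data.
ld50_control = np.array([5.1, 4.8, 5.5, 5.0, 4.6, 5.3])
ld50_antidote = np.array([9.8, 10.5, 9.1, 11.0, 10.2, 9.6])

# Resample each group independently and recompute the ratio each time;
# no assumption about the ratio's sampling distribution is needed.
B = 10000
ratios = np.empty(B)
for i in range(B):
    c = rng.choice(ld50_control, ld50_control.size, replace=True)
    a = rng.choice(ld50_antidote, ld50_antidote.size, replace=True)
    ratios[i] = a.mean() / c.mean()   # antidotal potency ratio

lo, hi = np.percentile(ratios, [2.5, 97.5])
```

An interval that excludes 1 indicates a protective effect of the antidote.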
Topics in Statistical Calibration
2014-03-27
on a parametric bootstrap where, instead of sampling directly from the residuals, samples are drawn from a normal distribution. This procedure will...addition to centering them (Davison and Hinkley, 1997). When there are outliers in the residuals, the bootstrap distribution of x̂0 can become skewed or...based and inversion methods using the linear mixed-effects model. Then, a simple parametric bootstrap algorithm is proposed that can be used to either
Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, Martijn W; van Buuren, Stef; Knol, Dirk L; van Mechelen, Willem; de Vet, Henrica CW
2007-01-01
Background Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection. Method In our prospective cohort study we merged data from three different randomized controlled trials (RCTs) to assess prognostic variables for chronicity of low back pain. Among the outcome and prognostic variables, data were missing in the range of 0 to 48.1%. We used four methods to investigate the influence of sampling and imputation variation, respectively: MI only, bootstrap only, and two methods that combine MI and bootstrapping. Variables were selected based on the inclusion frequency of each prognostic variable, i.e. the proportion of times that the variable appeared in the model. The discriminative and calibrative abilities of prognostic models developed by the four methods were assessed at different inclusion levels. Results We found that the effect of imputation variation on the inclusion frequency was larger than the effect of sampling variation. When MI and bootstrapping were combined over the range of 0% (full model) to 90% variable selection, bootstrap-corrected c-index values of 0.70 to 0.71 and slope values of 0.64 to 0.86 were found. Conclusion We recommend accounting for both imputation and sampling variation in sets of missing data. The new procedure of combining MI with bootstrapping for variable selection results in multivariable prognostic models with good performance and is therefore attractive to apply to data sets with missing values. PMID:17629912
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different times or sample sizes (N = 500; ...; 1000), then applied to the conventional bootstrap approach and variations of this method, such as: 'nearest neighbor bootstrap'; and 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); 7-day 10-year low-flow statistic (Q7,10); and presumptive standard (80% of the natural monthly mean ?ow). The bootstrap technique has been also used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed averaging the EFR values of the five methods and using the entire streamflow record at monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to overview of the performance differences between the EFR methods. 
The uncertainties arising from the assessment of EFR methods are then propagated through water security indicators of water scarcity and vulnerability, seeking to provide meaningful support to end-users and water managers incorporating uncertainty into the decision-making process.
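The moving-blocks bootstrap mentioned above can be sketched as follows. This is a minimal illustration on a synthetic lognormal flow series, not the study's implementation; the block length, the number of replicates, and the definition of Q90 as the flow exceeded 90% of the time are assumptions of this sketch.

```python
import numpy as np

def moving_blocks_bootstrap(series, block_len, n_boot, rng):
    """Rebuild pseudo-series by concatenating randomly chosen overlapping blocks."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, n_blocks))
    return np.stack([
        np.concatenate([series[s:s + block_len] for s in row])[:n]
        for row in starts
    ])

def q_exceed(series, p):
    """Flow exceeded p% of the time, e.g. Q90 is the 10th percentile."""
    return np.percentile(series, 100 - p)

rng = np.random.default_rng(42)
flow = rng.lognormal(mean=1.0, sigma=0.6, size=365)   # synthetic daily flows
reps = moving_blocks_bootstrap(flow, block_len=30, n_boot=500, rng=rng)
q90_ci = np.percentile([q_exceed(r, 90) for r in reps], [2.5, 97.5])
```

Blockwise resampling preserves short-range serial dependence that an ordinary (i.i.d.) bootstrap of daily flows would destroy, which is why the study compares both variants.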
Flynn-Evans, Erin E.; Lockley, Steven W.
2016-01-01
Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold-standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the ability of 40 variables to predict N24HSWD among 127 blind women, classified using urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%; the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% of those with some degree of light perception, would be referred for further screening for N24HSWD.
Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421
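The bootstrapped forward-selection procedure described above can be sketched on synthetic data rather than the study's survey variables. The Newton-Raphson logistic fit, the chi-square(1) 5% cutoff of 3.841 for the likelihood-ratio inclusion test, and the simulated effect sizes are all assumptions of this illustration.

```python
import numpy as np

def fit_logit(X, y, n_iter=25):
    """Logistic regression by Newton-Raphson; X already contains an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
        H = (X.T * (p * (1 - p))) @ X + 1e-8 * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
    ll = np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return beta, ll

def forward_select(X, y, lr_cutoff=3.841):
    """Greedy forward selection; keep adding the best predictor while its
    likelihood-ratio statistic exceeds the chi-square(1) 5% cutoff."""
    n, k = X.shape
    chosen, remaining = [], list(range(k))
    _, ll_cur = fit_logit(np.ones((n, 1)), y)
    while remaining:
        lls = [fit_logit(np.column_stack([np.ones(n), X[:, chosen + [j]]]), y)[1]
               for j in remaining]
        best = int(np.argmax(lls))
        if 2 * (lls[best] - ll_cur) < lr_cutoff:
            break
        ll_cur = lls[best]
        chosen.append(remaining.pop(best))
    return chosen

rng = np.random.default_rng(0)
n, k, n_boot = 300, 6, 100
X = rng.normal(size=(n, k))
logit = 1.5 * X[:, 0] - 1.0 * X[:, 3]              # only predictors 0 and 3 matter
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

freq = np.zeros(k)                                  # bootstrap selection frequencies
for _ in range(n_boot):
    idx = rng.integers(0, n, n)
    freq[forward_select(X[idx], y[idx])] += 1
freq /= n_boot
```

Screening predictors by their selection frequency across bootstrap replicates, as the study does in two rounds, guards against variables that only enter the model in one lucky sample.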
Counting conformal correlators
NASA Astrophysics Data System (ADS)
Kravchuk, Petr; Simmons-Duffin, David
2018-02-01
We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.
NASA Astrophysics Data System (ADS)
Stefanikova, E.; Frassinetti, L.; Saarelma, S.; Loarte, A.; Nunes, I.; Garzotti, L.; Lomas, P.; Rimini, F.; Drewelow, P.; Kruezi, U.; Lomanowski, B.; de la Luna, E.; Meneses, L.; Peterka, M.; Viola, B.; Giroud, C.; Maggi, C.; contributors, JET
2018-05-01
The electron temperature and density pedestals tend to vary in their relative radial positions, as observed in DIII-D (Beurskens et al 2011 Phys. Plasmas 18 056120) and ASDEX Upgrade (Dunne et al 2017 Plasma Phys. Control. Fusion 59 14017). This so-called relative shift has an impact on the pedestal magnetohydrodynamic (MHD) stability and hence on the pedestal height (Osborne et al 2015 Nucl. Fusion 55 063018). The present work studies the effect of the relative shift on pedestal stability of JET ITER-like wall (JET-ILW) baseline low triangularity (δ) unseeded plasmas, and similar JET-C discharges. As shown in this paper, the increase of the pedestal relative shift is correlated with the reduction of the normalized pressure gradient, therefore playing a strong role in pedestal stability. Furthermore, JET-ILW tends to have a larger relative shift compared to JET carbon wall (JET-C), suggesting a possible role of the plasma facing materials in affecting the density profile location. Experimental results are then compared with stability analysis performed in terms of the peeling-ballooning model and with pedestal predictive model EUROPED (Saarelma et al 2017 Plasma Phys. Control. Fusion). Stability analysis is consistent with the experimental findings, showing an improvement of the pedestal stability, when the relative shift is reduced. This has been ascribed mainly to the increase of the edge bootstrap current, and to minor effects related to the increase of the pedestal pressure gradient and narrowing of the pedestal pressure width. Pedestal predictive model EUROPED shows a qualitative agreement with experiment, especially for low values of the relative shift.
Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen
2010-08-01
An important problem in Intensive Care is how to predict, on a given day of stay, the eventual hospital mortality of a specific patient. A recent approach to this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating its prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences, when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods of prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in Intensive Care. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition, we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples.
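Efron's .632 bootstrap referred to above combines the apparent (resubstitution) error with the out-of-bag bootstrap error. A minimal sketch, assuming a toy majority-class learner and 0/1 loss in place of the frequent-temporal-sequence method:

```python
import numpy as np

def bootstrap_632(X, y, fit, err, n_boot, rng):
    """Efron's .632 estimator: 0.368 * apparent error + 0.632 * out-of-bag error."""
    n = len(y)
    app_err = err(fit(X, y), X, y)              # apparent (resubstitution) error
    oob_errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        oob = np.setdiff1d(np.arange(n), idx)   # cases not drawn into this resample
        if oob.size:
            m = fit(X[idx], y[idx])
            oob_errs.append(err(m, X[oob], y[oob]))
    return 0.368 * app_err + 0.632 * float(np.mean(oob_errs))

# Toy stand-in learner: predict the majority class; 0/1 loss
fit = lambda X, y: int(np.mean(y) >= 0.5)
err = lambda m, X, y: float(np.mean(y != m))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = (rng.random(100) < 0.3).astype(int)
e632 = bootstrap_632(X, y, fit, err, n_boot=100, rng=rng)
```

The key point for evaluating an inductive method, rather than one induced model, is that `fit` is re-run inside every bootstrap replicate, so pattern discovery itself is repeated on each resample.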
Elkomy, Mohammed H; Elmenshawe, Shahira F; Eid, Hussein M; Ali, Ahmed M A
2016-11-01
This work aimed at investigating the potential of solid lipid nanoparticles (SLN) as carriers for topical delivery of Ketoprofen (KP); evaluating a novel technique incorporating Artificial Neural Network (ANN) and clustered bootstrap for optimization of KP-loaded SLN (KP-SLN); and demonstrating a longitudinal dose response (LDR) modeling-based approach to compare the activity of topical non-steroidal anti-inflammatory drug formulations. KP-SLN was fabricated by a modified emulsion/solvent evaporation method. Box-Behnken design was implemented to study the influence of glycerylpalmitostearate-to-KP ratio, Tween 80, and lecithin concentrations on particle size, entrapment efficiency, and amount of drug permeated through rat skin in 24 hours. Following clustered bootstrap ANN optimization, the optimized KP-SLN was incorporated into an aqueous gel and evaluated for rheology, in vitro release, permeability, skin irritation and in vivo activity using carrageenan-induced rat paw edema model and LDR mathematical model to analyze the time course of anti-inflammatory effect at various application durations. Lipid-to-drug ratio of 7.85 [bootstrap 95%CI: 7.63-8.51], Tween 80 of 1.27% [bootstrap 95%CI: 0.601-2.40%], and Lecithin of 0.263% [bootstrap 95%CI: 0.263-0.328%] were predicted to produce optimal characteristics. Compared with profenid® gel, the optimized KP-SLN gel exhibited slower release, faster permeability, better texture properties, greater efficacy, and similar potency. SLNs are safe and effective permeation enhancers. ANN coupled with clustered bootstrap is a useful method for finding optimal solutions and estimating uncertainty associated with them. LDR models allow mechanistic understanding of comparative in vivo performances of different topical formulations, and help design efficient dermatological bioequivalence assessment methods.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects in this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
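The bootstrap-derived cone of uncertainty discussed above can be illustrated with a residual bootstrap of a linearized tensor fit. The gradient scheme, b-value, noise level, and the definition of the cone as the 95th percentile of angular deviations of the principal eigenvector are assumptions of this sketch, not the authors' pipeline.

```python
import numpy as np

def design(g):
    """Rows map the 6 unique tensor elements to -log(signal)/b."""
    gx, gy, gz = g.T
    return np.column_stack([gx*gx, gy*gy, gz*gz, 2*gx*gy, 2*gx*gz, 2*gy*gz])

def principal_dir(d6):
    """Principal eigenvector of the symmetric tensor packed as (xx,yy,zz,xy,xz,yz)."""
    D = np.array([[d6[0], d6[3], d6[4]],
                  [d6[3], d6[1], d6[5]],
                  [d6[4], d6[5], d6[2]]])
    w, v = np.linalg.eigh(D)
    return v[:, np.argmax(w)]

rng = np.random.default_rng(2)
g = rng.normal(size=(30, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)         # 30 gradient directions
b = 1000.0                                            # s/mm^2, assumed protocol
A = design(g)
d_true = np.array([1.7e-3, 0.3e-3, 0.3e-3, 0, 0, 0])  # prolate tensor along x
noisy = np.exp(-b * A @ d_true) + rng.normal(0, 0.02, 30)
y = -np.log(noisy) / b

d_hat = np.linalg.lstsq(A, y, rcond=None)[0]
e1 = principal_dir(d_hat)
resid = y - A @ d_hat

angles = []
for _ in range(300):                                  # residual bootstrap
    y_b = A @ d_hat + rng.choice(resid, resid.size, replace=True)
    e1_b = principal_dir(np.linalg.lstsq(A, y_b, rcond=None)[0])
    angles.append(np.degrees(np.arccos(min(1.0, abs(float(e1 @ e1_b))))))
cone95 = float(np.percentile(angles, 95))             # cone of uncertainty half-angle
```

As the abstract notes, this metric depends on the underlying diffusion profile: raising the noise level or shrinking the anisotropy in the synthetic tensor widens the cone even though the fitting procedure is unchanged.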
Using Cluster Bootstrapping to Analyze Nested Data With a Few Clusters.
Huang, Francis L
2018-04-01
Cluster randomized trials involving participants nested within intact treatment and control groups are commonly performed in various educational, psychological, and biomedical studies. However, recruiting and retaining intact groups presents various practical, financial, and logistical challenges to evaluators, and often cluster randomized trials are performed with a low number of clusters (~20 groups). Although multilevel models are often used to analyze nested data, researchers may be concerned about potentially biased results due to having only a few groups under study. Cluster bootstrapping has been suggested as an alternative procedure for analyzing clustered data, though it has seen very little use in educational and psychological studies. Using a Monte Carlo simulation that varied the number of clusters, average cluster size, and intraclass correlations, we compared standard errors using cluster bootstrapping with those derived using ordinary least squares regression and multilevel models. Results indicate that cluster bootstrapping, though more computationally demanding, can be used as an alternative procedure for the analysis of clustered data when treatment effects at the group level are of primary interest. Supplementary material showing how to perform cluster bootstrapped regressions using R is also provided.
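The cluster bootstrap resamples intact clusters rather than individual observations, so within-cluster dependence is preserved in every replicate. A minimal sketch on synthetic two-level data; the OLS slope as the coefficient of interest and the simulated cluster effect are assumptions of this illustration (the paper's supplementary material uses R).

```python
import numpy as np

def ols_slope(x, y):
    """Slope from a simple intercept-plus-slope least squares fit."""
    return np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), y, rcond=None)[0][1]

def cluster_bootstrap_se(x, y, cluster, n_boot, rng):
    """Resample whole clusters with replacement, refit, and take the SD of slopes."""
    ids = np.unique(cluster)
    slopes = []
    for _ in range(n_boot):
        pick = rng.choice(ids, size=ids.size, replace=True)
        idx = np.concatenate([np.flatnonzero(cluster == c) for c in pick])
        slopes.append(ols_slope(x[idx], y[idx]))
    return float(np.std(slopes, ddof=1))

rng = np.random.default_rng(7)
n_clusters, m = 20, 25                          # few clusters, as in the simulation above
cluster = np.repeat(np.arange(n_clusters), m)
u = rng.normal(0.0, 1.0, n_clusters)[cluster]   # shared cluster random effect
x = rng.normal(size=n_clusters * m)
y = 0.5 * x + u + rng.normal(size=n_clusters * m)
se_cb = cluster_bootstrap_se(x, y, cluster, n_boot=300, rng=rng)
```

For cluster-level predictors, the naive OLS standard error ignores the intraclass correlation and can be far too small; the cluster bootstrap absorbs that dependence automatically.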
Bootstrapping Least Squares Estimates in Biochemical Reaction Networks
Linder, Daniel F.
2015-01-01
The paper proposes new computational methods for computing confidence bounds for the least squares estimates (LSEs) of rate constants in mass-action biochemical reaction network and stochastic epidemic models. Such LSEs are obtained by fitting the set of deterministic ordinary differential equations (ODEs), corresponding to the large-volume limit of a reaction network, to the network's partially observed trajectory, treated as a continuous-time, pure-jump Markov process. In the large-volume limit the LSEs are asymptotically Gaussian, but their limiting covariance structure is complicated, since it is described by a set of nonlinear ODEs that are often ill-conditioned and numerically unstable. The paper considers two bootstrap Monte Carlo procedures, based on the diffusion and linear noise approximations for pure-jump processes, which allow one to avoid solving the limiting covariance ODEs. The results are illustrated with both in-silico and real data examples from the LINE 1 gene retrotranscription model and compared with those obtained using other methods. PMID:25898769
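The fit-then-bootstrap idea can be sketched for the simplest mass-action network, the decay reaction A -> B with a single rate constant fitted to its ODE limit. A plain residual bootstrap and a grid-search LSE stand in for the paper's diffusion and linear-noise procedures; the noise model and grid are assumptions of this illustration.

```python
import numpy as np

def ode_traj(k, a0, t):
    """Large-volume (ODE) limit of the decay reaction A -> B: A(t) = A0 * exp(-k t)."""
    return a0 * np.exp(-k * t)

def lse_k(t, obs, a0, grid):
    """Least squares estimate of the rate constant by grid search."""
    sse = [np.sum((obs - ode_traj(k, a0, t)) ** 2) for k in grid]
    return grid[int(np.argmin(sse))]

rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 40)
k_true, a0 = 0.8, 100.0
grid = np.linspace(0.01, 3.0, 600)
obs = ode_traj(k_true, a0, t) + rng.normal(0, 2.0, t.size)   # noisy trajectory

k_hat = lse_k(t, obs, a0, grid)
fitted = ode_traj(k_hat, a0, t)
resid = obs - fitted
boots = [lse_k(t, fitted + rng.choice(resid, t.size, replace=True), a0, grid)
         for _ in range(300)]
ci = np.percentile(boots, [2.5, 97.5])
```

The appeal of bootstrapping here, as in the paper, is that confidence bounds come from refitting simulated trajectories instead of solving the ill-conditioned limiting covariance ODEs.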
Percolation in education and application in the 21st century
NASA Astrophysics Data System (ADS)
Adler, Joan; Elfenbaum, Shaked; Sharir, Liran
2017-03-01
Percolation, "so simple you could teach it to your wife" (Chuck Newman, last century), is an ideal system for introducing young students to phase transitions. Two recent projects in the Computational Physics group at the Technion make this easy. One is a set of analog models to be mounted on our walls, enabling visitors to switch between samples to see which mixtures of glass and metal objects carry a percolating current. The second is a website enabling the creation of stereo samples of two- and three-dimensional clusters (suited for viewing with Oculus Rift) on desktops, tablets and smartphones. Although there have been many physical applications of regular percolation in the past, for bootstrap percolation, where only sites with sufficiently many occupied neighbours remain active, there has not been a surfeit of condensed matter applications. We have found that the creation of diamond membranes for quantum computers can be modeled with a bootstrap process of graphitization in diamond, enabling prediction of optimal processing procedures.
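Bootstrap percolation as described, where sites with too few occupied neighbours are iteratively vacated, is straightforward to simulate. The lattice size, occupation probability, threshold m, 4-neighbour connectivity and periodic boundaries are assumptions of this sketch.

```python
import numpy as np

def bootstrap_percolation(occ, m):
    """Iteratively vacate occupied sites with fewer than m occupied neighbours
    (4-neighbour square lattice, periodic boundaries) until a fixed point."""
    occ = occ.copy()
    while True:
        o = occ.astype(np.int8)
        nbrs = (np.roll(o, 1, 0) + np.roll(o, -1, 0) +
                np.roll(o, 1, 1) + np.roll(o, -1, 1))
        cull = occ & (nbrs < m)
        if not cull.any():
            return occ
        occ[cull] = False

rng = np.random.default_rng(5)
L, p, m = 100, 0.7, 3
occ0 = rng.random((L, L)) < p        # initial random occupation with probability p
occ = bootstrap_percolation(occ0, m)
frac = float(occ.mean())             # surviving (active) fraction
```

Sweeping p and recording the surviving fraction exhibits the sharp transition that makes the model a natural classroom introduction to phase transitions.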
NASA Astrophysics Data System (ADS)
Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.
2010-02-01
Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.
NASA Astrophysics Data System (ADS)
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant to the application execution. However, providing computing power as a service poses different research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.
Phylogenetic relationships among arecoid palms (Arecaceae: Arecoideae)
Baker, William J.; Norup, Maria V.; Clarkson, James J.; Couvreur, Thomas L. P.; Dowe, John L.; Lewis, Carl E.; Pintaud, Jean-Christophe; Savolainen, Vincent; Wilmot, Tomas; Chase, Mark W.
2011-01-01
Background and Aims The Arecoideae is the largest and most diverse of the five subfamilies of palms (Arecaceae/Palmae), containing >50 % of the species in the family. Despite its importance, phylogenetic relationships among Arecoideae are poorly understood. Here the most densely sampled phylogenetic analysis of Arecoideae available to date is presented. The results are used to test the current classification of the subfamily and to identify priority areas for future research. Methods DNA sequence data for the low-copy nuclear genes PRK and RPB2 were collected from 190 palm species, covering 103 (96 %) genera of Arecoideae. The data were analysed using the parsimony ratchet, maximum likelihood, and both likelihood and parsimony bootstrapping. Key Results and Conclusions Despite the recovery of paralogues and pseudogenes in a small number of taxa, PRK and RPB2 were both highly informative, producing well-resolved phylogenetic trees with many nodes well supported by bootstrap analyses. Simultaneous analyses of the combined data sets provided additional resolution and support. Two areas of incongruence between PRK and RPB2 were strongly supported by the bootstrap relating to the placement of tribes Chamaedoreeae, Iriarteeae and Reinhardtieae; the causes of this incongruence remain uncertain. The current classification within Arecoideae was strongly supported by the present data. Of the 14 tribes and 14 sub-tribes in the classification, only five sub-tribes from tribe Areceae (Basseliniinae, Linospadicinae, Oncospermatinae, Rhopalostylidinae and Verschaffeltiinae) failed to receive support. Three major higher level clades were strongly supported: (1) the RRC clade (Roystoneeae, Reinhardtieae and Cocoseae), (2) the POS clade (Podococceae, Oranieae and Sclerospermeae) and (3) the core arecoid clade (Areceae, Euterpeae, Geonomateae, Leopoldinieae, Manicarieae and Pelagodoxeae). 
However, new data sources are required to elucidate ambiguities that remain in phylogenetic relationships among and within the major groups of Arecoideae, as well as within the Areceae, the largest tribe in the palm family. PMID:21325340
Prenatal Drug Exposure and Adolescent Cortisol Reactivity: Association with Behavioral Concerns.
Buckingham-Howes, Stacy; Mazza, Dayna; Wang, Yan; Granger, Douglas A; Black, Maureen M
2016-09-01
To examine stress reactivity in a sample of adolescents with prenatal drug exposure (PDE) by examining the consequences of PDE on stress-related adrenocortical reactivity, behavioral problems, and drug experimentation during adolescence. Participants (76 PDE, 61 non-drug exposed [NE]; 99% African-American; 50% male; mean age = 14.17 yr, SD = 1.17) provided a urine sample, completed a drug use questionnaire, and provided saliva samples (later assayed for cortisol) before and after a mild laboratory stress task. Caregivers completed the Behavior Assessment System for Children, Second Edition (BASC II) and reported their relationship to the adolescent. The NE group was more likely to exhibit task-related cortisol reactivity compared to the PDE group. Overall behavior problems and drug experimentation were comparable across groups with no differences between PDE and NE groups. In unadjusted mediation analyses, cortisol reactivity mediated the association between PDE and BASC II aggression scores (95% bootstrap confidence interval [CI], 0.04-4.28), externalizing problems scores (95% bootstrap CI, 0.03-4.50), and drug experimentation (95% bootstrap CI, 0.001-0.54). The associations remain with the inclusion of gender as a covariate but not when age is included. Findings support and expand current research in cortisol reactivity and PDE by demonstrating that cortisol reactivity attenuates the association between PDE and behavioral problems (aggression) and drug experimentation. If replicated, PDE may have long-lasting effects on stress-sensitive physiological mechanisms associated with behavioral problems (aggression) and drug experimentation in adolescence.
NASA Astrophysics Data System (ADS)
Isayama, A.
2005-05-01
Recent results from steady-state sustainment of high-β plasma experiments in the Japan Atomic Energy Research Institute Tokamak-60 Upgrade (JT-60U) tokamak [A. Kitsunezaki et al., Fusion Sci. Technol. 42, 179 (2002)] are described. Extension of the discharge duration to 65 s (formerly 15 s) has enabled physics research on long time scales. In long-duration high-β research, the normalized beta βN=2.5, which is comparable to that in the steady-state operation of the International Thermonuclear Experimental Reactor (ITER) [R. Aymar, P. Barabaschi, and Y. Shimomura, Plasma Phys. Controlled Fusion 44, 519 (2002)], has been sustained for about 15 s with a confinement enhancement factor H89PL above 2, where the duration is about 80 times the energy confinement time and ~10 times the current diffusion time (τR). In the scenario aiming at longer duration with βN~1.9, which is comparable to that in the ITER standard operation scenario, the duration has been extended to 24 s (~15τR). Also, in terms of collisionality and Larmor radius, these results are obtained in an ITER-relevant regime, with values only a few times larger than those of ITER. No serious effect of current diffusion on instabilities is observed in the region of βN≲2.5; in fact, neoclassical tearing modes (NTMs), which limit the achievable β in the stationary high-βp H-mode discharges, are suppressed throughout the discharge. In high-β research with durations of several times τR, a high-β plasma with βN~2.9-3 has been sustained for 5-6 s with two scenarios for NTM suppression: (a) NTM avoidance by modification of pressure and current profiles, and (b) NTM stabilization with electron cyclotron current drive (ECCD)/electron cyclotron heating (ECH). NTM stabilization with the second harmonic X-mode ECCD/ECH has been performed, and it is found that an EC current density comparable to the bootstrap current density at the mode location is required for complete stabilization.
The structure of a magnetic island associated with an m/n = 3/2 NTM has been measured in detail (m and n are the poloidal and toroidal mode numbers, respectively). By applying a newly developed analysis method using the motional Stark effect (MSE) diagnostic, in which the change in current density is evaluated directly from the change in MSE pitch angle without equilibrium reconstruction, a localized decrease/increase in current density at the mode rational surface is observed for NTM growth/suppression. In addition, it is found that the characteristic structure of the electron temperature perturbation profile is deformed during NTM stabilization. The hypothesis that the temperature increases inside the magnetic island explains the experimental observations well. It is also found that the characteristic structure is not formed when ECCD/ECH is applied before the mode onset, while it is seen when ECCD/ECH is applied just after the mode onset, suggesting a stronger stabilization effect of early EC wave injection.
A hybrid model for computing nonthermal ion distributions in a long mean-free-path plasma
NASA Astrophysics Data System (ADS)
Tang, Xianzhu; McDevitt, Chris; Guo, Zehua; Berk, Herb
2014-10-01
Non-thermal ions, especially suprathermal ones, are known to make a dominant contribution to a number of important physics quantities, such as the fusion reactivity in controlled fusion, the ion heat flux, and, in the case of a tokamak, the ion bootstrap current. Evaluating the deviation of these non-thermal ions from a local Maxwellian distribution can be a challenging task in the context of a global plasma fluid model that evolves the plasma density, flow, and temperature. Here we describe a hybrid model for coupling such a constrained kinetic calculation to global plasma fluid models. The key ingredient is a non-perturbative treatment of the tail ions, where the ion Knudsen number approaches or surpasses order unity. This can be sharply contrasted with the standard Chapman-Enskog approach, which relies on a perturbative treatment that is frequently invalidated. The accuracy of our coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space. Although our specific application examples will be drawn from laboratory controlled fusion experiments, the general approach is applicable to space and astrophysical plasmas as well. Work supported by DOE.
NASA Astrophysics Data System (ADS)
LeBlanc, B.; Batha, S.; Bell, R.; Bernabei, S.; Blush, L.; de la Luna, E.; Doerner, R.; Dunlap, J.; England, A.; Garcia, I.; Ignat, D.; Isler, R.; Jones, S.; Kaita, R.; Kaye, S.; Kugel, H.; Levinton, F.; Luckhardt, S.; Mutoh, T.; Okabayashi, M.; Ono, M.; Paoletti, F.; Paul, S.; Petravich, G.; Post-Zwicker, A.; Sauthoff, N.; Schmitz, L.; Sesnic, S.; Takahashi, H.; Talvard, M.; Tighe, W.; Tynan, G.; von Goeler, S.; Woskov, P.; Zolfaghari, A.
1995-03-01
Application of Ion Bernstein Wave Heating (IBWH) into the Princeton Beta Experiment-Modification (PBX-M) [Phys. Fluids B 2, 1271 (1990)] tokamak stabilizes sawtooth oscillations and generates peaked density profiles. A transport barrier, spatially correlated with the IBWH power deposition profile, is observed in the core of IBWH-assisted neutral beam injection (NBI) discharges. A precursor to the fully developed barrier is seen in the soft x-ray data during edge localized mode (ELM) activity. Sustained IBWH operation is conducive to a regime where the barrier supports large ∇ne, ∇Te, ∇νφ, and ∇Ti, delimiting the confinement zone. This regime is reminiscent of the H(high) mode, but with a confinement zone moved inward. The core region has better than H-mode confinement while the peripheral region is L(low)-mode-like. The peaked profile enhances NBI core deposition and increases nuclear reactivity. An increase in central Ti results from χi reduction (compared to the H mode) and better beam penetration. Bootstrap current fractions of up to 0.32-0.35 locally and 0.28 overall were obtained when an additional NBI burst is applied to this plasma.
Bootstrap investigation of the stability of a Cox regression model.
Altman, D G; Andersen, P K
1989-07-01
We describe a bootstrap investigation of the stability of a Cox proportional hazards regression model resulting from the analysis of a clinical trial of azathioprine versus placebo in patients with primary biliary cirrhosis. We have considered stability to refer both to the choice of variables included in the model and, more importantly, to the predictive ability of the model. In stepwise Cox regression analyses of 100 bootstrap samples using 17 candidate variables, the most frequently selected variables were those selected in the original analysis, and no other important variable was identified. Thus there was no reason to doubt the model obtained in the original analysis. For each patient in the trial, bootstrap confidence intervals were constructed for the estimated probability of surviving two years. It is shown graphically that these intervals are markedly wider than those obtained from the original model.
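The bootstrap confidence intervals for two-year survival probabilities described above can be sketched with a Kaplan-Meier estimator standing in for the fitted Cox model; the exponential event and censoring times are assumptions of this illustration.

```python
import numpy as np

def km_surv(time, event, t_star):
    """Kaplan-Meier estimate of S(t_star); event=True marks an observed death."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    s, n = 1.0, len(time)
    for i, (t, d) in enumerate(zip(time, event)):
        if t > t_star:
            break
        if d:
            s *= 1.0 - 1.0 / (n - i)   # n - i subjects still at risk
    return s

rng = np.random.default_rng(11)
n = 200
true_t = rng.exponential(3.0, n)       # latent event times
cens_t = rng.exponential(6.0, n)       # latent censoring times
time = np.minimum(true_t, cens_t)
event = true_t <= cens_t

s2 = km_surv(time, event, 2.0)         # estimated probability of surviving two years
boots = [km_surv(time[idx], event[idx], 2.0)
         for idx in (rng.integers(0, n, n) for _ in range(500))]
ci = np.percentile(boots, [2.5, 97.5])
```

Resampling patients with replacement and re-estimating survival each time reproduces the paper's key observation in miniature: the bootstrap interval reflects model-selection and estimation variability that the interval from a single fitted model understates.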
NASA Astrophysics Data System (ADS)
Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.
2018-04-01
This article describes a solution to the problem of forming a productive set of informative measured features of an object under observation and/or control, using the authors' algorithms based on bootstrap and counter-bootstrap technologies for processing measurement results for various states of the object, with training samples of different sizes. The paper considers aggregation of specific indicators of informative capacity by linear, majority, logical and "greedy" methods, applied both individually and in combination. The results of a computational experiment are discussed, and the conclusion is drawn that the proposed methods increase the efficiency of classifying the states of the object from measurement results.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still in expansion. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing a good performance. The algorithm Boot.EXPOS, combining exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For the case of more than one seasonal pattern, the double seasonal Holt-Winters methods and the corresponding exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and to carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of such a partnership is illustrated for some well-known data sets available in software.
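A residual-resampling sketch in the spirit of the procedure described above, reduced to simple exponential smoothing on an invented series (the published Boot.EXPOS procedure is more elaborate and handles seasonality):

```python
import random
import statistics

def ses_fit(y, alpha=0.3):
    """Simple exponential smoothing: return the final level and one-step residuals."""
    level = y[0]
    residuals = []
    for obs in y[1:]:
        residuals.append(obs - level)        # one-step-ahead forecast error
        level = alpha * obs + (1 - alpha) * level
    return level, residuals

def bootstrap_forecast(y, n_boot=1000, seed=7):
    """One-step-ahead forecast distribution by resampling the fitted residuals."""
    level, resid = ses_fit(y)
    rng = random.Random(seed)
    return [level + rng.choice(resid) for _ in range(n_boot)]

series = [10.2, 11.1, 9.8, 10.5, 11.4, 10.9, 10.1, 11.0]  # invented data
forecasts = bootstrap_forecast(series)
point = statistics.mean(forecasts)           # bootstrap point forecast
```

Percentiles of `forecasts` then give prediction intervals without a Gaussian assumption on the errors.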
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could thus be approximated using set sizes significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
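The set-size experiment can be illustrated with a minimal sketch; synthetic Gaussian "sensitivities" stand in for the HFA data, and `resample_limits` is an illustrative name, not the study's code:

```python
import random
import statistics

def resample_limits(values, set_size, n_resamples=500, seed=3):
    """Average 5th/95th percentile estimates over bootstrap samples of a given set size."""
    rng = random.Random(seed)
    p5s, p95s = [], []
    for _ in range(n_resamples):
        sub = sorted(rng.choice(values) for _ in range(set_size))
        p5s.append(sub[int(0.05 * set_size)])
        p95s.append(sub[int(0.95 * set_size) - 1])
    return statistics.mean(p5s), statistics.mean(p95s)

# Synthetic "sensitivities" (dB) standing in for the visual field cohort
gen = random.Random(0)
pop = [gen.gauss(30, 2) for _ in range(500)]
limits_small = resample_limits(pop, set_size=30)
limits_large = resample_limits(pop, set_size=150)
```

Comparing the limits across increasing set sizes against those of the full cohort mirrors the study's criterion for "no longer different".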
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations arising from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
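The key idea of adding an imputation step inside each bootstrap resample can be sketched as follows; uniform-below-detection-limit imputation is a deliberately crude stand-in for the model-based, log-ratio methods the paper proposes, and all values are invented:

```python
import random

def impute_nondetects(sample, dl, rng):
    """Replace nondetects (None) with a uniform draw below the detection limit."""
    return [v if v is not None else rng.uniform(0.0, dl) for v in sample]

def bootstrap_mean_with_nondetects(values, dl, n_boot=1000, seed=12):
    """Bootstrap the mean, re-imputing nondetects inside every resample so that
    their uncertainty propagates into the estimate."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(values) for _ in range(len(values))]
        resample = impute_nondetects(resample, dl, rng)
        means.append(sum(resample) / len(resample))
    return sum(means) / n_boot

# Invented concentrations; None marks a value below the detection limit of 0.5
conc = [1.2, 0.8, None, 2.1, None, 1.5, 0.9, 1.7]
est = bootstrap_mean_with_nondetects(conc, dl=0.5)
```

Because the imputation is redrawn in every resample, its randomness widens the bootstrap distribution rather than being hidden by a single fixed fill-in.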
Kessel, C. E.; Poli, F. M.; Ghantous, K.; ...
2015-01-01
Here, the advanced physics and advanced technology tokamak power plant ARIES-ACT1 has a major radius of 6.25 m at an aspect ratio of 4.0, toroidal field of 6.0 T, strong shaping with elongation of 2.2, and triangularity of 0.63. The broadest pressure cases reached wall-stabilized β N ~ 5.75, limited by the n = 3 external kink mode, requiring a conducting shell at b/a = 0.3 together with plasma rotation, feedback, and/or kinetic stabilization. The medium pressure peaking case reaches β N = 5.28 with B T = 6.75, while the peaked pressure case reaches β N < 5.15. Fast particle magnetohydrodynamic stability shows that the alpha particles are unstable, but this leads to redistribution to larger minor radius rather than loss from the plasma. Edge and divertor plasma modeling shows that 75% of the power to the divertor can be radiated with an ITER-like divertor geometry, while >95% can be radiated in a stable detached mode with an orthogonal target and wide slot geometry. The bootstrap current fraction is 91% with a q95 of 4.5, requiring ~1.1 MA of external current drive. This current is supplied with 5 MW of ion cyclotron radio frequency/fast wave and 40 MW of lower hybrid current drive. Electron cyclotron is most effective for safety factor control over ρ ~ 0.2 to 0.6 with 20 MW. The pedestal density is ~0.9×10²⁰ m⁻³, and the temperature is ~4.4 keV. The H98 factor is 1.65, n/n Gr = 1.0, and the ratio of net power to threshold power is 2.8 to 3.0 in the flattop.
Recent Heating and Current Drive results on JET
NASA Astrophysics Data System (ADS)
Tuccillo, A. A.; Baranov, Y.; Barbato, E.; Bibet, Ph.; Castaldo, C.; Cesario, R.; Cocilovo, V.; Crisanti, F.; De Angelis, R.; Ekedahl, A. C.; Figueiredo, A.; Graham, M.; Granucci, G.; Hartmann, D.; Heikkinen, J.; Hellsten, T.; Imbeaux, F.; Jones, T. T. H.; Johnson, T.; Kirov, K. V.; Lamalle, P.; Laxaback, M.; Leuterer, F.; Litaudon, X.; Maget, P.; Mailloux, J.; Mantsinen, M. J.; Mayoral, M. L.; Meo, F.; Monakhov, I.; Nguyen, F.; Noterdaeme, J.-M.; Pericoli-Ridolfini, V.; Podda, S.; Panaccione, L.; Righi, E.; Rimini, F.; Sarazin, Y.; Sibley, A.; Staebler, A.; Tala, T.; Van Eester, D.
2001-10-01
An overview is presented of the results obtained on JET by the Heating and Current Drive Task Force (TF-H) in the period May 2000—March 2001. A strongly improved Lower Hybrid (LH) coupling was achieved by optimizing the plasma shape and by controlling the local edge density via the injection of CD4. Up to 4 MW have been coupled in type III ELMy H-mode and/or on Internal Transport Barrier (ITB) plasmas with reflection coefficients as low as 4%. Long lasting quasi steady-state ITBs have been obtained by adding the LH current to the bootstrap and beam driven components. Furthermore, the use of LH in the pre-heat phase results in electron temperatures in excess of 10 keV, deep negative magnetic shear and a strongly reduced power threshold for ITB formation. Preliminary results on ICRF coupling are reported, including the effect of CD4 injection and the commissioning of the wide band matching system on ELMy plasmas. IC CD scenarios have been studied in H and 3He minority and used to modify the stability of the sawtooth to influence the formation of seed islands for the appearance of NTMs. Up to 3 MW of IC power was coupled in the high magnetic field fast wave CD scenario. Preliminary MSE measurements indicate differences in the current profiles between -90° and +90° phasing. Careful measurements of the toroidal rotation in plasmas heated by ICRF only show some dependence on the position of the resonance layer. Finally, the use of ICRF minority heating under real-time control, in response to measured plasma parameters to simulate the effect of alpha particles, is presented. ICRF heating results in ITER non-activated scenarios are reported in a companion paper.
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
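A minimal nonparametric bootstrap comparison of two hypothetical DBP samples, in the spirit described above; the concentrations are invented for illustration, and the study's actual models are considerably richer:

```python
import random

def bootstrap_diff_ci(a, b, n_boot=2000, alpha=0.05, seed=11):
    """Percentile bootstrap CI for the difference of mean concentrations."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in range(len(a))]
        rb = [rng.choice(b) for _ in range(len(b))]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    return diffs[int(alpha / 2 * n_boot)], diffs[int((1 - alpha / 2) * n_boot) - 1]

# Invented DBP concentrations (ug/L) at two treatment plants
plant_a = [21.0, 24.5, 19.8, 23.1, 22.6, 20.9]
plant_b = [22.3, 23.8, 21.5, 24.0, 22.9, 21.1]
lo, hi = bootstrap_diff_ci(plant_a, plant_b)
similar = lo <= 0.0 <= hi   # zero inside the CI: no evidence of dissimilarity
```

No distributional form is assumed; the empirical resampling substitutes for parametric normal-theory inference, as the abstract emphasizes.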
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement model, and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of the 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable, because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, the statistics do not share the same critical range for item and person misfit. Based on these results, the bootstrapped CIs offer a reasonable alternative for identifying misfitting items or persons, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size.
NASA Astrophysics Data System (ADS)
Poli, Francesca M.; Kessel, Charles E.
2013-05-01
Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved confinement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current profiles and negative magnetic shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. The time-dependent transport simulations indicate that, with a trade-off of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating configuration.
1993-09-10
1993). A bootstrap generalized likelihood ratio test in discriminant analysis, Proc. 15th Annual Seismic Research Symposium, in press. Hedlin, M., J... ratio indicate that the event does not belong to the first class. The bootstrap technique is used here as well to set the critical value of the test ...Methodist University. Baek, J., H. L. Gray, W. A. Woodward and M. D. Fisk (1993). A Bootstrap Generalized Likelihood Ratio Test in Discriminant
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on mathematical statistics using an existing small sample of data, the aim being to transform the small sample into a large one. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U, k = 2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method thus allows statistical analysis of the MU, and combining the Nordtest method with bootstrap resampling is a suitable alternative for estimating the MU.
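A sketch of the combined approach, assuming the Nordtest-style combination U = k·sqrt(u_Rw² + u_bias²) with a bootstrapped within-laboratory CV; the IQC values and the bias component below are hypothetical, not the study's data:

```python
import random
import statistics

def bootstrap_cv(values, n_boot=1000, seed=5):
    """Bootstrap estimate of the coefficient of variation (%) of IQC results."""
    rng = random.Random(seed)
    cvs = []
    for _ in range(n_boot):
        s = [rng.choice(values) for _ in range(len(values))]
        cvs.append(100.0 * statistics.pstdev(s) / statistics.mean(s))
    return statistics.mean(cvs)

def expanded_uncertainty(u_rw, u_bias, k=2):
    """Nordtest-style combination: U = k * sqrt(u_Rw^2 + u_bias^2)."""
    return k * (u_rw ** 2 + u_bias ** 2) ** 0.5

# Invented WBC IQC results (10^9/L); the EQA bias component is also invented
iqc = [6.1, 6.3, 5.9, 6.2, 6.0, 6.4, 6.1, 6.2, 6.0, 6.3]
u_rw = bootstrap_cv(iqc)
U = expanded_uncertainty(u_rw, u_bias=1.5)
```

The bootstrap step substitutes an empirical distribution for the small IQC sample, which is the transformation the abstract describes.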
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
1995-01-01
Use of the bootstrap method in a canonical correlation analysis to evaluate the replicability of a study's results is illustrated. More confidence may be vested in research results that replicate. (SLD)
The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements
NASA Technical Reports Server (NTRS)
Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry;
2013-01-01
The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.
The economics of bootstrapping space industries - Development of an analytic computer model
NASA Technical Reports Server (NTRS)
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of off-Earth transport and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Towards a bootstrap approach to higher orders of epsilon expansion
NASA Astrophysics Data System (ADS)
Dey, Parijat; Kaviraj, Apratim
2018-02-01
We employ a hybrid approach in determining the anomalous dimension and OPE coefficient of higher spin operators in the Wilson-Fisher theory. First we do a large spin analysis for CFT data, where we use results obtained from the usual and the Mellin bootstrap and also from the Feynman diagram literature. This gives new predictions at O(ɛ⁴) and O(ɛ⁵) for anomalous dimensions and OPE coefficients, and also provides a cross-check for the results from the Mellin bootstrap. These higher orders get contributions from all higher spin operators in the crossed channel. We also use the bootstrap in Mellin space method for ϕ³ theory in d = 6 − ɛ CFT, where we calculate general higher spin OPE data. We demonstrate a higher loop order calculation in this approach by summing over contributions from higher spin operators of the crossed channel in the same spirit as before.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.
Falk, Carl F; Biesanz, Jeremy C
2011-11-30
Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
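The percentile (PC) bootstrap for an indirect effect can be sketched with observed variables; this is a simplification (the study used latent variables, and this sketch does not partial x out of the y-on-m regression), and all data are simulated for illustration:

```python
import random

def slope(x, y):
    """OLS slope of y regressed on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def indirect_ci(x, m, y, n_boot=2000, alpha=0.05, seed=2):
    """Percentile bootstrap CI for the indirect effect a*b.

    Simplification: b is the simple slope of y on m (x not partialled out),
    so this is an observed-variable sketch, not the latent-variable models above."""
    rng = random.Random(seed)
    rows = list(zip(x, m, y))
    effects = []
    for _ in range(n_boot):
        sample = [rng.choice(rows) for _ in rows]
        xs, ms, ys = zip(*sample)
        effects.append(slope(xs, ms) * slope(ms, ys))
    effects.sort()
    return effects[int(alpha / 2 * n_boot)], effects[int((1 - alpha / 2) * n_boot) - 1]

gen = random.Random(0)
x = [gen.gauss(0, 1) for _ in range(100)]
m = [0.5 * xi + gen.gauss(0, 1) for xi in x]
y = [0.5 * mi + gen.gauss(0, 1) for mi in m]
lo, hi = indirect_ci(x, m, y)
```

The interval is read off the resampled a·b distribution directly, which is why the PC bootstrap needs no normality assumption on the product term.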
Combining test statistics and models in bootstrapped model rejection: it is a balancing act
2014-01-01
Background Model rejections lie at the heart of systems biology, since they provide conclusive statements: that the corresponding mechanistic assumptions do not serve as valid explanations for the experimental data. Rejections are usually done using e.g. the chi-square test (χ²) or the Durbin-Watson test (DW). Analytical formulas for the corresponding distributions rely on assumptions that typically are not fulfilled. This problem is partly alleviated by the usage of bootstrapping, a computationally heavy approach to calculate an empirical distribution. Bootstrapping also allows for a natural extension to estimation of joint distributions, but this feature has so far been little exploited. Results We herein show that simplistic combinations of bootstrapped tests, like the max or min of the individual p-values, give inconsistent, i.e. overly conservative or liberal, results. A new two-dimensional (2D) approach based on parametric bootstrapping, on the other hand, is found both consistent and with a higher power than the individual tests, when tested on static and dynamic examples where the truth is known. In the same examples, the most superior test is a 2D χ² vs χ², where the second χ²-value comes from an additional help model, and its ability to describe bootstraps from the tested model. This superiority is lost if the help model is too simple, or too flexible. If a useful help model is found, the most powerful approach is the bootstrapped log-likelihood ratio (LHR). We show that this is because the LHR is one-dimensional, because the second dimension comes at a cost, and because the LHR has retained most of the crucial information in the 2D distribution. These approaches statistically resolve a previously published rejection example for the first time. Conclusions We have shown how to, and how not to, combine tests in a bootstrap setting, when the combination is advantageous, and when it is advantageous to include a second model.
These results also provide a deeper insight into the original motivation for formulating the LHR, for the more general setting of nonlinear and non-nested models. These insights are valuable in cases when accuracy and power, rather than computational speed, are prioritized. PMID:24742065
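The parametric bootstrapping that underlies these empirical test distributions can be sketched for a toy χ²-type statistic; the Gaussian "model" and the data are invented, and a real application would simulate from the fitted systems-biology model instead:

```python
import random

def cost(data, mu, sigma):
    """Toy chi-square-type statistic: sum of squared standardized residuals."""
    return sum(((d - mu) / sigma) ** 2 for d in data)

def parametric_bootstrap_p(observed, mu, sigma, n, n_boot=2000, seed=4):
    """Empirical p-value: simulate data from the fitted model to build the null
    distribution of the statistic, then count exceedances."""
    rng = random.Random(seed)
    null = [cost([rng.gauss(mu, sigma) for _ in range(n)], mu, sigma)
            for _ in range(n_boot)]
    return sum(s >= observed for s in null) / n_boot

gen = random.Random(9)
data = [gen.gauss(0.0, 1.0) for _ in range(20)]   # invented "measurements"
p = parametric_bootstrap_p(cost(data, 0.0, 1.0), 0.0, 1.0, len(data))
```

Collecting a second statistic per simulated data set (e.g. a help-model fit) turns the same loop into the joint 2D distribution the paper combines.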
Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors
NASA Astrophysics Data System (ADS)
Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.
2012-12-01
Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for this estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem owing to its near-intractability in theory. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving-block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
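A pairwise moving-block bootstrap replicate keeps (x, y) pairs together, preserving the cross-correlation while blocks preserve short-range autocorrelation; a minimal sketch with invented series (timescale simulation is omitted):

```python
import random

def moving_block_bootstrap(series, block_len, seed=6):
    """One moving-block bootstrap replicate: concatenate random overlapping blocks."""
    rng = random.Random(seed)
    n = len(series)
    starts = list(range(n - block_len + 1))
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

# Pairwise MBB for correlation: resample (x, y) pairs in blocks, not separately
x = [0.1, 0.4, 0.3, 0.7, 0.6, 0.9, 0.8, 1.1, 1.0, 1.3]
y = [0.2, 0.3, 0.5, 0.6, 0.8, 0.7, 1.0, 1.0, 1.2, 1.2]
pairs = list(zip(x, y))
replicate = moving_block_bootstrap(pairs, block_len=3)
```

Recomputing the correlation coefficient over many such replicates yields a confidence interval that is robust to serial dependence within each series.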
A Critical Meta-Analysis of Lens Model Studies in Human Judgment and Decision-Making
Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W.
2013-01-01
Achieving accurate judgment (‘judgmental achievement’) is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regards to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or the level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regards to the success of bootstrapping with psychometrically-corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias. 
Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to a) an increase in the values of the lens model components, b) reduced heterogeneity between studies, and c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for showing the success of bootstrapping. PMID:24391781
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
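Constructing a pseudo-predator signature as a diet-weighted mean of bootstrapped prey signatures can be sketched as follows; three fatty acids and two prey types, with all values invented, and `n_sample` standing for the bootstrap sample size whose objective choice the paper addresses:

```python
import random

def pseudo_predator(prey_sigs, diet, n_sample, seed=8):
    """Pseudo-predator FA signature: diet-weighted mean of bootstrapped prey signatures.

    prey_sigs: {prey_type: list of signatures, each a list of FA proportions}
    diet: {prey_type: diet proportion}, proportions summing to 1
    n_sample: bootstrap sample size per prey type (the tuning knob discussed above)
    """
    rng = random.Random(seed)
    n_fa = len(next(iter(prey_sigs.values()))[0])
    mix = [0.0] * n_fa
    for prey, pi in diet.items():
        boot = [rng.choice(prey_sigs[prey]) for _ in range(n_sample)]
        means = [sum(sig[i] for sig in boot) / n_sample for i in range(n_fa)]
        mix = [m + pi * mu for m, mu in zip(mix, means)]
    return mix

# Invented prey signatures (each row sums to 1)
sigs = {
    "seal": [[0.60, 0.30, 0.10], [0.55, 0.35, 0.10], [0.62, 0.28, 0.10]],
    "fish": [[0.20, 0.50, 0.30], [0.25, 0.45, 0.30], [0.22, 0.48, 0.30]],
}
pp = pseudo_predator(sigs, {"seal": 0.7, "fish": 0.3}, n_sample=20)
```

Smaller `n_sample` leaves more sampling noise in the pseudo-predator, which is exactly the realism property the paper's algorithm tunes objectively rather than by convention.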
Migration of the ATLAS Metadata Interface (AMI) to Web 2.0 and cloud
NASA Astrophysics Data System (ADS)
Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.
2015-12-01
The ATLAS Metadata Interface (AMI), a mature application in existence for more than 10 years, is currently being adapted to recently available technologies. The web interfaces, which previously manipulated XML documents using XSL transformations, are being migrated to Asynchronous JavaScript and XML (AJAX). Web development is considerably simplified by the introduction of a framework based on jQuery and Twitter Bootstrap. Finally, the AMI services are being migrated to an OpenStack cloud infrastructure.
REVIEW ARTICLE: Major results from the stellarator Wendelstein 7-AS
NASA Astrophysics Data System (ADS)
Hirsch, M.; Baldzuhn, J.; Beidler, C.; Brakel, R.; Burhenn, R.; Dinklage, A.; Ehmler, H.; Endler, M.; Erckmann, V.; Feng, Y.; Geiger, J.; Giannone, L.; Grieger, G.; Grigull, P.; Hartfuß, H.-J.; Hartmann, D.; Jaenicke, R.; König, R.; Laqua, H. P.; Maaßberg, H.; McCormick, K.; Sardei, F.; Speth, E.; Stroth, U.; Wagner, F.; Weller, A.; Werner, A.; Wobig, H.; Zoletnik, S.; W7-AS Team
2008-05-01
Wendelstein 7-AS was the first modular stellarator device to test some basic elements of stellarator optimization: a reduced Shafranov shift and improved stability properties resulted in β values up to 3.4% (at 0.9 T). This operational limit was determined by power balance and impurity radiation, without noticeable degradation of stability or a violent collapse. The partial reduction of neoclassical transport was verified in agreement with calculations, indicating the feasibility of the concept of drift optimization. A full neoclassical optimization, in particular a minimization of the bootstrap current, was beyond the scope of this project. A variety of non-ohmic heating and current drive scenarios using ICRH, NBI and, in particular, ECRH were tested and compared successfully with theoretical predictions. In addition, new heating schemes for overdense plasmas were developed, such as RF mode-conversion heating (Ordinary mode, Extraordinary mode, Bernstein-wave (OXB) heating) or 2nd harmonic O-mode (O2) heating. The energy confinement was about a factor of 2 above ISS95, without degradation near operational boundaries. A number of improved confinement regimes were obtained, such as core electron-root confinement with central Te ≤ 7 keV and regimes with a strongly sheared radial electric field at the plasma edge resulting in Ti ≤ 1.7 keV. W7-AS was the first non-tokamak device to achieve the H-mode, and it further developed a high-density H-mode regime (HDH) with strongly reduced impurity confinement that allowed quasi-steady-state operation (τ ≈ 65·τE) at densities n̄e ≅ 4 × 10^20 m^-3 (at 2.5 T). The first island divertor was tested successfully and operated with stable partial detachment in agreement with numerical simulations. With these results, W7-AS laid the physics groundwork for operation of an optimized low-shear steady-state stellarator.
QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAO, LL; SNYDER, PB; LEONARD, AW
2002-07-01
Two of the major issues crucial for the design of the next-generation tokamak burning plasma devices are the predictability of the edge pedestal height and control of the divertor heat load in H-mode configurations. Both of these are strongly impacted by edge localized modes (ELMs) and their size. A working model for ELMs is that they are intermediate toroidal mode number (n ≈ 5-30) peeling-ballooning modes driven by the large edge pedestal pressure gradient P′ and the associated large edge bootstrap current density J_BS. The interplay between P′ and J_BS as a discharge evolves can excite peeling-ballooning modes over a wide spectrum of n. The pedestal current density plays a dual role, stabilizing the high-n ballooning modes by opening access to second stability while providing free energy to drive the intermediate-n peeling modes. This makes a systematic evaluation of this model particularly challenging. This paper describes recent quantitative tests of this model using experimental data from the DIII-D and JT-60U tokamaks. These tests are made possible by recent improvements to the ELITE MHD stability code, which allow an efficient evaluation of the unstable peeling-ballooning modes, as well as by improvements to other diagnostic and analysis techniques. Some of the key testable features of this model are: (1) ELMs are triggered when the growth rates of intermediate-n MHD modes become significantly large; (2) ELM sizes are related to the radial widths of the unstable modes; (3) the unstable modes have a strong ballooning character localized in the outboard bad-curvature region; (4) at high collisionality, ELM size generally becomes smaller because J_BS is reduced.
Finite Beta Boundary Magnetic Fields of NCSX
NASA Astrophysics Data System (ADS)
Grossman, A.; Kaiser, T.; Mioduszewski, P.
2004-11-01
The magnetic field between the plasma surface and the wall of the National Compact Stellarator Experiment (NCSX), which uses quasi-symmetry to combine the best features of the tokamak and the stellarator in a low-aspect-ratio configuration, is mapped via field-line tracing over a range of finite beta in which part of the rotational transform is generated by the bootstrap current. We adopt the methodology developed for W7-X, in which an equilibrium solution is computed by an inverse equilibrium solver based on an energy-minimizing variational moments code, VMEC2000 [1], which solves directly for the shape of the flux surfaces given the external coils and their currents, as well as a bootstrap current provided by a separate transport calculation. The VMEC solution and the Biot-Savart vacuum fields are coupled to the magnetic field solver for finite-beta equilibria (MFBE2001) [2] to determine the magnetic field on a 3D grid over a computational domain. It is found that the edge plasma is more stellarator-like, with a complex 3D structure, and less like the ordered 2D symmetric structure of a tokamak. The field lines make a transition from ergodically covering a surface to ergodically covering a volume as the distance from the last closed magnetic surface is increased. The results are compared with PIES [3] calculations. [1] S.P. Hirshman et al., Comput. Phys. Commun. 43 (1986) 143. [2] E. Strumberger et al., Nucl. Fusion 42 (2002) 827. [3] A.H. Reiman and H.S. Greenside, Comput. Phys. Commun. 43 (1986) 157.
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
Bootstrapping Student Understanding of What Is Going on in Econometrics.
ERIC Educational Resources Information Center
Kennedy, Peter E.
2001-01-01
Explains that econometrics is an intellectual game played by rules based on the sampling distribution concept. Contains explanations for why many students are uncomfortable with econometrics. Encourages instructors to use explain-how-to-bootstrap exercises to promote student understanding. (RLH)
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
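The bootstrap procedure outlined in this abstract (a percentile confidence interval for the RV, a ratio of one-way ANOVA F statistics) can be sketched as follows. The data, group structure, and measure names here are entirely hypothetical; only the resampling logic follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_stat(y, groups):
    """One-way ANOVA F statistic."""
    grand = y.mean()
    labels = np.unique(groups)
    ss_between = sum(len(y[groups == g]) * (y[groups == g].mean() - grand) ** 2
                     for g in labels)
    ss_within = sum(((y[groups == g] - y[groups == g].mean()) ** 2).sum()
                    for g in labels)
    df_b, df_w = len(labels) - 1, len(y) - len(labels)
    return (ss_between / df_b) / (ss_within / df_w)

# Hypothetical data: two PRO measures scored on the same patients,
# grouped by clinical severity (3 groups).
n = 150
groups = rng.integers(0, 3, size=n)
reference = groups + rng.normal(0, 1.0, size=n)   # strongly discriminating
comparator = groups + rng.normal(0, 2.0, size=n)  # weaker measure

def rv_bootstrap_ci(comparator, reference, groups, n_rep=500, rng=rng):
    """95% percentile bootstrap CI for RV = F_comparator / F_reference,
    resampling patients (rows) with replacement."""
    rvs = []
    idx_all = np.arange(len(groups))
    for _ in range(n_rep):
        idx = rng.choice(idx_all, size=len(idx_all), replace=True)
        rvs.append(f_stat(comparator[idx], groups[idx]) /
                   f_stat(reference[idx], groups[idx]))
    return np.percentile(rvs, [2.5, 97.5])

lo, hi = rv_bootstrap_ci(comparator, reference, groups)
```

If the interval excludes 1, the comparator's discrimination differs significantly from the reference's, which is the usage the abstract describes.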
Geochemical landscapes of the conterminous United States; new map presentations for 22 elements
Gustavsson, N.; Bolviken, B.; Smith, D.B.; Severson, R.C.
2001-01-01
Geochemical maps of the conterminous United States have been prepared for seven major elements (Al, Ca, Fe, K, Mg, Na, and Ti) and 15 trace elements (As, Ba, Cr, Cu, Hg, Li, Mn, Ni, Pb, Se, Sr, V, Y, Zn, and Zr). The maps are based on an ultra low-density geochemical survey consisting of 1,323 samples of soils and other surficial materials collected from approximately 1960-1975. The data were published by Boerngen and Shacklette (1981) and black-and-white point-symbol geochemical maps were published by Shacklette and Boerngen (1984). The data have been reprocessed using weighted-median and Bootstrap procedures for interpolation and smoothing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopeliovich, B. Z.; Potashnikova, I. K.; Schmidt, Ivan
A bootstrap equation for self-quenched gluon shadowing leads to a reduced magnitude of broadening for partons propagating through a nucleus. Saturation of small-x gluons in a nucleus, which has the form of transverse momentum broadening of projectile gluons in pA collisions in the nuclear rest frame, leads to a modification of the parton distribution functions in the beam compared with pp collisions. In nucleus-nucleus collisions all participating nucleons acquire enhanced gluon density at small x, which boosts the saturation scale further. Solution of the reciprocity equations for central collisions of two heavy nuclei demonstrates a significant enhancement, up to several times, of Q²_sA in AA compared with pA collisions.
Four Bootstrap Confidence Intervals for the Binomial-Error Model.
ERIC Educational Resources Information Center
Lin, Miao-Hsiang; Hsiung, Chao A.
1992-01-01
Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained, the theoretical foundation of each method, and each method's relevance and range for modeling true-score uncertainty are discussed. (SLD)
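One of the simplest of such intervals, the percentile bootstrap CI for an examinee's true-score proportion, can be sketched as follows. The test length and score are hypothetical, and this shows only the percentile method, not all four compared in the record.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical binomial-error setting: a 40-item test, 29 items correct.
n_items, n_correct = 40, 29
responses = np.array([1] * n_correct + [0] * (n_items - n_correct))

def percentile_ci(data, n_rep=5000, alpha=0.05, rng=rng):
    """Percentile bootstrap CI for the true-score proportion: resample
    item responses with replacement and take quantiles of the means."""
    reps = rng.choice(data, size=(n_rep, len(data)), replace=True).mean(axis=1)
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

lo, hi = percentile_ci(responses)
```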
Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.
ERIC Educational Resources Information Center
Habing, Brian
2001-01-01
Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)
Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications
NASA Technical Reports Server (NTRS)
Hughes, William O.; Paez, Thomas L.
2006-01-01
This paper discusses the Bootstrap Method for specification of vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available, and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
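The replicate-based derivation of a test level with a stated probability and confidence can be sketched as follows. The measurement data are simulated and the P95/50 target (95th-percentile level at 50% confidence) is chosen for illustration; the paper's actual data and levels are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical flight data: peak vibration levels (g) from 12 measurements.
measurements = rng.lognormal(mean=1.0, sigma=0.4, size=12)

def bootstrap_level(data, prob=0.95, conf=0.50, n_rep=2000, rng=rng):
    """Distribution-free P95/50-style test level: for each bootstrap
    replicate of the data, take the empirical 95th percentile; the 50%
    confidence level is then the median of those replicate percentiles."""
    reps = rng.choice(data, size=(n_rep, len(data)), replace=True)
    p95 = np.percentile(reps, 100 * prob, axis=1)
    return np.percentile(p95, 100 * conf)

level = bootstrap_level(measurements)
```

Unlike the Normal Tolerance Limit approach, nothing here assumes a distributional form: the level is built entirely from resamples of the measured data, which is the contrast the abstract draws.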
The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.
Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi
2017-03-01
The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of the interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity (Ps) is the probability of recovering a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Closure of the operator product expansion in the non-unitary bootstrap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is common practice to apply the Chi-square tests (Pearson's as well as the likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful only when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions, which appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small cell frequencies, as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than Fisher's exact test, which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
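A bootstrap version of Pearson's independence test can be sketched as follows: resample tables from the fitted independence model (the product of the observed marginals) and compare the resampled statistics to the observed one. The 2×2 table here is hypothetical, and this is a generic parametric-bootstrap construction, not necessarily the exact variant the article proposes.

```python
import numpy as np

rng = np.random.default_rng(1)

def pearson_chi2(table):
    """Pearson's chi-square statistic for a contingency table."""
    table = np.asarray(table, dtype=float)
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    return ((table - expected) ** 2 / expected).sum()

def bootstrap_chi2_pvalue(table, n_rep=2000, rng=rng):
    """Bootstrap p-value: generate tables under independence (product of
    the observed marginals) and count statistics >= the observed one."""
    table = np.asarray(table, dtype=float)
    n = int(table.sum())
    p_row = table.sum(axis=1) / n
    p_col = table.sum(axis=0) / n
    p_indep = np.outer(p_row, p_col).ravel()   # null cell probabilities
    observed = pearson_chi2(table)
    count = 0
    for _ in range(n_rep):
        boot = rng.multinomial(n, p_indep).reshape(table.shape)
        # Skip degenerate resamples with an empty row or column.
        if (boot.sum(axis=0) > 0).all() and (boot.sum(axis=1) > 0).all():
            count += pearson_chi2(boot) >= observed
    return count / n_rep

table = [[12, 5], [7, 16]]   # hypothetical small-cell contingency table
p = bootstrap_chi2_pvalue(table)
```

Because the null distribution is built by simulation rather than taken from the asymptotic chi-square, the test remains usable at small cell frequencies, which is the abstract's central point.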
Closure of the operator product expansion in the non-unitary bootstrap
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
NASA Astrophysics Data System (ADS)
Poli, Francesca
2012-10-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities in a wide range of βN, reducing the no-wall limit. Scenarios are established as relaxed flattop states with time-dependent transport simulations with TSC [1]. Fully non-inductive configurations with current in the range of 7-10 MA and various heating mixes (NB, EC, IC and LH) have been studied against variations of the pressure profile peaking and of the Greenwald fraction. It is found that stable equilibria have qmin > 2 and moderate ITBs at 2/3 of the minor radius [2]. The ExB flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H&CD sources that maintain reverse or weak magnetic shear profiles throughout the discharge and ρ(qmin) >= 0.5 are the focus of this work. The ITER EC upper launcher, designed for NTM control, can provide enough current drive off-axis to sustain moderate ITBs at mid-radius and maintain a non-inductive current of 8-9 MA and H98 >= 1.5 with the day-one heating mix. LH heating and current drive is effective in modifying the current profile off-axis, facilitating the formation of stronger ITBs in the ramp-up phase, their sustainment at larger radii, and a larger bootstrap fraction. The implications for steady-state operation and fusion performance are discussed. [1] Jardin S.C. et al., J. Comput. Phys. 66 (1986) 481. [2] Poli F.M. et al., Nucl. Fusion 52 (2012) 063027.
NASA Astrophysics Data System (ADS)
Max-Moerbeck, W.; Richards, J. L.; Hovatta, T.; Pavlidou, V.; Pearson, T. J.; Readhead, A. C. S.
2014-11-01
We present a practical implementation of a Monte Carlo method to estimate the significance of cross-correlations in unevenly sampled time series of data, whose statistical properties are modelled with a simple power-law power spectral density. This implementation builds on published methods; we introduce a number of improvements in the normalization of the cross-correlation function estimate and a bootstrap method for estimating the significance of the cross-correlations. A closely related matter is the estimation of a model for the light curves, which is critical for the significance estimates. We present a graphical and quantitative demonstration that uses simulations to show how common it is to get high cross-correlations for unrelated light curves with steep power spectral densities. This demonstration highlights the dangers of interpreting them as signs of a physical connection. We show that by using interpolation and the Hanning sampling window function we are able to reduce the effects of red-noise leakage and to recover steep simple power-law power spectral densities. We also introduce the use of a Neyman construction for the estimation of the errors in the power-law index of the power spectral density. This method provides a consistent way to estimate the significance of cross-correlations in unevenly sampled time series of data.
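The abstract's warning, that unrelated light curves with steep power spectra routinely produce high cross-correlations, can be demonstrated with a minimal simulation. This sketch uses random walks (a ~1/f² spectrum) as a stand-in for the paper's simulated power-law light curves, and everything here (series length, thresholds) is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)

def corr(x, y):
    """Zero-lag correlation coefficient of two equal-length series."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return (x * y).mean()

# Simulate pairs of *independent* steep-spectrum (red-noise) series;
# a cumulative sum of white noise has a ~1/f^2 power spectral density.
n, n_sim = 200, 1000
sims = np.array([abs(corr(np.cumsum(rng.normal(size=n)),
                          np.cumsum(rng.normal(size=n))))
                 for _ in range(n_sim)])

# Fraction of unrelated red-noise pairs showing |r| > 0.5: the chance of
# a "significant-looking" correlation with no physical connection.
frac_spurious = (sims > 0.5).mean()

# A measured correlation is judged against this null distribution, e.g.
# p = fraction of simulated |r| exceeding a hypothetical observed 0.6.
p_value = (sims >= 0.6).mean()
```

The null distribution built this way is the core of the significance estimate; the paper's full method additionally handles uneven sampling, windowing, and red-noise leakage, which this sketch omits.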
Confidence Interval Coverage for Cohen's Effect Size Statistic
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2006-01-01
Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and bias-corrected and accelerated (BCa) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…
A Bootstrap Procedure of Propensity Score Estimation
ERIC Educational Resources Information Center
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
Projecting High Beta Steady-State Scenarios from DIII-D Advanced Tokamak Discharges
NASA Astrophysics Data System (ADS)
Park, J. M.
2013-10-01
Fusion power plant studies based on steady-state tokamak operation suggest that normalized beta in the range of 4-6 is needed for economic viability. DIII-D is exploring a range of candidate high beta scenarios guided by FASTRAN modeling in a repeated cycle of experiment and modeling validation. FASTRAN is a new iterative numerical procedure coupled to the Integrated Plasma Simulator (IPS) that integrates models of core transport, heating and current drive, equilibrium and stability self-consistently to find steady state (d / dt = 0) solutions, and reproduces most features of DIII-D high beta discharges with a stationary current profile. Separately, modeling components such as core transport (TGLF) and off-axis neutral beam current drive (NUBEAM) show reasonable agreement with experiment. Projecting forward to scenarios possible on DIII-D with future upgrades, two self-consistent noninductive scenarios at βN > 4 are found: high qmin and high internal inductance li. Both have bootstrap current fraction fBS > 0 . 5 and rely on the planned addition of a second off-axis neutral beamline and increased electron cyclotron heating. The high qmin > 2 scenario achieves stable operation at βN as high as 5 by a very broad current density profile to improve the ideal-wall stabilization of low-n instabilities along with confinement enhancement from low magnetic shear. The li near 1 scenario does not depend on ideal-wall stabilization. Improved confinement from strong magnetic shear makes up for the lower pedestal needed to maintain li high. The tradeoff between increasing li and reduced edge pedestal determines the achievable βN (near 4) and fBS (near 0.5). This modeling identifies the necessary upgrades to achieve target scenarios and clarifies the pros and cons of particular scenarios to better inform the development of steady-state fusion. Supported by the US Department of Energy under DE-AC05-00OR22725 & DE-FC02-04ER54698.
Pérez-Hernández, Oscar; Giesler, Loren J.
2014-01-01
Soil texture has been commonly associated with the population density of Heterodera glycines (soybean cyst nematode: SCN), but such an association has been mainly described in terms of textural classes. In this study, multivariate analysis and a generalized linear modeling approach were used to elucidate the quantitative relationship of soil texture with the observed SCN population density reduction after annual corn rotation in Nebraska. Forty-five commercial production fields were sampled in 2009, 2010, and 2011, and the SCN population density (eggs/100 cm3 of soil) for each field was determined before (Pi) and after (Pf) annual corn rotation from ten 3 × 3-m sampling grids. Principal components analysis revealed that, compared with silt and clay, sand had a stronger association with SCN Pi and Pf. Cluster analysis using the average linkage method and confirmed through 1,000 bootstrap simulations identified two groups: one corresponding to silt-and-clay-predominant fields and the other to sand-predominant fields. This grouping suggested that the SCN relative percent population decline was higher in the sandy group than in the silt-and-clay-predominant group. However, when the groups were compared for their SCN population density reduction using Pf as the response, Pi as a covariate, and incorporating the year and field variability, a negative binomial generalized linear model indicated that the SCN population density reduction was not statistically different between the sand-predominant field group and the silt-and-clay-predominant group. PMID:24987160
On heat loading, novel divertors, and fusion reactors
NASA Astrophysics Data System (ADS)
Kotschenreuther, M.; Valanju, P. M.; Mahajan, S. M.; Wiley, J. C.
2007-07-01
The limited thermal power handling capacity of the standard divertors (used in current as well as projected tokamaks) is likely to force extremely high (~90%) radiation fractions f_rad in tokamak fusion reactors that have heating powers considerably larger than ITER [D. J. Campbell, Phys. Plasmas 8, 2041 (2001)]. Such enormous values of the necessary f_rad could have serious and debilitating consequences for core confinement, stability, and dependability in a fusion power reactor, especially in reactors with Internal Transport Barriers. A new class of divertors, called X-divertors (XD), which considerably enhance the divertor thermal capacity through a flaring of the field lines only near the divertor plates, may be necessary and sufficient to overcome these problems and lead to a dependable fusion power reactor with acceptable economics. X-divertors will lower the bar on the necessary confinement, bringing it into the range of present experimental results. The ability to reduce the radiative burden gives the X-divertor a key advantage: lower radiation demands allow sharply peaked density profiles that enhance the bootstrap fraction, creating the possibility of a highly increased beta for the same-βN discharges. The X-divertor thus emerges as a beta enhancer, capable of raising beta by up to roughly a factor of 2.
Fusion Plasma Performance and Confinement Studies on JT-60 and JT-60U
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamada, Y.; Fujita, T.; Ishida, S.
2002-09-15
Fusion plasma performance and confinement studies on JT-60 and JT-60U are reviewed. With the main aim of providing a physics basis for ITER and steady-state tokamak reactors, JT-60/JT-60U has been developing and optimizing operational concepts and extending the discharge regimes toward sustainment of high integrated performance in the reactor-relevant parameter regime. In addition to the achievement of high fusion plasma performance, such as the equivalent breakeven condition (Q_DT^eq up to 1.25) and a high fusion triple product n_D(0)·τ_E·T_i(0) = 1.5 × 10^21 m^-3 s keV, JT-60U has demonstrated the integrated performance of high confinement, high βN, and full non-inductive current drive with a large fraction of bootstrap current. These favorable performances have been achieved in two advanced operation regimes, the reversed magnetic shear (RS) and the weak magnetic shear (high-βp) ELMy H modes, characterized by both internal transport barriers (ITBs) and edge transport barriers (ETBs). The key factors in optimizing these plasmas towards high integrated performance are control of the profiles of current, pressure, rotation, etc., utilizing a variety of heating, current drive, torque input and particle control capabilities, as well as high-triangularity operation. As represented by the discovery of ITBs (the density ITB in the central pellet mode, the ion temperature ITB in the high-βp mode, and the electron temperature ITB in the reversed shear mode), confinement studies in JT-60/JT-60U have emphasized both the freedom and the restriction of the radial profiles of temperature and density.
In addition to the characterization of confinement and analyses of the transport properties of the OH, L-mode, H-mode, pellet, high-βp and RS modes, JT-60U has clarified the formation conditions, spatial structures and dynamics of edge and internal transport barriers, and evaluated the effects of repetitive MHD events, such as sawteeth and ELMs, on confinement. Through these studies, JT-60U has demonstrated the applicability of the high confinement modes to ITER and steady-state tokamak reactors.
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
Bootstrap Estimation and Testing for Variance Equality.
ERIC Educational Resources Information Center
Olejnik, Stephen; Algina, James
The purpose of this study was to develop a single procedure for comparing population variances which could be used for distribution forms. Bootstrap methodology was used to estimate the variability of the sample variance statistic when the population distribution was normal, platykurtic and leptokurtic. The data for the study were generated and…
Bootstrapping the Syntactic Bootstrapper: Probabilistic Labeling of Prosodic Phrases
ERIC Educational Resources Information Center
Gutman, Ariel; Dautriche, Isabelle; Crabbé, Benoît; Christophe, Anne
2015-01-01
The "syntactic bootstrapping" hypothesis proposes that syntactic structure provides children with cues for learning the meaning of novel words. In this article, we address the question of how children might start acquiring some aspects of syntax before they possess a sizeable lexicon. The study presents two models of early syntax…
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Porter, Teresita M; Gibson, Joel F; Shokralla, Shadi; Baird, Donald J; Golding, G Brian; Hajibabaei, Mehrdad
2014-01-01
Current methods to identify unknown insect (class Insecta) cytochrome c oxidase (COI barcode) sequences often rely on thresholds of distances that can be difficult to define, sequence similarity cut-offs, or monophyly. Some of the most commonly used metagenomic classification methods do not provide a measure of confidence for the taxonomic assignments they provide. The aim of this study was to use a naïve Bayesian classifier (Wang et al. Applied and Environmental Microbiology, 2007; 73: 5261) to automate taxonomic assignments for large batches of insect COI sequences such as data obtained from high-throughput environmental sequencing. This method provides rank-flexible taxonomic assignments with an associated bootstrap support value, and it is faster than the blast-based methods commonly used in environmental sequence surveys. We have developed and rigorously tested the performance of three different training sets using leave-one-out cross-validation, two field data sets, and targeted testing of Lepidoptera, Diptera and Mantodea sequences obtained from the Barcode of Life Data system. We found that type I error rates, incorrect taxonomic assignments with a high bootstrap support, were already relatively low but could be lowered further by ensuring that all query taxa are actually present in the reference database. Choosing bootstrap support cut-offs according to query length and summarizing taxonomic assignments to more inclusive ranks can also help to reduce error while retaining the maximum number of assignments. Additionally, we highlight gaps in the taxonomic and geographic representation of insects in public sequence databases that will require further work by taxonomists to improve the quality of assignments generated using any method.
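The bootstrap support value attached to each assignment has a simple generic form: resample the query's features with replacement, reclassify each resample, and report the agreement fraction. The sketch below is a toy illustration of that idea, not the Wang et al. classifier; the `classify` callable and the k-mer feature representation are assumptions:

```python
import random

def bootstrap_support(features, classify, n_boot=100, seed=4):
    """Resample the query's features with replacement, reclassify each
    resample, and return the original assignment together with the
    fraction of resamples that agree with it (the support value)."""
    rng = random.Random(seed)
    assignment = classify(features)
    agree = sum(classify(rng.choices(features, k=len(features))) == assignment
                for _ in range(n_boot))
    return assignment, agree / n_boot
```

Choosing a support cut-off, as the study recommends, then amounts to discarding assignments whose returned fraction falls below the threshold.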
An efficient pseudomedian filter for tiling microarrays.
Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B
2007-06-07
Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n² log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers to O(n log n) from O(n² log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density.
This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at http://tiling.gersteinlab.org/pseudomedian/.
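For reference, the pseudomedian (Hodges-Lehmann estimator) being smoothed in each window is the median of all pairwise Walsh averages. A minimal sketch of the naive O(n² log n) form follows; the paper's contribution is replacing this with Monahan's O(n log n) HLQEST algorithm, which is not reproduced here:

```python
import statistics

def pseudomedian(values):
    """Hodges-Lehmann pseudomedian: the median of all pairwise Walsh
    averages (x_i + x_j) / 2 with i <= j.  Naive O(n^2 log n) form;
    HLQEST computes the same quantity in O(n log n)."""
    n = len(values)
    walsh = [(values[i] + values[j]) / 2
             for i in range(n) for j in range(i, n)]
    return statistics.median(walsh)
```

Like the median, the pseudomedian resists outliers (here `[0, 0, 10]` yields 2.5 rather than the mean 3.33), which is why it is favored for noisy probe intensities.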
An efficient pseudomedian filter for tiling microarrays
Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B
2007-01-01
Background Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n² log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. Results We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers to O(n log n) from O(n² log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Conclusion Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation.
We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density. This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at . PMID:17555595
Overview of physics research on the TCV tokamak
NASA Astrophysics Data System (ADS)
Fasoli, A.; TCV Team
2009-10-01
The Tokamak à Configuration Variable (TCV) tokamak is equipped with high-power (4.5 MW), real-time-controllable EC systems and flexible shaping, and plays an important role in fusion research by broadening the parameter range of reactor relevant regimes, by investigating tokamak physics questions and by developing new control tools. Steady-state discharges are achieved, in which the current is entirely self-generated through the bootstrap mechanism, a fundamental ingredient for ITER steady-state operation. The discharge remains quiescent over several current redistribution times, demonstrating that a self-consistent, 'bootstrap-aligned' equilibrium state is possible. Electron internal transport barrier regimes sustained by EC current drive have also been explored. MHD activity is shown to be crucial in scenarios characterized by large and slow oscillations in plasma confinement, which in turn can be modified by small Ohmic current perturbations altering the barrier strength. In studies of the relation between anomalous transport and plasma shape, the observed dependences of the electron thermal diffusivity on triangularity (direct) and collisionality (inverse) are qualitatively reproduced by non-linear gyro-kinetic simulations and shown to be governed by TEM turbulence. Parallel SOL flows are studied for their importance for material migration. Flow profiles are measured using a reciprocating Mach probe by changing from lower to upper single-null diverted equilibria and shifting the plasmas vertically. The dominant, field-direction-dependent Pfirsch-Schlüter component is found to be in good agreement with theoretical predictions. A field-direction-independent component is identified and is consistent with flows generated by transient over-pressure due to ballooning-like interchange turbulence. 
Initial high-resolution infrared images confirm that ELMs have a filamentary structure, while fast, localized radiation measurements reveal that ELM activity first appears in the X-point region. Real time control techniques are currently being applied to EC multiple independent power supplies and beam launchers, e.g. to control the plasma current in fully non-inductive conditions, and the plasma elongation through current broadening by far-off-axis heating at constant shaping field.
Bootstrapping N=2 chiral correlators
NASA Astrophysics Data System (ADS)
Lemos, Madalena; Liendo, Pedro
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
The effect of anisotropic heat transport on magnetic islands in 3-D configurations
NASA Astrophysics Data System (ADS)
Schlutt, M. G.; Hegna, C. C.
2012-08-01
An analytic theory of nonlinear pressure-induced magnetic island formation using a boundary layer analysis is presented. This theory extends previous work by including the effects of finite parallel heat transport and is applicable to general three dimensional magnetic configurations. In this work, particular attention is paid to the role of finite parallel heat conduction in the context of pressure-induced island physics. It is found that localized currents that require self-consistent deformation of the pressure profile, such as resistive interchange and bootstrap currents, are attenuated by finite parallel heat conduction when the magnetic islands are sufficiently small. However, these anisotropic effects do not change saturated island widths caused by Pfirsch-Schlüter current effects. Implications for finite pressure-induced island healing are discussed.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
ERIC Educational Resources Information Center
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning
ERIC Educational Resources Information Center
Luntley, Michael
2017-01-01
This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…
Higher curvature gravities, unlike GR, cannot be bootstrapped from their (usual) linearizations
NASA Astrophysics Data System (ADS)
Deser, S.
2017-12-01
We show that higher curvature order gravities, in particular the propagating quadratic curvature models, cannot be derived by self-coupling from their linear, flat space, forms, except through an unphysical version of linearization; only GR can. Separately, we comment on an early version of the self-coupling bootstrap.
The new version of EPA’s positive matrix factorization (EPA PMF) software, 5.0, includes three error estimation (EE) methods for analyzing factor analytic solutions: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP)...
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
How to Bootstrap a Human Communication System
ERIC Educational Resources Information Center
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better support efficiency improvement and related decision making. © The Author(s) 2015.
Weak percolation on multiplex networks
NASA Astrophysics Data System (ADS)
Baxter, Gareth J.; Dorogovtsev, Sergey N.; Mendes, José F. F.; Cellai, Davide
2014-04-01
Bootstrap percolation is a simple but nontrivial model. It has applications in many areas of science and has been explored on random networks for several decades. In single-layer (simplex) networks, it has been recently observed that bootstrap percolation, which is defined as an incremental process, can be seen as the opposite of pruning percolation, where nodes are removed according to a connectivity rule. Here we propose models of both bootstrap and pruning percolation for multiplex networks. We collectively refer to these two models with the concept of "weak" percolation, to distinguish them from the somewhat classical concept of ordinary ("strong") percolation. While the two models coincide in simplex networks, we show that they decouple when considering multiplexes, giving rise to a wealth of critical phenomena. Our bootstrap model constitutes the simplest example of a contagion process on a multiplex network and has potential applications in critical infrastructure recovery and information security. Moreover, we show that our pruning percolation model may provide a way to diagnose missing layers in a multiplex network. Finally, our analytical approach allows us to calculate critical behavior and characterize critical clusters.
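The bootstrap (activation) side of the process described above is easy to simulate directly. Below is a minimal single-layer sketch, with the multiplex generalization omitted; the dictionary-of-sets adjacency representation is an assumption for illustration:

```python
def bootstrap_percolation(adj, seeds, k):
    """Single-layer bootstrap percolation: starting from the seed set,
    repeatedly activate any node with at least k active neighbours
    until no further node can be activated.  `adj` maps each node to
    the set of its neighbours."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbours in adj.items():
            if node not in active and len(neighbours & active) >= k:
                active.add(node)
                changed = True
    return active
```

The pruning-percolation counterpart discussed in the abstract runs the rule in the opposite direction, removing nodes with fewer than k surviving neighbours.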
Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J
2018-05-01
Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, L. J.; Kotschenreuther, M. T.; Valanju, P.
2014-06-15
The diamagnetic drift effects on the low-n magnetohydrodynamic instabilities at the high-mode (H-mode) pedestal are investigated in this paper with the inclusion of bootstrap current for equilibrium and rotation effects for stability, where n is the toroidal mode number. The AEGIS (Adaptive EiGenfunction Independent Solutions) code [L. J. Zheng and M. T. Kotschenreuther, J. Comp. Phys. 211 (2006)] is extended to include the diamagnetic drift effects. This can be viewed as the lowest order approximation of the finite Larmor radius effects in consideration of the pressure gradient steepness at the pedestal. The H-mode discharges at the Joint European Torus are reconstructed numerically using the VMEC code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)], with bootstrap current taken into account. Generally speaking, the diamagnetic drift effects are stabilizing. Our results show that the effectiveness of diamagnetic stabilization depends sensitively on the safety factor value (q_s) at the safety-factor reversal or plateau region: the diamagnetic stabilization is weaker when q_s lies just above an integer, and stronger when q_s lies just below one. We also find that the diamagnetic drift effects depend sensitively on the rotation direction. The diamagnetic stabilization in the co-rotation case is stronger than in the counter-rotation case with respect to the ion diamagnetic drift direction.
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also synthetic but follows a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
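The LU-decomposition approach to generating spatially correlated resamples works by factoring the model covariance matrix and "coloring" white noise with the lower-triangular factor. A minimal 1-D sketch under an assumed exponential covariance model follows; the parameter names and the covariance form are illustrative, not the authors' exact setup:

```python
import math
import random

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric
    positive-definite matrix (pure Python, intended for small n)."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

def correlated_resample(coords, sill=1.0, corr_range=10.0, seed=0):
    """One spatially correlated resample at the given 1-D coordinates,
    using an exponential covariance C(h) = sill * exp(-3h/range) and
    z = L w with w drawn as independent standard normals."""
    rng = random.Random(seed)
    n = len(coords)
    cov = [[sill * math.exp(-3.0 * abs(coords[i] - coords[j]) / corr_range)
            for j in range(n)] for i in range(n)]
    l = cholesky(cov)
    w = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(l[i][k] * w[k] for k in range(i + 1)) for i in range(n)]
```

Repeating the draw many times and re-estimating the semivariogram on each resample yields the bootstrap distributions whose percentiles form the confidence intervals discussed above.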
Approximate sample sizes required to estimate length distributions
Miranda, L.E.
2007-01-01
The sample sizes required to estimate fish length were determined by bootstrapping from reference length distributions. Depending on population characteristics and species-specific maximum lengths, 1-cm length-frequency histograms required 375-1,200 fish to estimate within 10% with 80% confidence, 2.5-cm histograms required 150-425 fish, proportional stock density required 75-140 fish, and mean length required 75-160 fish. In general, smaller species, smaller populations, populations with higher mortality, and simpler length statistics required fewer samples. Indices that require low sample sizes may be suitable for monitoring population status, and when large changes in length are evident, additional sampling effort may be allocated to more precisely define length status with more informative estimators. © Copyright by the American Fisheries Society 2007.
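The general recipe, resampling n fish from a reference length distribution and checking how often the statistic lands within tolerance, can be sketched as follows. The step size, the statistic (mean length), and the thresholds are illustrative choices, not the study's exact protocol:

```python
import random
import statistics

def required_sample_size(reference_lengths, rel_error=0.10,
                         confidence=0.80, n_boot=500, seed=42):
    """Smallest n (tried in steps of 25) such that the bootstrap mean
    length of n fish falls within rel_error of the reference mean
    with probability >= confidence; None if no tried n suffices."""
    rng = random.Random(seed)
    target = statistics.fmean(reference_lengths)
    tol = rel_error * target
    for n in range(25, len(reference_lengths) + 1, 25):
        hits = sum(
            abs(statistics.fmean(rng.choices(reference_lengths, k=n)) - target) <= tol
            for _ in range(n_boot))
        if hits / n_boot >= confidence:
            return n
    return None
```

Swapping the mean for a histogram or proportional-stock-density statistic reproduces the study's pattern that more detailed statistics demand larger samples.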
Precision and relative effectiveness of a purse seine for sampling age-0 river herring in lakes
Devine, Matthew T.; Roy, Allison; Whiteley, Andrew R.; Gahagan, Benjamin I.; Armstrong, Michael P.; Jordaan, Adrian
2018-01-01
Stock assessments for anadromous river herring, collectively Alewife Alosa pseudoharengus and Blueback Herring A. aestivalis, lack adequate demographic information, particularly with respect to early life stages. Although sampling adult river herring is increasingly common throughout their range, currently no standardized, field‐based, analytical methods exist for estimating juvenile abundance in freshwater lakes. The objective of this research was to evaluate the relative effectiveness and sampling precision of a purse seine for estimating densities of age‐0 river herring in freshwater lakes. We used a purse seine to sample age‐0 river herring in June–September 2015 and June–July 2016 in 16 coastal freshwater lakes in the northeastern USA. Sampling effort varied from two seine hauls to more than 50 seine hauls per lake. Catch rates were highest in June and July, and sampling precision was maximized in July. Sampling at night (versus day) in open water (versus littoral areas) was most effective for capturing newly hatched larvae and juveniles up to ca. 100 mm TL. Bootstrap simulation results indicated that sampling precision of CPUE estimates increased with sampling effort, and there was a clear threshold beyond which increased effort resulted in negligible increases in precision. The effort required to produce precise CPUE estimates, as determined by the CV, was dependent on lake size; river herring densities could be estimated with up to 10 purse‐seine hauls (one‐two nights) in a small lake (<50 ha) and 15–20 hauls (two‐three nights) in a large lake (>50 ha). Fish collection techniques using a purse seine as described in this paper are likely to be effective for estimating recruit abundance of river herring in freshwater lakes across their range.
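The bootstrap precision analysis, resampling hauls to see how the CV of mean CPUE shrinks with effort, can be sketched generically; the function name and the toy catch data below are illustrative, not the study's:

```python
import random
import statistics

def cpue_cv(haul_catches, n_hauls, n_boot=1000, seed=7):
    """Bootstrap coefficient of variation of mean CPUE when n_hauls
    hauls are drawn with replacement from the observed haul catches."""
    rng = random.Random(seed)
    means = [statistics.fmean(rng.choices(haul_catches, k=n_hauls))
             for _ in range(n_boot)]
    return statistics.stdev(means) / statistics.fmean(means)
```

Plotting the CV against `n_hauls` exposes the threshold the authors describe, beyond which extra hauls buy only negligible precision.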
NASA Astrophysics Data System (ADS)
Al-Mudhafar, W. J.
2013-12-01
Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships used to estimate the properties in non-cored intervals. It also helps to accurately identify the spatial facies distribution to perform an accurate reservoir model for optimal future reservoir performance. In this paper, the facies estimation has been done through multinomial logistic regression (MLR) with respect to the well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. Firstly, a Robust Sequential Imputation Algorithm has been used to impute the missing data. This algorithm starts from a complete subset of the dataset and estimates sequentially the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix. Then, the observation is added to the complete data matrix and the algorithm continues with the next observation with missing values. The MLR has been chosen to estimate the maximum likelihood and minimize the standard error for the nonlinear relationships between facies and the core and log data. The MLR is used to predict the probabilities of the different possible facies given each independent variable by constructing a linear predictor function having a set of weights that are linearly combined with the independent variables by using a dot product. A beta distribution of facies has been considered as prior knowledge, and the resulting predicted probability (posterior) has been estimated from MLR based on Bayes' theorem, which relates the predicted probability (posterior) to the conditional probability and the prior knowledge.
To assess the statistical accuracy of the model, the bootstrap should be carried out to estimate extra-sample prediction error by randomly drawing datasets with replacement from the training data. Each sample has the same size of the original training set and it can be conducted N times to produce N bootstrap datasets to re-fit the model accordingly to decrease the squared difference between the estimated and observed categorical variables (facies) leading to decrease the degree of uncertainty.
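The procedure described, refitting on bootstrap resamples and scoring each refit against the original data, can be sketched with a deliberately tiny stand-in model. The nearest-centroid classifier below is our illustration only, not the MLR used in the study:

```python
import random
import statistics

def fit_centroids(xs, ys):
    """Toy stand-in classifier: the per-class mean of a 1-D feature."""
    return {label: statistics.fmean(x for x, y in zip(xs, ys) if y == label)
            for label in set(ys)}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

def bootstrap_prediction_error(xs, ys, n_boot=200, seed=3):
    """Average misclassification rate of models refit on n_boot
    bootstrap resamples, each evaluated on the original data."""
    rng = random.Random(seed)
    n = len(xs)
    errs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        cents = fit_centroids([xs[i] for i in idx], [ys[i] for i in idx])
        errs.append(statistics.fmean(
            predict(cents, x) != y for x, y in zip(xs, ys)))
    return statistics.fmean(errs)
```

Replacing the centroid model with the fitted MLR, and the 0-1 loss with a squared difference over facies probabilities, recovers the scheme the paragraph outlines.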
H-mode fueling optimization with the supersonic deuterium jet in NSTX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soukhanovskii, V A; Bell, M G; Bell, R E
2008-06-18
High-performance, long-pulse 0.7-1.2 MA 6-7 MW NBI-heated small-ELM H-mode plasma discharges are developed in the National Spherical Torus Experiment (NSTX) as prototypes for confinement and current drive extrapolations to future spherical tori. It is envisioned that innovative lithium coating techniques for H-mode density pumping and a supersonic deuterium jet for plasma refueling will be used to achieve the low pedestal collisionality and low n_e/n_G fractions (0.3-0.6), both of which are essential conditions for maximizing the non-inductive (bootstrap and beam driven) current fractions. The low field side supersonic gas injector (SGI) on NSTX consists of a small converging-diverging graphite Laval nozzle and a piezoelectric gas valve. The nozzle is capable of producing a deuterium jet with Mach number M ≤ 4, estimated gas density at the nozzle exit n ≤ 5 × 10²³ m⁻³, estimated temperature T ≥ 70 K, and flow velocity v = 2.4 km/s. The nozzle Reynolds number is Re ≈ 6000. The nozzle and the valve are enclosed in a protective carbon fiber composite shroud and mounted on a movable probe at a midplane port location. Despite the beneficial L-mode fueling experience with supersonic jets in limiter tokamaks, there is limited experience with fueling of high-performance H-mode divertor discharges and the associated density, MHD stability, and MARFE limits. In initial supersonic deuterium jet fueling experiments in NSTX, reliable H-mode access, a low NBI power threshold, P_LH ≤ 2 MW, and a high fueling efficiency (0.1-0.4) have been demonstrated. Progress has also been made toward better control of the injected fueling gas by decreasing the uncontrolled high field side (HFS) injector fueling rate by up to 95% and complementing it with the supersonic jet fueling. These results motivated recent upgrades to the SGI gas delivery and control systems.
The new SGI-Upgrade (SGI-U) capabilities include multi-pulse ms-scale controls and a reservoir gas pressure up to P₀ = 5000 Torr. In this paper we summarize recent progress toward optimization of H-mode fueling in NSTX using the SGI-U.
Multi-megawatt, gigajoule plasma operation in Tore Supra
NASA Astrophysics Data System (ADS)
Dumont, R. J.; Goniche, M.; Ekedahl, A.; Saoutic, B.; Artaud, J.-F.; Basiuk, V.; Bourdelle, C.; Corre, Y.; Decker, J.; Elbèze, D.; Giruzzi, G.; Hoang, G.-T.; Imbeaux, F.; Joffrin, E.; Litaudon, X.; Lotte, Ph; Maget, P.; Mazon, D.; Nilsson, E.; The Tore Supra Team
2014-07-01
Integrating several important technological elements required for long pulse operation in magnetic fusion devices, the Tore Supra tokamak routinely addresses the physics and technology issues related to this endeavor and, as a result, contributes essential information on critical issues for ITER. During the last experimental campaign, components of the radiofrequency system including an ITER relevant launcher (passive active multijunction (PAM)) and continuous wave/3.7 GHz klystrons, have been extensively qualified, and then used to develop steady state scenarios in which the lower hybrid (LH), ion cyclotron (IC) and electron cyclotron (EC) systems have been combined in fully stationary shots (duration ~150 s, injected power up to ~8 MW, injected/extracted energy up to ~1 GJ). Injection of LH power in the 5.0-6.0 MW range has extended the domain of accessible plasma parameters to higher densities and non-inductive currents. These discharges exhibit steady electron internal transport barriers (ITBs). We report here on various issues relevant to the steady state operation of future devices, ranging from operational aspects and limitations related to the achievement of long pulses in a fully actively cooled fusion device (e.g. overheating due to fast particle losses), to more fundamental plasma physics topics. The latter include a beneficial influence of IC resonance heating on the magnetohydrodynamic (MHD) stability in these discharges, which has been studied in detail. Another interesting observation is the appearance of oscillations of the central temperature with typical periods of the order of one to several seconds, caused by a nonlinear interplay between LH deposition, MHD activity and bootstrap current in the presence of an ITB.
Sieberg, Christine B; Manganella, Juliana; Manalo, Gem; Simons, Laura E; Hresko, M Timothy
2017-12-01
There is a need to better assess patient satisfaction and surgical outcomes. The purpose of the current study is to identify how preoperative expectations can impact postsurgical satisfaction among youth with adolescent idiopathic scoliosis undergoing spinal fusion surgery. The present study includes patients with adolescent idiopathic scoliosis undergoing spinal fusion surgery enrolled in a prospective, multicentered registry examining postsurgical outcomes. The Scoliosis Research Society Questionnaire-Version 30, which assesses pain, self-image, mental health, and satisfaction with management, along with the Spinal Appearance Questionnaire, which measures surgical expectations, was administered to 190 patients before surgery and 1 and 2 years postoperatively. Regression analyses with bootstrapping (with n=5000 bootstrap samples) were conducted with 99% bias-corrected confidence intervals to examine the extent to which preoperative expectations for spinal appearance mediated the relationship between presurgical mental health and pain and 2-year postsurgical satisfaction. Results indicate that preoperative mental health, pain, and expectations are predictive of postsurgical satisfaction. With the shifting health care system, physicians may want to consider patient mental health, pain, and expectations before surgery to optimize satisfaction and ultimately improve clinical care and patient outcomes. Level I-prognostic study.
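A percentile bootstrap for an indirect (mediation-style) effect can be sketched as below. This is a simplified illustration built from two simple regressions, not the bias-corrected, covariate-adjusted procedure the study reports; all names are our own:

```python
import random
import statistics

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

def indirect_effect(x, m, y):
    """a*b from two simple regressions: m ~ x gives a, y ~ m gives b."""
    return slope(x, m) * slope(m, y)

def percentile_ci(x, m, y, n_boot=2000, alpha=0.01, seed=11):
    """Percentile bootstrap confidence interval for the indirect
    effect, obtained by resampling individuals with replacement."""
    rng = random.Random(seed)
    n = len(x)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(indirect_effect([x[i] for i in idx],
                                     [m[i] for i in idx],
                                     [y[i] for i in idx]))
    stats.sort()
    return (stats[int(alpha / 2 * n_boot)],
            stats[int((1 - alpha / 2) * n_boot) - 1])
```

An interval excluding zero is the usual bootstrap evidence that the mediator (here, expectations) carries part of the effect.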
Ishwaran, Hemant; Lu, Min
2018-06-04
Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
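The subsampling variance estimator described in this abstract can be sketched in a few lines. The code below is an illustrative stand-in, not the authors' implementation: it estimates the variance of a plain sample mean (rather than a forest VIMP) using the Politis-Romano rescaling, in which b times the variance of the statistic over size-b subsamples approximates n times the variance of the full-sample statistic. The data, subsample size, and replication count are all invented for the example.

```python
import random
import statistics

def subsample_variance(data, stat, b, reps=2000, seed=0):
    """Subsampling estimate of Var(stat) at the full sample size n.

    Size-b subsamples are drawn WITHOUT replacement; the variance of the
    statistic over subsamples is rescaled by b/n (Politis-Romano style).
    """
    rng = random.Random(seed)
    n = len(data)
    vals = [stat(rng.sample(data, b)) for _ in range(reps)]
    # b * Var(stat_b) ~= n * Var(stat_n)  =>  Var(stat_n) ~= (b / n) * Var(stat_b)
    return (b / n) * statistics.pvariance(vals)

rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(500)]
var_hat = subsample_variance(data, statistics.fmean, b=50)
# the true variance of the sample mean here is about 1/500 = 0.002
```

Because the subsamples are small, each replicate is cheap, which is the computational advantage the abstract highlights for big-data settings.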
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-01-01
Abstract Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476
Peace of Mind, Academic Motivation, and Academic Achievement in Filipino High School Students.
Datu, Jesus Alfonso D
2017-04-09
Recent literature has recognized the advantageous role of low-arousal positive affect such as feelings of peacefulness and internal harmony in collectivist cultures. However, limited research has explored the benefits of low-arousal affective states in the educational setting. The current study examined the link of peace of mind (PoM) to academic motivation (i.e., amotivation, controlled motivation, and autonomous motivation) and academic achievement among 525 Filipino high school students. Findings revealed that PoM was positively associated with academic achievement (β = .16, p < .05), autonomous motivation (β = .48, p < .001), and controlled motivation (β = .25, p < .01). As expected, PoM was negatively related to amotivation (β = -.19, p < .05), and autonomous motivation was positively associated with academic achievement (β = .52, p < .01). Furthermore, the results of bias-corrected bootstrap analyses at 95% confidence interval based on 5,000 bootstrapped resamples demonstrated that peace of mind had an indirect influence on academic achievement through the mediating effects of autonomous motivation. In terms of the effect sizes, the findings showed that PoM explained about 1% to 18% of the variance in academic achievement and motivation. The theoretical and practical implications of the results are elucidated.
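The bootstrapped indirect-effect analysis described here can be sketched on simulated data. This is a minimal illustration, not the study's analysis: it uses a plain percentile interval rather than the bias-corrected interval the paper reports, the coefficient on the mediator is obtained by Frisch-Waugh residualization, and all variable names and effect sizes are invented.

```python
import random
import statistics

def ols_slope(x, y):
    # simple OLS slope of y on x (with intercept)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def residuals(x, y):
    # residuals of y after OLS regression on x (with intercept)
    b = ols_slope(x, y)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

def indirect_effect(x, m, y):
    a = ols_slope(x, m)                                  # X -> M path
    # coefficient of M in Y ~ M + X via Frisch-Waugh residualization
    b = ols_slope(residuals(x, m), residuals(x, y))
    return a * b

rng = random.Random(0)
n = 200
x = [rng.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + rng.gauss(0, 1) for xi in x]                          # true a = 0.5
y = [0.6 * mi + 0.2 * xi + rng.gauss(0, 1) for xi, mi in zip(x, m)]   # true b = 0.6

boots = []
for _ in range(2000):
    s = [rng.randrange(n) for _ in range(n)]
    boots.append(indirect_effect([x[i] for i in s],
                                 [m[i] for i in s],
                                 [y[i] for i in s]))
boots.sort()
lo, hi = boots[int(0.025 * 2000)], boots[int(0.975 * 2000) - 1]  # 95% percentile CI
```

An interval excluding zero is the usual evidence for mediation in this framework; the true indirect effect in the simulation is 0.5 × 0.6 = 0.3.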
NASA Technical Reports Server (NTRS)
Patterson, Richard; Hammoud, Ahmad
2009-01-01
Electronic systems designed for use in deep space and planetary exploration missions are expected to encounter extreme temperatures and wide thermal swings. Silicon-based devices are limited in their wide-temperature capability and usually require extra measures, such as cooling or heating mechanisms, to provide adequate ambient temperature for proper operation. Silicon-On-Insulator (SOI) technology, on the other hand, has lately been gaining widespread use in applications where high temperatures are encountered. Due to their inherent design, SOI-based integrated circuit chips are able to operate at temperatures higher than silicon devices by virtue of reducing leakage currents, eliminating parasitic junctions, and limiting internal heating. In addition, SOI devices provide faster switching, consume less power, and offer improved radiation tolerance. Very little data, however, exist on the performance of such devices and circuits under cryogenic temperatures. In this work, the performance of an SOI bootstrapped, full-bridge driver integrated circuit was evaluated under extreme temperatures and thermal cycling. The investigations were carried out to establish a baseline on the functionality and to determine suitability of this device for use in space exploration missions under extreme temperature conditions.
Pulling Econometrics Students up by Their Bootstraps
ERIC Educational Resources Information Center
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Accuracy assessment of percent canopy cover, cover type, and size class
H. T. Schreuder; S. Bain; R. C. Czaplewski
2003-01-01
Truth for vegetation cover percent and type is obtained from very large-scale photography (VLSP), stand structure as measured by size classes, and vegetation types from a combination of VLSP and ground sampling. We recommend using the Kappa statistic with bootstrap confidence intervals for overall accuracy, and similarly bootstrap confidence intervals for percent...
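The Kappa-with-bootstrap approach recommended in this abstract can be illustrated with a short sketch. The class labels, agreement rate, and sample size below are invented for the example; the code computes Cohen's kappa for a truth/prediction pairing and a percentile-bootstrap confidence interval by resampling the paired observations.

```python
import random
from collections import Counter

def kappa(truth, pred):
    """Cohen's kappa for two categorical label sequences."""
    n = len(truth)
    po = sum(t == p for t, p in zip(truth, pred)) / n      # observed agreement
    ct, cp = Counter(truth), Counter(pred)
    pe = sum(ct[c] * cp.get(c, 0) for c in ct) / n ** 2    # chance agreement
    return (po - pe) / (1 - pe)

rng = random.Random(42)
classes = ["conifer", "hardwood", "open"]
truth = [rng.choice(classes) for _ in range(300)]
# predictions agree with truth ~80% of the time, otherwise a random class
pred = [t if rng.random() < 0.8 else rng.choice(classes) for t in truth]

pairs = list(zip(truth, pred))
boots = []
for _ in range(1000):
    s = [pairs[rng.randrange(len(pairs))] for _ in pairs]  # resample pairs
    boots.append(kappa([t for t, _ in s], [p for _, p in s]))
boots.sort()
lo, hi = boots[24], boots[974]   # 95% percentile interval
```

Resampling the (truth, prediction) pairs jointly preserves the per-plot agreement structure, which is why the same scheme works for overall accuracy as well.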
ERIC Educational Resources Information Center
Barner, David; Chow, Katherine; Yang, Shu-Ju
2009-01-01
We explored children's early interpretation of numerals and linguistic number marking, in order to test the hypothesis (e.g., Carey (2004). Bootstrapping and the origin of concepts. "Daedalus", 59-68) that children's initial distinction between "one" and other numerals (i.e., "two," "three," etc.) is bootstrapped from a prior distinction between…
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
ERIC Educational Resources Information Center
Hand, Michael L.
1990-01-01
Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
Donald B.K. English
2000-01-01
In this paper I use bootstrap procedures to develop confidence intervals for estimates of total industrial output generated per thousand tourist visits. Mean expenditures from replicated visitor expenditure data included weights to correct for response bias. Impacts were estimated with IMPLAN. Ninety percent interval endpoints were 6 to 16 percent above or below the...
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (Log-Pearson III and Weibull) had lower mean square errors than did the G. E. P. Box-D. R. Cox transformation method or the Log-W. C. Boughton method, which is based on a fit of plotting positions.
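The bootstrap comparison of low-flow estimators can be sketched as follows. This is a simplified stand-in for the study's design: a lognormal fit replaces the Log-Pearson III fit, the 40-year record is simulated (so the true 10th-percentile low flow is known), and bootstrap resamples of the record are used to accumulate the mean square error of a parametric versus a nonparametric quantile estimator. All numbers are invented for illustration.

```python
import math
import random
import statistics

def q10_lognormal(flows):
    """Parametric 10th-percentile estimate assuming lognormal flows."""
    logs = [math.log(f) for f in flows]
    mu, sd = statistics.fmean(logs), statistics.stdev(logs)
    return math.exp(mu - 1.2816 * sd)   # z at the 10th percentile ~ -1.2816

def q10_empirical(flows):
    """Nonparametric estimate: order statistic nearest the 10th percentile."""
    s = sorted(flows)
    return s[max(0, round(0.10 * len(s)) - 1)]

rng = random.Random(7)
record = [math.exp(rng.gauss(2.0, 0.6)) for _ in range(40)]   # 40-yr simulated record
true_q10 = math.exp(2.0 - 1.2816 * 0.6)   # known only because we simulated

mse = {"lognormal": 0.0, "empirical": 0.0}
B = 1000
for _ in range(B):
    s = [record[rng.randrange(len(record))] for _ in record]  # bootstrap the record
    mse["lognormal"] += (q10_lognormal(s) - true_q10) ** 2 / B
    mse["empirical"] += (q10_empirical(s) - true_q10) ** 2 / B
```

Comparing the accumulated MSEs mirrors the paper's use of the bootstrap to rank estimation methods without assuming the true flow distribution.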
Advanced Tokamak Investigations in Full-Tungsten ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Bock, Alexander
2017-10-01
The tailoring of the q-profile is the foundation of Advanced Tokamak (AT) scenarios. It depends on low collisionality ν*, which permits efficient external current drive and high amounts of intrinsic bootstrap current. At constant pressure, lowering ne leads to a strong decrease of ν* ∝ Te^-3. After the conversion of ASDEX Upgrade to fully W-coated plasma facing components, radiative collapses of H-modes with little gas puffing due to central W accumulation could only be partially avoided with central ECRH. Also, operation at high β with low ne presented a challenge for the divertor. Together, these issues prevented meaningful AT investigations. To overcome this, several major feats have been accomplished: Access to lower ne was achieved through a better understanding of the changes to recycling and pumping, and optionally the density pump-out phenomenon due to RMPs. ECRH capacities were substantially expanded for both heating and current drive, and a solid W divertor capable of withstanding the power loads was installed. A major overhaul improved the reliability of the current profile diagnostics. This contribution will detail the efforts needed to re-access AT scenarios and report on the development of candidate steady state scenarios for ITER/DEMO. Starting from the 'hybrid scenario', a non-inductive scenario (q95 = 5.3, βN = 2.7, fbs > 40%) was developed. It can be sustained for many τE, limited only by technical boundaries, and is also independent of the ramp-up scenario. The β-limit is set by ideal modes that convert into NTMs. The Ti-profiles are steeper than predicted by TGLF, but nonlinear electromagnetic gyro-kinetic analyses with GENE including fast particle effects matched the experimental heat fluxes. We will also report on scenarios at higher q95, similar to the EAST/DIII-D steady state scenario. The extrapolation of these scenarios to ITER/DEMO will be discussed.
NASA Astrophysics Data System (ADS)
Chatziantonaki, Ioanna; Tsironis, Christos; Isliker, Heinz; Vlahos, Loukas
2013-11-01
The most promising technique for the control of neoclassical tearing modes in tokamak experiments is the compensation of the missing bootstrap current with an electron-cyclotron current drive (ECCD). In this frame, the dynamics of magnetic islands has been studied extensively in terms of the modified Rutherford equation (MRE), including the presence of a current drive, either analytically described or computed by numerical methods. In this article, a self-consistent model for the dynamic evolution of the magnetic island and the driven current is derived, which takes into account the island's magnetic topology and its effect on the current drive. The model combines the MRE with a ray-tracing approach to electron-cyclotron wave-propagation and absorption. Numerical results exhibit a decrease in the time required for complete stabilization with respect to the conventional computation (not taking into account the island geometry), which increases by increasing the initial island size and radial misalignment of the deposition.
Wang, Changyou; Wang, Ziyang; Zhang, Yong; Su, Rongguo
2017-05-24
The ecotoxicological effects of Ciprofloxacin hydrochloride (CIP) were tested on population densities of plankton assemblages consisting of two algae (Isochrysis galbana and Platymonas subcordiformis) and a rotifer (Brachionus plicatilis). I. galbana showed a significant decrease in densities when concentrations of CIP were above 2.0 mg L-1 in single-species tests, while P. subcordiformis and B. plicatilis were stable in densities when CIP was less than 10.0 mg L-1. The equilibrium densities of I. galbana in the community test increased with CIP concentration after falling to a trough at 5.0 mg L-1, a completely different pattern from P. subcordiformis, which decreased with CIP concentration after reaching a peak at 30.0 mg L-1. The observed beneficial effect was a result of interspecies interactions of trophic cascade that buffered the more severe direct effects of toxicants. The community test-based NOEC of CIP (2.0 mg L-1), embodying the indirect effects, was different from the extrapolated one derived by single-species tests (0.5 mg L-1), but both lacked confidence intervals. A CIP threshold concentration of obvious relevance to ecological interaction was calculated with a simplified plankton ecological model, achieving a value of 1.26 mg L-1 with a 95% bootstrapping confidence interval from 1.18 to 1.31 mg L-1.
Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.
Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt
2016-05-06
We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogeneous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62%, and the mean scale efficiency is 88%, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient, than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Empirical single sample quantification of bias and variance in Q-ball imaging.
Hainline, Allison E; Nath, Vishwesh; Parvathaneni, Prasanna; Blaber, Justin A; Schilling, Kurt G; Anderson, Adam W; Kang, Hakmook; Landman, Bennett A
2018-02-06
The bias and variance of high angular resolution diffusion imaging methods have not been thoroughly explored in the literature and may benefit from the simulation extrapolation (SIMEX) and bootstrap techniques to estimate bias and variance of high angular resolution diffusion imaging metrics. The SIMEX approach is well established in the statistics literature and uses simulation of increasingly noisy data to extrapolate back to a hypothetical case with no noise. The bias of calculated metrics can then be computed by subtracting the SIMEX estimate from the original pointwise measurement. The SIMEX technique has been studied in the context of diffusion imaging to accurately capture the bias in fractional anisotropy measurements in DTI. Herein, we extend the application of SIMEX and bootstrap approaches to characterize bias and variance in metrics obtained from a Q-ball imaging reconstruction of high angular resolution diffusion imaging data. The results demonstrate that SIMEX and bootstrap approaches provide consistent estimates of the bias and variance of generalized fractional anisotropy, respectively. The RMSE for the generalized fractional anisotropy estimates shows a 7% decrease in white matter and an 8% decrease in gray matter when compared with the observed generalized fractional anisotropy estimates. On average, the bootstrap technique results in SD estimates that are approximately 97% of the true variation in white matter, and 86% in gray matter. Both SIMEX and bootstrap methods are flexible, estimate population characteristics based on single scans, and may be extended for bias and variance estimation on a variety of high angular resolution diffusion imaging metrics. © 2018 International Society for Magnetic Resonance in Medicine.
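The SIMEX idea described in this abstract (simulate increasingly noisy data, then extrapolate back to the no-noise case) can be illustrated on a toy statistic. This sketch is not the authors' Q-ball pipeline: it estimates the variance of a latent quantity X from error-contaminated measurements W = X + U with known error variance, a setting where the naive estimate is biased upward by exactly the error variance. A linear extrapolation is exact for this statistic; applied SIMEX work usually fits a quadratic. All numbers are invented.

```python
import random
import statistics

def simex_variance(w, sigma_u, lambdas=(0.5, 1.0, 1.5, 2.0), reps=200, seed=0):
    """SIMEX sketch: estimate Var(X) from W = X + U, U ~ N(0, sigma_u^2).

    Adding extra noise of variance lam * sigma_u^2 makes the naive variance
    Var(X) + (1 + lam) * sigma_u^2, linear in lam, so a straight-line fit
    extrapolated back to lam = -1 recovers Var(X).
    """
    rng = random.Random(seed)
    lam_list, means = [], []
    for lam in lambdas:
        vals = [
            statistics.pvariance([wi + rng.gauss(0.0, lam ** 0.5 * sigma_u)
                                  for wi in w])
            for _ in range(reps)
        ]
        lam_list.append(lam)
        means.append(statistics.fmean(vals))
    ml, mv = statistics.fmean(lam_list), statistics.fmean(means)
    slope = sum((a - ml) * (b - mv) for a, b in zip(lam_list, means)) / \
        sum((a - ml) ** 2 for a in lam_list)
    return mv + slope * (-1.0 - ml)   # extrapolate to lam = -1

rng = random.Random(3)
x = [rng.gauss(0.0, 1.0) for _ in range(2000)]        # true Var(X) = 1
sigma_u = 0.7                                          # known error sd
w = [xi + rng.gauss(0.0, sigma_u) for xi in x]

naive = statistics.pvariance(w)     # biased upward by ~sigma_u^2 = 0.49
simex = simex_variance(w, sigma_u)  # bias largely removed
```

Subtracting the SIMEX estimate from the naive pointwise measurement gives the bias estimate, exactly as the abstract describes for fractional anisotropy.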
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM produced successful kriging predictions in more simulations (9/10) than GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than that of GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
Bootstrapping rapidity anomalous dimensions for transverse-momentum resummation
Li, Ye; Zhu, Hua Xing
2017-01-11
The soft function relevant for transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. Furthermore, an intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
H. T. Schreuder; M. S. Williams
2000-01-01
In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
ERIC Educational Resources Information Center
Ural, A. Engin; Yuret, Deniz; Ketrez, F. Nihan; Kocbas, Dilara; Kuntay, Aylin C.
2009-01-01
The syntactic bootstrapping mechanism of verb learning was evaluated against child-directed speech in Turkish, a language with rich morphology, nominal ellipsis and free word order. Machine-learning algorithms were run on transcribed caregiver speech directed to two Turkish learners (one hour every two weeks between 0;9 to 1;10) of different…
ERIC Educational Resources Information Center
Seco, Guillermo Vallejo; Izquierdo, Marcelino Cuesta; Garcia, M. Paula Fernandez; Diez, F. Javier Herrero
2006-01-01
The authors compare the operating characteristics of the bootstrap-F approach, a direct extension of the work of Berkovits, Hancock, and Nevitt, with Huynh's improved general approximation (IGA) and the Brown-Forsythe (BF) multivariate approach in a mixed repeated measures design when normality and multisample sphericity assumptions do not hold.…
Sample-based estimation of tree species richness in a wet tropical forest compartment
Steen Magnussen; Raphael Pelissier
2007-01-01
Petersen's capture-recapture ratio estimator and the well-known bootstrap estimator are compared across a range of simulated low-intensity simple random sampling with fixed-area plots of 100 m? in a rich wet tropical forest compartment with 93 tree species in the Western Ghats of India. Petersen's ratio estimator was uniformly superior to the bootstrap...
Common Ground between Form and Content: The Pragmatic Solution to the Bootstrapping Problem
ERIC Educational Resources Information Center
Oller, John W.
2005-01-01
The frame of reference for this article is second or foreign language (L2 or FL) acquisition, but the pragmatic bootstrapping hypothesis applies to language processing and acquisition in any context or modality. It is relevant to teaching children to read. It shows how connections between target language surface forms and their content can be made…
2006-06-13
with arithmetic mean (UPGMA) using random tie breaking and uncorrected pairwise distances in MacVector 7.0 (Oxford Molecular). Numbers on branches...denote the UPGMA bootstrap percentage using a highly stringent number (1000) of replications (Felsenstein, 1985). All bootstrap values are 50%, as shown
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, that study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought, and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion for the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentile method, we found wide variability in the prevalence of the different phenotypes (3-100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35-3.10 Mb]). Compared with the 2012 review, our work, which extended the literature search by 45 months, found differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
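The bootstrap percentile method used for the phenotype prevalences can be sketched directly. The case counts below are hypothetical (the paper's per-phenotype counts are not reproduced here); the code resamples a 0/1 phenotype indicator with replacement and reads the interval off the sorted bootstrap distribution.

```python
import random

def prevalence_ci(cases, n, B=5000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a prevalence."""
    rng = random.Random(seed)
    flags = [1] * cases + [0] * (n - cases)          # 0/1 phenotype indicator
    stats = sorted(
        sum(flags[rng.randrange(n)] for _ in range(n)) / n for _ in range(B)
    )
    return stats[int(B * alpha / 2)], stats[int(B * (1 - alpha / 2)) - 1]

# hypothetical counts: 14 of 36 reported patients show a given phenotype
lo, hi = prevalence_ci(14, 36)
```

With only 36 cases the interval is wide, which is consistent with the 3-100% spread of prevalences the study reports.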
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status using multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
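The bootstrap-imputation idea can be illustrated with a small sketch. This is not the study's analysis: the model-derived probabilities are simulated from an arbitrary Beta distribution, and the point is only to show why repeatedly drawing status_i ~ Bernoulli(p_i) preserves the prevalence implied by the probabilities, while hard thresholding at 0.5 does not.

```python
import random
import statistics

def imputed_prevalence(probs, B=500, seed=0):
    """Bootstrap-impute binary disease status from model probabilities B times.

    Each replicate draws status_i ~ Bernoulli(p_i); the mean over replicates
    estimates prevalence, and their spread reflects imputation uncertainty.
    """
    rng = random.Random(seed)
    reps = []
    for _ in range(B):
        n_pos = sum(rng.random() < p for p in probs)
        reps.append(n_pos / len(probs))
    return statistics.fmean(reps), statistics.stdev(reps)

rng = random.Random(1)
# hypothetical model-derived probabilities of severe renal failure
probs = [min(0.95, max(0.01, rng.betavariate(1, 12))) for _ in range(5000)]
prev, sd = imputed_prevalence(probs)

# hard categorization at 0.5 discards most low-but-nonzero probabilities
thresh = sum(p >= 0.5 for p in probs) / len(probs)
```

The imputed prevalence tracks the mean of the probabilities, whereas the thresholded estimate collapses toward zero for a rare condition, mirroring the bias the study found for categorized probabilities.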
Reference interval computation: which method (not) to choose?
Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C
2012-07-11
When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If there are no reliable RI data available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, the results of all 3 methods were within 3% of the true reference value. For the other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using untransformed parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way to calculate RIs, provided the data satisfy a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
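The bootstrapping fallback recommended here can be sketched in a few lines: the 95% reference interval is the central 95% of the analyte distribution (2.5th to 97.5th percentiles), and each limit is stabilized by averaging over bootstrap resamples. The simulated Gaussian analyte and sample size of 120 are illustrative assumptions, not the paper's data.

```python
import random

def quantile(sorted_vals, q):
    # nearest-rank quantile on a pre-sorted list
    idx = min(len(sorted_vals) - 1, max(0, round(q * (len(sorted_vals) - 1))))
    return sorted_vals[idx]

def bootstrap_ri(values, B=2000, seed=0):
    """Nonparametric 95% reference interval (2.5th-97.5th percentiles),
    with each limit averaged over bootstrap resamples."""
    rng = random.Random(seed)
    n = len(values)
    lows, highs = [], []
    for _ in range(B):
        s = sorted(values[rng.randrange(n)] for _ in range(n))
        lows.append(quantile(s, 0.025))
        highs.append(quantile(s, 0.975))
    return sum(lows) / B, sum(highs) / B

rng = random.Random(5)
marker = [rng.gauss(100.0, 10.0) for _ in range(120)]   # simulated analyte, n = 120
ri_lo, ri_hi = bootstrap_ri(marker)
# for N(100, 10) the true 95% interval is roughly (80.4, 119.6)
```

Unlike the parametric approaches, nothing here can fail for non-normal data, which is why the paper keeps bootstrapping as the always-available option.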
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution
Imai, Mutsumi; Kita, Sotaro
2014-01-01
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
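The tree bootstrap described in this abstract resamples the recruitment trees themselves rather than the respondents. A minimal sketch on an invented toy tree: at each node, that node's recruits are redrawn with replacement and the recursion descends into every drawn recruit (so a subtree can appear more than once in a resample), and the bootstrap distribution of an attribute mean is accumulated over resampled trees.

```python
import random

def tree_bootstrap(children, node, rng):
    """Resample a recruitment tree: redraw each node's recruits with
    replacement, then recurse into every drawn recruit (repeats allowed).
    Returns the list of nodes in the resampled tree, with multiplicity."""
    nodes = [node]
    kids = children.get(node, [])
    for k in [rng.choice(kids) for _ in kids]:
        nodes.extend(tree_bootstrap(children, k, rng))
    return nodes

# toy recruitment tree: seed "s" recruited a and b; a recruited c, d; b recruited e
children = {"s": ["a", "b"], "a": ["c", "d"], "b": ["e"]}
attr = {"s": 1, "a": 0, "b": 1, "c": 0, "d": 1, "e": 0}   # attribute of interest

rng = random.Random(0)
means = []
for _ in range(2000):
    sample = tree_bootstrap(children, "s", rng)
    means.append(sum(attr[v] for v in sample) / len(sample))
```

Because the resampling depends only on tree structure, the same resampled trees can be reused to estimate variability for any attribute, or correlations between attributes, as the abstract notes.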
A bootstrap lunar base: Preliminary design review 2
NASA Technical Reports Server (NTRS)
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Spheres, charges, instantons, and bootstrap: A five-dimensional odyssey
NASA Astrophysics Data System (ADS)
Chang, Chi-Ming; Fluder, Martin; Lin, Ying-Hsuan; Wang, Yifan
2018-03-01
We combine supersymmetric localization and the conformal bootstrap to study five-dimensional superconformal field theories. To begin, we classify the admissible counter-terms and derive a general relation between the five-sphere partition function and the conformal and flavor central charges. Along the way, we discover a new superconformal anomaly in five dimensions. We then propose a precise triple factorization formula for the five-sphere partition function, that incorporates instantons and is consistent with flavor symmetry enhancement. We numerically evaluate the central charges for the rank-one Seiberg and Morrison-Seiberg theories, and find strong evidence for their saturation of bootstrap bounds, thereby determining the spectra of long multiplets in these theories. Lastly, our results provide new evidence for the F-theorem and possibly a C-theorem in five-dimensional superconformal theories.
Integrated modelling of steady-state scenarios and heating and current drive mixes for ITER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murakami, Masanori; Park, Jin Myung; Giruzzi, G.
2011-01-01
Recent progress on ITER steady-state (SS) scenario modelling by the ITPA-IOS group is reviewed. Code-to-code benchmarks as the IOS group's common activities for the two SS scenarios (weak shear scenario and internal transport barrier scenario) are discussed in terms of transport, kinetic profiles, and heating and current drive (CD) sources using various transport codes. Weak magnetic shear scenarios integrate the plasma core and edge by combining a theory-based transport model (GLF23) with scaled experimental boundary profiles. The edge profiles (at normalized radius rho = 0.8-1.0) are adopted from an edge-localized-mode-averaged analysis of a DIII-D ITER demonstration discharge. A fully noninductive SS scenario is achieved with fusion gain Q = 4.3, noninductive fraction f(NI) = 100%, bootstrap current fraction f(BS) = 63% and normalized beta beta(N) = 2.7 at plasma current I(p) = 8 MA and toroidal field B(T) = 5.3 T using ITER day-1 heating and CD capability. Substantial uncertainties come from outside the radius at which the boundary conditions are set (rho = 0.8). The present simulation assumed that beta(N)(rho) at the top of the pedestal (rho = 0.91) is about 25% above the peeling-ballooning threshold. Achieving this boundary will be a challenge for ITER, considering its different operating conditions (T(e)/T(i) approximately 1 and density peaking). Overall, the experimentally scaled edge is on the optimistic side of the prediction. A number of SS scenarios with different heating and CD mixes in a wide range of conditions were explored by exploiting the weak-shear steady-state solution procedure with the GLF23 transport model and the scaled experimental edge. The results are also presented in the operation space for DT neutron power versus stationary burn pulse duration with assumed poloidal flux availability at the beginning of stationary burn, indicating that the long-pulse operation goal (3000 s) at I(p) = 9 MA is possible.
Source calculations in these simulations have been revised for electron cyclotron current drive, including parallel momentum conservation effects, and for neutral beam current drive with finite orbit and magnetic pitch effects.
ERIC Educational Resources Information Center
Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio
2009-01-01
A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
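The procedure described above can be sketched with the standard percentile method: resample (x, y) pairs with replacement, recompute Pearson's r on each resample, and read the interval off the sorted replicates. A stdlib-only sketch, not the study's code; function names are illustrative.

```python
import random
import statistics

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def bootstrap_ci_r(x, y, reps=2000, alpha=0.05):
    """Percentile bootstrap CI for r: resample pairs with replacement."""
    n = len(x)
    rs = []
    while len(rs) < reps:
        idx = [random.randrange(n) for _ in range(n)]
        xs, ys = [x[i] for i in idx], [y[i] for i in idx]
        if len(set(xs)) < 2 or len(set(ys)) < 2:
            continue  # skip degenerate resamples with zero variance
        rs.append(pearson_r(xs, ys))
    rs.sort()
    return rs[int(reps * alpha / 2)], rs[int(reps * (1 - alpha / 2)) - 1]
```

The null hypothesis of no association is rejected at level alpha when the interval excludes zero.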
Bootstrapping a five-loop amplitude using Steinmann relations
Caron-Huot, Simon; Dixon, Lance J.; McLeod, Andrew; ...
2016-12-05
Here, the analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons
2001-07-01
… parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. … experimental methods, used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric … triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for …
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
ERIC Educational Resources Information Center
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
ERIC Educational Resources Information Center
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
ERIC Educational Resources Information Center
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data, without the need to assume a noise model. These methods have previously been combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise-induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph-based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is incorporated in the present work into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is then assigned to each edge. Two path-finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest-path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph-based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold.
By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
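The shortest-path variant above can be illustrated with a standard trick: if each edge carries a traversal probability (in BootGraph these would come from the wild bootstrap), scoring edges by -log(probability) makes the most probable path a shortest path, since probabilities multiply along a path. A sketch under an assumed adjacency structure, not the published implementation.

```python
import heapq
import math

def most_probable_path(edges, start, goal):
    """Dijkstra on -log(p) edge costs. `edges` maps each node to a list
    of (neighbour, probability) pairs. Returns (path, path probability)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, p in edges.get(u, []):
            nd = d - math.log(p)  # multiply probabilities = add -log(p)
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if goal not in dist:
        return None, 0.0
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[goal])
```

The second BootGraph algorithm, which aggregates over all paths, would require a different (flow- or sum-product-style) computation and is not shown.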
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists.
Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
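The bootstrapping branch of such a comparison can be sketched for the Clayton copula named above using two closed-form facts: the moment relation θ = 2τ/(1 − τ) between the Clayton parameter and Kendall's τ, and conditional-inversion sampling of Clayton pairs. This is an illustrative stdlib-only sketch assuming positive dependence (τ > 0), not the study's code.

```python
import random

def kendall_tau(u, v):
    """Sample Kendall's tau from concordant/discordant pair counts."""
    n, c = len(u), 0
    for i in range(n):
        for j in range(i + 1, n):
            d = (u[i] - u[j]) * (v[i] - v[j])
            c += 1 if d > 0 else (-1 if d < 0 else 0)
    return 2.0 * c / (n * (n - 1))

def fit_clayton(u, v):
    """Moment estimate: invert tau = theta / (theta + 2)."""
    tau = kendall_tau(u, v)
    return 2.0 * tau / (1.0 - tau)

def sample_clayton(theta, n):
    """Conditional-inversion sampling of Clayton copula pairs (theta > 0)."""
    pairs = []
    for _ in range(n):
        u, w = 1.0 - random.random(), 1.0 - random.random()  # in (0, 1]
        v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def clayton_bootstrap_ci(u, v, reps=200, alpha=0.05):
    """Parametric bootstrap: refit on samples drawn at the fitted theta."""
    theta_hat, n, ests = fit_clayton(u, v), len(u), []
    for _ in range(reps):
        sim = sample_clayton(theta_hat, n)
        ests.append(fit_clayton([p[0] for p in sim], [p[1] for p in sim]))
    ests.sort()
    return theta_hat, (ests[int(reps * alpha / 2)], ests[int(reps * (1 - alpha / 2)) - 1])
```

The MCMC branch of the comparison, and the marginal distributions, are omitted here; a full study would fit margins and copula jointly.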
NASA Astrophysics Data System (ADS)
Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia
2016-10-01
Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin i, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, the Lucy estimation lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin i data directly without the need for any convergence criteria.
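The core numerical step above, solving a discretized Fredholm integral of the first kind with a Tikhonov penalty, reduces to a damped least-squares problem. A minimal sketch with an assumed Gaussian kernel for illustration (the paper's kernel is the rotational projection kernel, not this one):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Minimize ||A x - b||^2 + lam * ||x||^2 via the normal equations
    (A^T A + lam * I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Discretize a Fredholm integral of the first kind, b = A x, on a grid.
t = np.linspace(0.0, 1.0, 50)
dt = t[1] - t[0]
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.1 ** 2)) * dt  # assumed kernel
x_true = np.sin(2 * np.pi * t)
b = A @ x_true

x_reg = tikhonov_solve(A, b, 1e-6)     # small penalty: fits the data closely
x_damped = tikhonov_solve(A, b, 1e6)   # large penalty: shrinks toward zero
```

The regularization parameter trades data fit against solution size; the paper proposes its own rule for choosing it.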
Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.
An integrated-modeling workflow has been developed in this paper for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95.
On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. In conclusion, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.
Petrovskaya, Natalia B.; Forbes, Emily; Petrovskii, Sergei V.; Walters, Keith F. A.
2018-01-01
Studies addressing many ecological problems require accurate evaluation of the total population size. In this paper, we revisit a sampling procedure used for the evaluation of the abundance of an invertebrate population from assessment data collected on a spatial grid of sampling locations. We first discuss how insufficient information about the spatial population density obtained on a coarse sampling grid may affect the accuracy of an evaluation of total population size. Such information deficit in field data can arise because of inadequate spatial resolution of the population distribution (spatially variable population density) when coarse grids are used, which is especially true when a strongly heterogeneous spatial population density is sampled. We then argue that the average trap count (the quantity routinely used to quantify abundance), if obtained from a sampling grid that is too coarse, is a random variable because of the uncertainty in sampling spatial data. Finally, we show that a probabilistic approach similar to bootstrapping techniques can be an efficient tool to quantify the uncertainty in the evaluation procedure in the presence of a spatial pattern reflecting a patchy distribution of invertebrates within the sampling grid. PMID:29495513
NASA Astrophysics Data System (ADS)
Suttrop, W.; Kirk, A.; Nazikian, R.; Leuthold, N.; Strumberger, E.; Willensdorfer, M.; Cavedon, M.; Dunne, M.; Fischer, R.; Fietz, S.; Fuchs, J. C.; Liu, Y. Q.; McDermott, R. M.; Orain, F.; Ryan, D. A.; Viezzer, E.; The ASDEX Upgrade Team; The DIII-D Team; The Eurofusion MST1 Team
2017-01-01
The interaction of externally applied small non-axisymmetric magnetic perturbations (MP) with tokamak high-confinement mode (H-mode) plasmas is reviewed and illustrated by recent experiments in ASDEX Upgrade. The plasma response to the vacuum MP field is amplified by stable ideal kink modes with low toroidal mode number n driven by the H-mode edge pressure gradient (and associated bootstrap current), which is experimentally evidenced by an observable shift of the poloidal mode number m away from field alignment (m = qn, with q being the safety factor) at the response maximum. A torque scan experiment demonstrates the importance of the perpendicular electron flow for shielding of the resonant magnetic perturbation, as expected from a two-fluid MHD picture. Two significant effects of MP occur in H-mode plasmas at low pedestal collisionality, ν*_ped ≤ 0.4: (a) a reduction of the global plasma density by up to 61% and (b) a reduction of the energy loss associated with edge localised modes (ELMs) by a factor of up to 9. A comprehensive database of ELM mitigation pulses at low ν* in ASDEX Upgrade shows that the degree of ELM mitigation correlates with the reduction of pedestal pressure, which in turn is limited and defined by the onset of ELMs, i.e. a modification of the ELM stability limit by the magnetic perturbation.
Hyperspectral techniques in analysis of oral dosage forms.
Hamilton, Sara J; Lowell, Amanda E; Lodder, Robert A
2002-10-01
Pharmaceutical oral dosage forms are used in this paper to test the sensitivity and spatial resolution of hyperspectral imaging instruments. The first experiment tested the hypothesis that a near-infrared (IR) tunable diode-based remote sensing system is capable of monitoring degradation of hard gelatin capsules at a relatively long distance (0.5 km). Spectra from the capsules were used to differentiate among capsules exposed to an atmosphere containing 150 ppb formaldehyde for 0, 2, 4, and 8 h. Robust median-based principal component regression with Bayesian inference was employed for outlier detection. The second experiment tested the hypothesis that near-IR imaging spectrometry of tablets permits the identity and composition of multiple individual tablets to be determined simultaneously. A near-IR camera was used to collect thousands of spectra simultaneously from a field of blister-packaged tablets. The number of tablets that a typical near-IR camera can currently analyze simultaneously was estimated to be approximately 1300. The bootstrap error-adjusted single-sample technique chemometric-imaging algorithm was used to draw probability-density contour plots that revealed tablet composition. The single-capsule analysis provides an indication of how far apart the sample and instrumentation can be while still maintaining an adequate signal-to-noise ratio (S/N), while the multiple-tablet imaging experiment gives an indication of how many samples can be analyzed simultaneously while maintaining an adequate S/N and pixel coverage on each sample.
Numerical Analysis of the Effects of Normalized Plasma Pressure on RMP ELM Suppression in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orlov, D. M.; Moyer, R.A.; Evans, T. E.
2010-01-01
The effect of normalized plasma pressure as characterized by the normalized pressure parameter beta(N) on the suppression of edge localized modes (ELMs) using resonant magnetic perturbations (RMPs) is studied in low-collisionality (nu* <= 0.2) H-mode plasmas with low triangularity (δ = 0.25) and ITER-similar shapes (δ = 0.51). Experimental results have suggested that ELM suppression by RMPs requires a minimum threshold in plasma pressure as characterized by beta(N). The variations in the vacuum field topology with beta(N) due to safety factor profile and island overlap changes caused by variation of the Shafranov shift and pedestal bootstrap current are examined numerically with the field line integration code TRIP3D. The results show very small differences in the vacuum field structure in terms of the Chirikov (magnetic island overlap) parameter, Poincaré sections and field line loss fractions. These differences do not appear to explain the observed threshold in beta(N) for ELM suppression. Linear peeling-ballooning stability analysis with the ELITE code suggests that the ELMs which persist during the RMPs when beta(N) is below the observed threshold are not type I ELMs, because the pedestal conditions are deep within the stable regime for peeling-ballooning modes. These ELMs have similarities to type III ELMs or low-density ELMs.
Reliability of dose volume constraint inference from clinical data.
Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M
2017-04-21
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort sizes 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi
2015-07-01
Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE), which is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.
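The jackknife used for comparison above can be stated compactly as a generic leave-one-out variance estimator. This is a sketch of the standard jackknife, not the paper's closed-form covariance formula:

```python
import statistics

def jackknife_variance(data, estimator):
    """Leave-one-out jackknife estimate of the variance of `estimator`
    evaluated on `data` (a list): var = (n-1)/n * sum((t_i - t_bar)^2)."""
    n = len(data)
    loo = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = statistics.fmean(loo)
    return (n - 1) / n * sum((t - mean_loo) ** 2 for t in loo)
```

For the sample mean this reproduces the classical s²/n exactly, which makes a useful sanity check before applying it to a more complex estimator such as the NPMLE.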
Comulada, W. Scott
2015-01-01
Stata’s mi commands provide powerful tools to conduct multiple imputation in the presence of ignorable missing data. In this article, I present Stata code to extend the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, mi commands are restricted to covariate selection. I show how to address model fit to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors. I show how standard errors can be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts for the number of times that alcohol was consumed in data with missing observations from a behavioral intervention. PMID:26973439
Heptagons from the Steinmann cluster bootstrap
Dixon, Lance J.; Drummond, James; Harrington, Thomas; ...
2017-02-28
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar $\mathcal{N} = 4$ supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal $\bar{Q}$ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17) aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17) aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
Imaging with New Classic and Vision at the NPOI
NASA Astrophysics Data System (ADS)
Jorgensen, Anders
2018-04-01
The Navy Precision Optical Interferometer (NPOI) is unique among interferometric observatories for its ability to position telescopes in an equally-spaced array configuration. This configuration is optimal for interferometric imaging because it allows the use of bootstrapping to track fringes on long baselines with signal-to-noise ratio less than one. When combined with coherent integration techniques this can produce visibilities with acceptable SNR on baselines long enough to resolve features on the surfaces of stars. The stellar surface imaging project at NPOI combines the bootstrapping array configuration of the NPOI array, real-time fringe tracking, baseline- and wavelength bootstrapping with Earth rotation to provide dense coverage in the UV plane at a wide range of spatial frequencies. In this presentation, we provide an overview of the project and an update of the latest status and results from the project.
NASA Astrophysics Data System (ADS)
Campbell, Gregory S.; Thomas, Len; Whitaker, Katherine; Douglas, Annie B.; Calambokidis, John; Hildebrand, John A.
2015-02-01
Trends in cetacean density and distribution off southern California were assessed through visual line-transect surveys during thirty-seven California Cooperative Oceanic Fisheries Investigations (CalCOFI) cruises from July 2004-November 2013. From sightings of the six most commonly encountered cetacean species, seasonal, annual and overall density estimates were calculated. Blue whales (Balaenoptera musculus), fin whales (Balaenoptera physalus) and humpback whales (Megaptera novaeangliae) were the most frequently sighted baleen whales with overall densities of 0.91/1000 km2 (CV=0.27), 2.73/1000 km2 (CV=0.19), and 1.17/1000 km2 (CV=0.21) respectively. Species specific density estimates, stratified by cruise, were analyzed using a generalized additive model to estimate long-term trends and correct for seasonal imbalances. Variances were estimated using a non-parametric bootstrap with one day of effort as the sampling unit. Blue whales were primarily observed during summer and fall while fin and humpback whales were observed year-round with peaks in density during summer and spring respectively. Short-beaked common dolphins (Delphinus delphis), Pacific white-sided dolphins (Lagenorhynchus obliquidens) and Dall's porpoise (Phocoenoides dalli) were the most frequently encountered small cetaceans with overall densities of 705.83/1000 km2 (CV=0.22), 51.98/1000 km2 (CV=0.27), and 21.37/1000 km2 (CV=0.19) respectively. Seasonally, short-beaked common dolphins were most abundant in winter whereas Pacific white-sided dolphins and Dall's porpoise were most abundant during spring. There were no significant long-term changes in blue whale, fin whale, humpback whale, short-beaked common dolphin or Dall's porpoise densities while Pacific white-sided dolphins exhibited a significant decrease in density across the ten-year study. The results from this study were fundamentally consistent with earlier studies, but provide greater temporal and seasonal resolution.
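The variance method used above, a non-parametric bootstrap that resamples whole survey days as the sampling unit, can be sketched as follows (an illustrative encounter-rate version with an assumed per-day data layout of (sightings, km surveyed); not the study's code):

```python
import random
import statistics

def cluster_bootstrap_cv(days, n_boot=1000, seed=0):
    """CV of an encounter-rate estimate, resampling whole survey days (the
    sampling unit) with replacement so within-day correlation is preserved."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        sample = [rng.choice(days) for _ in days]       # resample days, not sightings
        total_sightings = sum(n for n, _ in sample)
        total_km = sum(km for _, km in sample)
        estimates.append(total_sightings / total_km)    # replicate encounter rate
    return statistics.stdev(estimates) / statistics.mean(estimates)
```

Resampling days rather than individual sightings is what makes the CV honest when sightings within a day are not independent.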
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Sample Reuse in Statistical Remodeling.
1987-08-01
as the jackknife and bootstrap, is an expansion of the functional, T(Fn), or of its distribution function or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report Frangos and Schucany demonstrated the small sample superiority of that approach over the proposals that take...higher order terms of an Edgeworth expansion into account. In a second report Frangos and Schucany (1987b) examined the small sample performance of
ERIC Educational Resources Information Center
Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao
2016-01-01
We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Guerrero, Erick G; Fenwick, Karissa; Kong, Yinfei
2017-11-14
Leadership style and specific organizational climates have emerged as critical mechanisms to implement targeted practices in organizations. Drawing from relevant theories, we propose that climate for implementation of cultural competence reflects how transformational leadership may enhance the organizational implementation of culturally responsive practices in health care organizations. Using multilevel data from 427 employees embedded in 112 addiction treatment programs collected in 2013, confirmatory factor analysis showed adequate fit statistics for our measure of climate for implementation of cultural competence (Cronbach's alpha = .88) and three outcomes: knowledge (Cronbach's alpha = .88), services (Cronbach's alpha = .86), and personnel (Cronbach's alpha = .86) practices. Results from multilevel path analyses indicate a positive relationship between employee perceptions of transformational leadership and climate for implementation of cultural competence (standardized indirect effect = .057, bootstrap p < .001). We also found a positive indirect effect between transformational leadership and each of the culturally competent practices: knowledge (standardized indirect effect = .006, bootstrap p = .004), services (standardized indirect effect = .019, bootstrap p < .001), and personnel (standardized indirect effect = .014, bootstrap p = .005). Findings contribute to implementation science. They build on leadership theory and offer evidence of the mediating role of climate in the implementation of cultural competence in addiction health service organizations.
Nixon, Richard M; Wonderling, David; Grieve, Richard D
2010-03-01
Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
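The two standard-error estimators compared above can be illustrated on a single mean, a simplified stand-in for the incremental net benefit (hypothetical function names; a trial analysis would apply this to the between-arm INB):

```python
import math
import random
import statistics

def clt_se(x):
    """Standard error of the mean via the central limit theorem."""
    return statistics.stdev(x) / math.sqrt(len(x))

def bootstrap_se(x, n_boot=2000, seed=0):
    """Standard error of the mean as the spread of bootstrap replicate means."""
    rng = random.Random(seed)
    means = [statistics.mean([rng.choice(x) for _ in x]) for _ in range(n_boot)]
    return statistics.stdev(means)
```

For moderate samples the two agree closely, which is the paper's point; the CLT version is a one-liner, while the bootstrap requires thousands of resamples.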
Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan
2016-01-01
Background: The sensory belief that ‘light/low tar’ cigarettes are smoother can also influence the belief that ‘light/low tar’ cigarettes are less harmful. However, the ‘light’ concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one’s own brand of cigarettes on perceptions of harm. Objective: The current study examines whether a smoker’s sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a ‘light/low tar’ cigarette and relative perceptions of harm among smokers in China. Methods: Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Results: Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of ‘light/low tar’ cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias-corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. Conclusions: These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker’s natural associations between smoother sensations and lowered perceptions of harm. PMID:25370698
Elton-Marshall, Tara; Fong, Geoffrey T; Yong, Hua-Hie; Borland, Ron; Xu, Steve Shaowei; Quah, Anne C K; Feng, Guoze; Jiang, Yuan
2015-11-01
The sensory belief that 'light/low tar' cigarettes are smoother can also influence the belief that 'light/low tar' cigarettes are less harmful. However, the 'light' concept is one of several factors influencing beliefs. No studies have examined the impact of the sensory belief about one's own brand of cigarettes on perceptions of harm. The current study examines whether a smoker's sensory belief that their brand is smoother is associated with the belief that their brand is less harmful and whether sensory beliefs mediate the relation between smoking a 'light/low tar' cigarette and relative perceptions of harm among smokers in China. Data are from 5209 smokers who were recruited using a stratified multistage sampling design and participated in Wave 3 of the International Tobacco Control (ITC) China Survey, a face-to-face survey of adult smokers and non-smokers in seven cities. Smokers who agreed that their brand of cigarettes was smoother were significantly more likely to say that their brand of cigarettes was less harmful (p<0.001, OR=6.86, 95% CI 5.64 to 8.33). Mediational analyses using the bootstrapping procedure indicated that both the direct effect of 'light/low tar' cigarette smokers on the belief that their cigarettes are less harmful (b=0.24, bootstrapped bias corrected 95% CI 0.13 to 0.34, p<0.001) and the indirect effect via their belief that their cigarettes are smoother were significant (b=0.32, bootstrapped bias-corrected 95% CI 0.28 to 0.37, p<0.001), suggesting that the mediation was partial. These results demonstrate the importance of implementing tobacco control policies that address the impact that cigarette design and marketing can have in capitalising on the smoker's natural associations between smoother sensations and lowered perceptions of harm. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
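The bootstrapped mediation analysis used above can be sketched with a percentile interval for the indirect effect a·b (a simple-regression version of the a and b paths; the study's actual models adjust the b path for the exposure and covariates, and bias correction is omitted here):

```python
import random

def slope(x, y):
    """OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def indirect_effect_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b, where
    a = slope of M on X and b = slope of Y on M (simplified paths)."""
    rng = random.Random(seed)
    n = len(x)
    effs = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]        # resample respondents
        xb = [x[i] for i in s]
        mb = [m[i] for i in s]
        yb = [y[i] for i in s]
        effs.append(slope(xb, mb) * slope(mb, yb))      # replicate a*b
    effs.sort()
    return effs[int(alpha / 2 * n_boot)], effs[int((1 - alpha / 2) * n_boot) - 1]
```

An indirect effect is supported when the bootstrap interval excludes zero, which is how the significance statements above should be read.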
Maternal depression and trait anger as risk factors for escalated physical discipline.
Shay, Nicole L; Knutson, John F
2008-02-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory-Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline.
Some Aspects of Advanced Tokamak Modeling in DIII-D
NASA Astrophysics Data System (ADS)
St John, H. E.; Petty, C. C.; Murakami, M.; Kinsey, J. E.
2000-10-01
We extend previous work(M. Murakami, et al., General Atomics Report GA-A23310 (1999).) done on time dependent DIII-D advanced tokamak simulations by introducing theoretical confinement models rather than relying on power balance derived transport coefficients. We explore using NBCD and off axis ECCD together with a self-consistent aligned bootstrap current, driven by the internal transport barrier dynamics generated with the GLF23 confinement model, to shape the hollow current profile and to maintain MHD stable conditions. Our theoretical modeling approach uses measured DIII-D initial conditions to start off the simulations in a smooth, consistent manner. This mitigates the troublesome long lived perturbations in the ohmic current profile that are normally caused by inconsistent initial data. To achieve this goal our simulation uses a sequence of time dependent eqdsks generated autonomously by the EFIT MHD equilibrium code in analyzing experimental data to supply the history for the simulation.
BOOTSTRAPPING THE CORONAL MAGNETIC FIELD WITH STEREO: UNIPOLAR POTENTIAL FIELD MODELING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aschwanden, Markus J.; Sandman, Anne W., E-mail: aschwanden@lmsal.co
We investigate the recently quantified misalignment of α_mis ≈ 20°-40° between the three-dimensional geometry of stereoscopically triangulated coronal loops observed with STEREO/EUVI (in four active regions (ARs)) and theoretical (potential or nonlinear force-free) magnetic field models extrapolated from photospheric magnetograms. We develop an efficient method of bootstrapping the coronal magnetic field by forward fitting a parameterized potential field model to the STEREO-observed loops. The potential field model consists of a number of unipolar magnetic charges that are parameterized by decomposing a photospheric magnetogram from the Michelson Doppler Imager. The forward-fitting method yields a best-fit magnetic field model with a reduced misalignment of α_PF ≈ 13°-20°. We also evaluate stereoscopic measurement errors and find a contribution of α_SE ≈ 7°-12°, which constrains the residual misalignment to α_NP ≈ 11°-17°, which is likely due to the nonpotentiality of the ARs. The residual misalignment angle, α_NP, of the potential field due to nonpotentiality is found to correlate with the soft X-ray flux of the AR, which implies a relationship between electric currents and plasma heating.
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-11-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
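For the linear single-instrument case, where the TSRI estimate coincides with two-stage least squares and the Wald ratio, the bootstrap standard error considered above can be sketched as follows (illustrative names; not the authors' code, and real analyses would use multiple instruments and covariates):

```python
import random
import statistics

def slope(x, y):
    """OLS slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

def iv_estimate(z, x, y):
    """Wald ratio: reduced-form slope over first-stage slope."""
    return slope(z, y) / slope(z, x)

def iv_bootstrap_se(z, x, y, n_boot=1000, seed=0):
    """Standard error of the IV estimate from resampling individuals."""
    rng = random.Random(seed)
    n = len(z)
    estimates = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]
        estimates.append(iv_estimate([z[i] for i in s],
                                     [x[i] for i in s],
                                     [y[i] for i in s]))
    return statistics.stdev(estimates)
```

Because it resamples the whole two-stage procedure, this standard error reflects first-stage uncertainty that an unadjusted second-stage standard error ignores, which is the paper's motivation for modified standard errors.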
Qian, Jinping P.; Garofalo, Andrea M.; Gong, Xianzu Z.; ...
2017-03-20
Recent EAST/DIII-D joint experiments on the high poloidal beta ($\beta_{\text{P}}$) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H_98y2 ~ 1.6) to higher plasma current, for lower q_95 ≤ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results. Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high $\beta_{\text{P}}$ discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at $\beta_{\text{N}}$ ~ 2.9 and q_95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Furthermore, results reported in this paper suggest that the DIII-D high $\beta_{\text{P}}$ scenario could be a candidate for ITER steady state operation.
Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter
2011-04-13
The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
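The recommended Welch procedure, the t statistic with the Welch-Satterthwaite degrees-of-freedom adjustment for unequal variances, is short to state (a sketch; a p-value would come from the t distribution with `df` degrees of freedom):

```python
import math
import statistics

def welch_t(x, y):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with possibly unequal variances."""
    nx, ny = len(x), len(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    se2 = vx / nx + vy / ny                      # squared standard error
    t = (statistics.mean(x) - statistics.mean(y)) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df
```

Unlike the ordinary T test, the degrees of freedom shrink when the two variances differ, which is what protects the type I error rate in the simulations above.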
Kemmler, Wolfgang; von Stengel, Simon; Kohl, Matthias
2016-08-01
Due to older people's low sports participation rates, exercise frequency may be the most critical component for designing exercise protocols that address bone. The aims of the present article were to determine the independent effect of exercise frequency (ExFreq) and its corresponding changes on bone mineral density (BMD) and to identify the minimum effective dose that just relevantly affects bone. Based on the 16-year follow-up of the intense, consistently supervised Erlangen Fitness and Osteoporosis Prevention-Study, ExFreq was retrospectively determined in the exercise group of 55 initially early-postmenopausal females with osteopenia. Linear mixed-effect regression analysis was conducted to determine the independent effect of ExFreq on BMD changes at the lumbar spine and total hip. The minimum effective dose of ExFreq was based on BMD changes less than the 90% quantile of the sedentary control group (n=43). Cut-offs were determined after 4, 8, 12 and 16 years using bootstrap with 5000 replications. After 16 years, average ExFreq ranged between 1.02 and 2.96 sessions/week (2.28±0.40 sessions/week). ExFreq has an independent effect on LS-BMD (p<.001) and hip-BMD (p=.005) changes. Bootstrap analysis detected a minimum effective dose at about 2 sessions/week over 16 years (cut-off LS-BMD: 2.11, 95% CI: 2.06-2.12; total hip-BMD: 2.22, 95% CI: 2.00-2.78 sessions/week over 16 years). In summary, the minimum effective dose of exercise frequency that relevantly addresses BMD is quite high, at least compared with the low sport participation rate of older adults. This result might not be generalizable across all exercise types, protocols and cohorts, but it does indicate at least that even when applying high impact/high intensity programs, exercise frequency and its maintenance play a key role in bone adaptation. Copyright © 2016 Elsevier Inc. All rights reserved.
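The bootstrap cut-off procedure above, resampling the data and taking an empirical quantile in each replicate, has this core (an illustrative sketch, not the study's code; the study additionally derived confidence intervals from the replicate distribution):

```python
import random

def bootstrap_quantile(data, q, n_boot=5000, seed=0):
    """Mean of the empirical q-quantile across bootstrap replicates."""
    rng = random.Random(seed)
    cuts = []
    for _ in range(n_boot):
        resample = sorted(rng.choice(data) for _ in data)   # one replicate
        cuts.append(resample[int(q * (len(resample) - 1))]) # its q-quantile
    return sum(cuts) / len(cuts)
```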
Integration of uncooled scraper elements and its diagnostics into Wendelstein 7-X
Fellinger, Joris; Loesser, Doug; Neilson, Hutch; ...
2017-08-08
The modular stellarator Wendelstein 7-X in Greifswald (Germany) successfully started operation in 2015 with short pulse limiter plasmas. In 2017, the next operation phase (OP) OP1.2 will start once 10 uncooled test divertor units (TDU) with graphite armor will be installed. The TDUs allow for plasma pulses of 10 s with 8 MW heating. OP2, allowing for steady state operation, is planned for 2020 after the TDUs will be replaced by 10 water cooled CFC armored divertors. Due to the development of plasma currents like bootstrap currents in long pulse plasmas in OP2, the plasma could hit the edge of the divertor targets which has a reduced cooling capacity compared to the central part of the target tiles. To prevent overloading of these edges, a so-called scraper element can be positioned in front of the divertor, intersecting those strike lines that would otherwise hit the divertor edges. As a result, these edges are protected but as a drawback the pumping efficiency of neutrals is also reduced. As a test an uncooled scraper element with graphite tiles will be placed in two out of ten half modules in OP1.2. A decision to install ten water cooled scraper elements for OP2 is pending on the results of this test in OP1.2. To monitor the impact of the scraper element on the plasma, Langmuir probes are integrated in the plasma facing surface, and a neutral gas manometer measures the neutral density directly behind the plasma facing surface. Moreover, IR and VIS cameras observe the plasma facing surface and thermocouples monitor the temperatures of the graphite tiles and underlying support structure. This paper describes the integration of the scraper element and its diagnostics in Wendelstein 7-X.
National Spherical Torus Experiment (NSTX) and Planned Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Yueng Kay Martin; Ono, M.; Kaye, S.
1998-01-01
The U.S. fusion energy sciences program began in 1996 to increase emphasis on confinement concept innovation. The NSTX is being built at PPPL as a national fusion science research facility in response to this emphasis. NSTX is to test fusion science principles of the Spherical Torus (ST) plasmas, which include: (1) High plasma pressure in low magnetic field for high fusion power density, (2) Good energy confinement in a small-size plasma, (3) Nearly fully self-driven (bootstrap) plasma current, (4) Dispersed heat and particle fluxes, and (5) Plasma startup without a complicated inboard solenoid magnet. These properties of the ST plasma, if verified, would lead to possible future fusion devices of high fusion performance, small size, feasible power handling, and improved economy. The design of NSTX is depicted in a figure. The vessel will be covered fully with graphite tiles and can be baked to 350 C. Other wall conditioning techniques are also planned. The NSTX facility extensively utilizes the equipment at PPPL and other research institutions in collaboration. These include 6-MW High Harmonic Fast Wave (HHFW) power at ~30 MHz for 5 s, which will be the primary heating and current drive system following the first plasma planned for April 1999, and small ECH systems to assist breakdown for initiation. A plethora of diagnostics from TFTR and collaborators are planned. A NBI system from TFTR capable of delivering 5 MW at 80 keV for 5 s, and more powerful ECH systems are also planned for installation in 2000. The baseline plan for diagnostic systems is laid out in a figure and includes: (1) Rogowski coils to measure total plasma and halo currents.
Creation of second order magnetic barrier inside chaos created by NTMs in the ASDEX UG
NASA Astrophysics Data System (ADS)
Ali, Halima; Punjabi, Alkesh
2012-10-01
Understanding and stabilization of neoclassical tearing modes (NTM) in tokamaks is an important problem. For low temperature plasmas, tearing modes are believed to be mainly driven by the current density gradient. For collisionless plasmas, even when the plasma is stable to classical tearing modes, helical reduction in bootstrap current in the O-point of an island can destabilize NTMs when an initial island is seeded by other global MHD instabilities or when microturbulence triggers the transition from a linear to nonlinear instability. The onset of NTMs leads to the most serious beta limit in the ASDEX UG tokamak [O. Gubner et al 2005 NF 39 1321]. The important NTMs in the ASDEX UG are (m,n)=(3,2)+(4,3)+(1,1). Realistic parameterization of these NTMs and the safety factor in ASDEX UG are given in [O. Dumbrajs et al 2005 POP 12 1107004]. We use a symplectic map in magnetic coordinates for the ASDEX UG to integrate field lines in presence of the NTMs. We add a second order control term [H. Ali and A. Punjabi 2007 PPCF 49 1565] to this ASDEX UG field line Hamiltonian to create an invariant magnetic surface inside the chaos generated by the NTMs. The relative strength, robustness, and resilience of this barrier are studied to ascertain the most desirable noble barrier in the ASDEX UG with NTMs. We present preliminary results of this work, and discuss its implications with regard to magnetic transport barriers for increasing strength of magnetic perturbations. This work is supported by the grants DE-FG02-01ER54624 and DE-FG02-04ER54793.
The effect of sheared toroidal rotation on pressure driven magnetic islands in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegna, C. C.
2016-05-15
The impact of sheared toroidal rotation on the evolution of pressure driven magnetic islands in tokamak plasmas is investigated using a resistive magnetohydrodynamics model augmented by a neoclassical Ohm's law. Particular attention is paid to the asymptotic matching data as the Mercier indices are altered in the presence of sheared flow. Analysis of the nonlinear island Grad-Shafranov equation shows that sheared flows tend to amplify the stabilizing pressure/curvature contribution to pressure driven islands in toroidal tokamaks relative to the island bootstrap current contribution. As such, sheared toroidal rotation tends to reduce saturated magnetic island widths.
Bootstrapping and Maintaining Trust in the Cloud
2016-03-16
of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web ...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data
Reduced Power Laser Designation Systems
2008-06-20
200 kΩ, Ri = 60 kΩ, and R2 = R4 = 2 kΩ yields an overall transimpedance gain of 200K x 30 x 30 = 180 MV/A. Figure 3. Three-stage photodiode amplifier ...transistor circuit for bootstrap buffering of the input stage, comparing the noise performance of the candidate amplifier designs, selecting the two...transistor bootstrap design as the circuit of choice, and comparing the performance of this circuit against that of a basic transconductance amplifier
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
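The resampling step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the patient-level cost and effect data are simulated here, and the bootstrap draws patients with replacement so that no theoretical distribution has to be assumed for the means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient costs and effects for one treatment strategy
costs = rng.normal(1200.0, 300.0, size=200)
effects = rng.normal(0.8, 0.2, size=200)

def bootstrap_means(costs, effects, n_boot=2000, rng=rng):
    """Resample patients with replacement; return bootstrap means of
    cost and effect, the inputs to a probabilistic sensitivity analysis."""
    n = len(costs)
    idx = rng.integers(0, n, size=(n_boot, n))
    return costs[idx].mean(axis=1), effects[idx].mean(axis=1)

boot_costs, boot_effects = bootstrap_means(costs, effects)
# 95% interval for the mean cost, with no parametric assumption
lo, hi = np.percentile(boot_costs, [2.5, 97.5])
```

Feeding `boot_costs`/`boot_effects` pairs through the decision-analytic model, one pair per Monte Carlo iteration, gives the probabilistic sensitivity analysis the abstract refers to.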
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R; Fedonkin, M A
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in the development of computers, the first generation making possible the calculations necessary for constructing the second one, etc. In the biogenetic upstart of an energy flow, non-metals in the lower periods of the Periodic Table of Elements would have constituted the most primitive systems, their operation being enhanced and later supplanted by elements in the higher periods that demand more energy. This bootstrapping process would put the development of the metabolisms based on the second period elements carbon, nitrogen and oxygen at the end of the evolutionary process rather than at, or even before, the biogenetic event.
Nonlinear Thermal Instability in Compressible Viscous Flows Without Heat Conductivity
NASA Astrophysics Data System (ADS)
Jiang, Fei
2018-04-01
We investigate the thermal instability of a smooth equilibrium state, in which the density function satisfies Schwarzschild's (instability) condition, to a compressible heat-conducting viscous flow without heat conductivity in the presence of a uniform gravitational field in a three-dimensional bounded domain. We show that the equilibrium state is linearly unstable by a modified variational method. Then, based on the constructed linearly unstable solutions and a local well-posedness result of classical solutions to the original nonlinear problem, we further construct the initial data of linearly unstable solutions to be the one of the original nonlinear problem, and establish an appropriate energy estimate of Gronwall-type. With the help of the established energy estimate, we finally show that the equilibrium state is nonlinearly unstable in the sense of Hadamard by a careful bootstrap instability argument.
Seasonal comparisons of sea ice concentration estimates derived from SSM/I, OKEAN, and RADARSAT data
Belchansky, Gennady I.; Douglas, David C.
2002-01-01
The Special Sensor Microwave Imager (SSM/I) microwave satellite radiometer and its predecessor SMMR are primary sources of information for global sea ice and climate studies. However, comparisons of SSM/I, Landsat, AVHRR, and ERS-1 synthetic aperture radar (SAR) have shown substantial seasonal and regional differences in their estimates of sea ice concentration. To evaluate these differences, we compared SSM/I estimates of sea ice coverage derived with the NASA Team and Bootstrap algorithms to estimates made using RADARSAT, and OKEAN-01 satellite sensor data. The study area included the Barents Sea, Kara Sea, Laptev Sea, and adjacent parts of the Arctic Ocean, during October 1995 through October 1999. Ice concentration estimates from spatially and temporally near-coincident imagery were calculated using independent algorithms for each sensor type. The OKEAN algorithm implemented the satellite's two-channel active (radar) and passive microwave data in a linear mixture model based on the measured values of brightness temperature and radar backscatter. The RADARSAT algorithm utilized a segmentation approach of the measured radar backscatter, and the SSM/I ice concentrations were derived at National Snow and Ice Data Center (NSIDC) using the NASA Team and Bootstrap algorithms. Seasonal and monthly differences between SSM/I, OKEAN, and RADARSAT ice concentrations were calculated and compared. Overall, total sea ice concentration estimates derived independently from near-coincident RADARSAT, OKEAN-01, and SSM/I satellite imagery demonstrated mean differences of less than 5.5% (S.D.<9.5%) during the winter period. Differences between the SSM/I NASA Team and the SSM/I Bootstrap concentrations were no more than 3.1% (S.D.<5.4%) during this period. RADARSAT and OKEAN-01 data both yielded higher total ice concentrations than the NASA Team and the Bootstrap algorithms. The Bootstrap algorithm yielded higher total ice concentrations than the NASA Team algorithm. 
Total ice concentrations derived from OKEAN-01 and SSM/I satellite imagery were highly correlated during winter, spring, and fall, with mean differences of less than 8.1% (S.D.<15%) for the NASA Team algorithm, and less than 2.8% (S.D.<13.8%) for the Bootstrap algorithm. Respective differences between SSM/I NASA Team and SSM/I Bootstrap total concentrations were less than 5.3% (S.D.<6.9%). Monthly mean differences between SSM/I and OKEAN differed annually by less than 6%, with smaller differences primarily in winter. The NASA Team and Bootstrap algorithms underestimated the total sea ice concentrations relative to the RADARSAT ScanSAR no more than 3.0% (S.D.<9%) and 1.2% (S.D.<7.5%) during cold months, and no more than 12% and 7% during summer, respectively. ScanSAR tended to estimate higher ice concentrations for ice concentrations greater than 50%, when compared to SSM/I during all months. ScanSAR underestimated total sea ice concentration by 2% compared to the OKEAN-01 algorithm during cold months, and gave an overestimation by 2% during spring and summer months. Total NASA Team and Bootstrap sea ice concentration estimates derived from coincident SSM/I and OKEAN-01 data demonstrated mean differences of no more than 5.3% (S.D.<7%), 3.1% (S.D.<5.5%), 2.0% (S.D.<5.5%), and 7.3% (S.D.<10%) for fall, winter, spring, and summer periods, respectively. Large disagreements were observed between the OKEAN and NASA Team results in spring and summer for estimates of the first-year (FY) and multiyear (MY) age classes. The OKEAN-01 algorithm and data tended to estimate, on average, lower concentrations of young or FY ice and higher concentrations of total and MY ice for all months and seasons. Our results contribute to the growing body of documentation about the levels of disparity obtained when seasonal sea ice concentrations are estimated using various types of satellite data and algorithms.
A cluster bootstrap for two-loop MHV amplitudes
Golden, John; Spradlin, Marcus
2015-02-02
We apply a bootstrap procedure to two-loop MHV amplitudes in planar N=4 super-Yang-Mills theory. We argue that the mathematically most complicated part (the Λ²B₂ coproduct component) of the n-particle amplitude is uniquely determined by a simple cluster algebra property together with a few physical constraints (dihedral symmetry, analytic structure, supersymmetry, and well-defined collinear limits). Finally, we present a concise, closed-form expression which manifests these properties for all n.
Wrappers for Performance Enhancement and Oblivious Decision Graphs
1995-09-01
always select all relevant features. We test different search engines to search the space of feature subsets and introduce compound operators to speed...distinct instances from the original dataset appearing in the test set is thus 0.632m. The ε0i accuracy estimate is derived by using bootstrap sample...i for training and the rest of the instances for testing. Given a number b, the number of bootstrap samples, let ε0i be the accuracy estimate for
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
NASA Technical Reports Server (NTRS)
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-01-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify. In many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method because it requires repeating the original data analysis procedure several times using replicate datasets. This is also commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are in the vast majority of cases small and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs, they are larger than the acceleration itself.
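The procedure described above amounts to resampling the height-time points and refitting the kinematic model each time. A minimal sketch, with simulated height-time data standing in for the catalog measurements and a linear fit standing in for the full velocity/acceleration analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical height-time measurements: time in hours, height in solar radii
t = np.linspace(0.0, 5.0, 12)
h = 2.0 + 1.5 * t + rng.normal(0.0, 0.1, size=t.size)  # true speed ~1.5 Rs/h

def bootstrap_velocity_error(t, h, n_boot=1000, rng=rng):
    """Refit a linear height-time model to resampled points; the spread
    of the fitted slopes estimates the velocity measurement error."""
    n = len(t)
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, size=n)          # replicate dataset
        slopes[i] = np.polyfit(t[idx], h[idx], 1)[0]
    return slopes.mean(), slopes.std(ddof=1)

v_mean, v_err = bootstrap_velocity_error(t, h)
```

With only a handful of height-time points, the same loop with a quadratic fit would show the much larger relative errors in acceleration that the abstract reports.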
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on an experimental within design with 32 cells and 33 participants.
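The naive percentile bootstrap mentioned above can be sketched in a few lines. The data here are simulated stand-ins for two cells of a within-subjects design (the variable names and effect sizes are illustrative, not from the study); the key point is that resampling is done over participants, the experimental units.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-participant means in two conditions, 33 participants
cond_a = rng.normal(510.0, 40.0, size=33)   # e.g. repeat-trial RTs
cond_b = rng.normal(540.0, 40.0, size=33)   # e.g. switch-trial RTs

def percentile_bootstrap_ci(x, y, n_boot=5000, alpha=0.05, rng=rng):
    """Naive percentile bootstrap interval for the mean difference:
    resample participants with replacement, keeping pairs intact."""
    n = len(x)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, size=n)
        diffs[i] = (y[idx] - x[idx]).mean()
    return np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])

lo, hi = percentile_bootstrap_ci(cond_a, cond_b)
# Reject the null of no difference at level alpha if the interval excludes 0
```

This is the small-sample alternative to the standard Wald test the abstract describes: no normality assumption and no estimated covariance matrix is needed.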
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
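A minimal sketch of the proposed test, using the Euclidean distance the study recommends. The "cloud object" data are simulated here, and the null distribution is built by resampling individual histograms from the pooled ensemble, which is one simple way to realize the bootstrap significance level described above.

```python
import numpy as np

rng = np.random.default_rng(3)

def summary_hist(samples, bins):
    """Sum individual histograms over an ensemble, then normalize."""
    h = sum(np.histogram(s, bins=bins)[0] for s in samples)
    return h / h.sum()

def euclidean(p, q):
    return np.sqrt(((p - q) ** 2).sum())

# Hypothetical footprint values for two cloud-object size categories
small = [rng.normal(0.0, 1.0, size=50) for _ in range(40)]
large = [rng.normal(0.3, 1.0, size=50) for _ in range(40)]
bins = np.linspace(-4.0, 4.0, 21)

observed = euclidean(summary_hist(small, bins), summary_hist(large, bins))

# Bootstrap null: draw both "categories" from the pooled ensemble
pooled = small + large
null = np.empty(1000)
for i in range(1000):
    a = [pooled[j] for j in rng.integers(0, len(pooled), size=len(small))]
    b = [pooled[j] for j in rng.integers(0, len(pooled), size=len(large))]
    null[i] = euclidean(summary_hist(a, bins), summary_hist(b, bins))

p_value = (null >= observed).mean()
```

Swapping `euclidean` for a Jeffries-Matusita or Kuiper distance changes only the test statistic, which is why the study could compare the three within the same bootstrap framework.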
Image analysis of representative food structures: application of the bootstrap method.
Ramírez, Cristian; Germain, Juan C; Aguilera, José M
2009-08-01
Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of the bootstrap taking sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between CV(image) and CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of the whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of the whole image) became necessary. This suggests that a large-size dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
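The sub-area bootstrap above is easy to reproduce on a synthetic binary image. This sketch uses random pixels rather than the paper's circle patterns, but it shows the core mechanism: the coefficient of variation of a measured quantity (here, area fraction) drops as the sampled sub-area grows.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical binary "micrograph": 1 = element (e.g. cell cavity), 0 = background
image = (rng.random((1000, 1000)) < 0.15).astype(float)

def bootstrap_cv(image, size, n_boot=200, rng=rng):
    """CV of the area fraction measured in n_boot random size x size sub-areas."""
    h, w = image.shape
    fractions = np.empty(n_boot)
    for i in range(n_boot):
        r = rng.integers(0, h - size)
        c = rng.integers(0, w - size)
        fractions[i] = image[r:r + size, c:c + size].mean()
    return fractions.std(ddof=1) / fractions.mean()

# Larger sampling areas -> lower coefficient of variation
cv_small, cv_large = bootstrap_cv(image, 50), bootstrap_cv(image, 400)
```

Plotting `bootstrap_cv` against `size` gives the curve from which the required sampling area for a target CV(Bn) can be read off, as in the 400 x 400 versus 1000 x 1000 pixel example above.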
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven
Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for the online fault diagnosis in the abnormal fermentation processes of glutamate, and a fault was defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach only used small sample sets from normal fermentation experiments to establish the model, and then only required online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for the fault diagnosis in the fermentation process of glutamate with small sample sets.
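The fault-flagging logic above can be sketched compactly. This is an illustration only: a quadratic polynomial stands in for the fitted GAM, and the training data are simulated rather than real fermentation records. The structure is the same, though: refit on bootstrap resamples, take the 2.5/97.5 percentiles of the predictions as the normal-operation band, and flag observations outside it.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical training data from normal runs: time (h) vs glutamate (g/L)
t_train = np.tile(np.arange(0.0, 30.0, 1.0), 4)          # four normal batches
y_train = 0.08 * t_train ** 2 + rng.normal(0.0, 1.0, t_train.size)

def bootstrap_band(t_train, y_train, t_new, n_boot=500, rng=rng):
    """Refit the (stand-in) model on resampled data; the 2.5/97.5
    percentiles of the predictions give a 95% normal-operation band."""
    n = len(t_train)
    preds = np.empty((n_boot, len(t_new)))
    for i in range(n_boot):
        idx = rng.integers(0, n, size=n)
        coef = np.polyfit(t_train[idx], y_train[idx], 2)
        preds[i] = np.polyval(coef, t_new)
    return np.percentile(preds, [2.5, 97.5], axis=0)

t_new = np.array([10.0, 20.0])
lo, hi = bootstrap_band(t_train, y_train, t_new)

def is_fault(y_obs, lo, hi):
    """Flag a fault where the observed production leaves the band."""
    return (y_obs < lo) | (y_obs > hi)
```

Applied online, `is_fault` turning True marks the start of a fault and its returning False marks the end, mirroring the diagnosis behaviour the abstract describes.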
Wang, Jung-Han; Abdel-Aty, Mohamed; Wang, Ling
2017-07-01
There have been plenty of studies that use different methods, for example, empirical Bayes before-after methods, to obtain accurate estimates of CMFs. All of them make different assumptions about the crash count that would have occurred without treatment. Additionally, another major assumption is that multiple sites share the same true CMF. Under this assumption, the CMF at an individual intersection is randomly drawn from a normally distributed population of CMFs at all intersections. Since CMFs are non-zero values, the population of all CMFs might not follow a normal distribution, and even if it does, the true mean of CMFs at some intersections may be different from that at others. Therefore, a bootstrap method based on before-after empirical Bayes theory was proposed to estimate CMFs without making distributional assumptions. This bootstrap procedure has the added benefit of producing a measure of CMF stability. Furthermore, based on the bootstrapped CMF, a new CMF precision rating method was proposed to evaluate the reliability of CMFs. This study chose 29 urban four-legged intersections as treated sites, and their controls were changed from stop-controlled to signal-controlled. Meanwhile, 124 urban four-legged stop-controlled intersections were selected as reference sites. At first, different safety performance functions (SPFs) were applied to five crash categories, and it was found that each crash category had a different optimal SPF form. Then, the CMFs of these five crash categories were estimated using the bootstrap empirical Bayes method. The results of the bootstrapped method showed that signalization significantly decreased Angle+Left-Turn crashes, and its CMF had the highest precision. In contrast, the CMF for Rear-End crashes was unreliable. For KABCO, KABC, and KAB crashes, the CMFs were proved to be reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
Structural parameters of young star clusters: fractal analysis
NASA Astrophysics Data System (ADS)
Hetem, A.
2017-07-01
A unified view of star formation in the Universe demands detailed and in-depth studies of young star clusters. This work is related to our previous study of fractal statistics estimated for a sample of young stellar clusters (Gregorio-Hetem et al. 2015, MNRAS 448, 2504). The structural properties can lead to significant conclusions about the early stages of cluster formation: 1) virial conditions can be used to distinguish warm collapse; 2) bound or unbound behaviour can lead to conclusions about expansion; and 3) fractal statistics are correlated to the dynamical evolution and age. The error-bar estimation technique most used in the literature is to adopt inferential methods (like the bootstrap) to estimate deviation and variance, which are valid only for an artificially generated cluster. In this paper, we expanded the number of studied clusters, in order to enhance the investigation of the cluster properties and dynamic evolution. The structural parameters were compared with fractal statistics and reveal that the clusters' radial density profiles show a tendency for the mean separation of the stars to increase with the average surface density. The sample can be divided into two groups showing different dynamic behaviour, but they have the same dynamic evolution, since the entire sample was revealed to consist of expanding objects, for which the substructures do not seem to have been completely erased. These results are in agreement with the simulations adopting low surface densities and supervirial conditions.
Elastic S-matrices in (1 + 1) dimensions and Toda field theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christe, P.; Mussardo, G.
Particular deformations of 2-D conformal field theory lead to integrable massive quantum field theories. These can be characterized by the relative scattering data. This paper proposes a general scheme for classifying the elastic nondegenerate S-matrix in (1 + 1) dimensions starting from the possible bootstrap processes and the spins of the conserved currents. Their identification with the S-matrix coming from the Toda field theory is analyzed. The authors discuss both cases of Toda field theory constructed with the simply-laced Dynkin diagrams and the non-simply-laced ones. The authors present the results of the perturbative analysis and their geometrical interpretations.
Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.
Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K
2011-01-01
We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameter uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residual rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the Unscented transform to compute the residual rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with relative error of -36% in the uncertainty values.
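The wild-bootstrap idea above can be illustrated with a deliberately simplified diffusion model. This sketch is not the paper's method: it fits a mono-exponential signal in log space (a stand-in for the non-linear body model), skips the Unscented residual rescaling, and uses simulated signal values. What it does show is the wild bootstrap itself: residuals stay at their own b-values and only their signs are randomized.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical DW-MRI signal: S(b) = S0 * exp(-b * ADC), plus noise
b = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])   # s/mm^2
S = 1000.0 * np.exp(-b * 1.5e-3) + rng.normal(0.0, 5.0, b.size)

def fit_adc(b, S):
    """Log-linear fit as a simple stand-in for the non-linear body model."""
    slope, intercept = np.polyfit(b, np.log(S), 1)
    return -slope  # ADC in mm^2/s

def wild_bootstrap_adc(b, S, n_boot=1000, rng=rng):
    """Wild bootstrap: keep each residual at its own b-value, flip its
    sign at random (Rademacher weights), refit, and read off the spread."""
    slope, intercept = np.polyfit(b, np.log(S), 1)
    fitted = np.exp(intercept + slope * b)
    resid = S - fitted
    adcs = np.empty(n_boot)
    for i in range(n_boot):
        signs = rng.choice([-1.0, 1.0], size=b.size)
        adcs[i] = fit_adc(b, fitted + resid * signs)
    return adcs.std(ddof=1)   # ADC uncertainty from a single acquisition

adc_se = wild_bootstrap_adc(b, S)
```

Because the resampling works on a single acquisition's residuals, no repeated scans are needed, which is the clinical motivation stated in the abstract.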
Bootstrapping the (A1, A2) Argyres-Douglas theory
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro
2018-03-01
We apply bootstrap techniques in order to constrain the CFT data of the ( A 1 , A 2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.
López, Erick B; Yamashita, Takashi
2017-02-01
This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009 to 2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the intake frequencies of five kinds of vegetables. Acculturation was measured with the degree of English language use at home. A path model with a bootstrapping technique was employed for mediation analysis. A significant partial mediation relationship was identified. Greater acculturation [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)] was associated with higher income and, in turn, greater vegetable consumption. At the same time, greater acculturation was associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. Findings regarding income as a mediator of the acculturation-dietary behavior relationship inform unique intervention programs and policy changes to address health disparities by race/ethnicity.
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
How to bootstrap a human communication system.
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs. © 2013 Cognitive Science Society, Inc.
Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. PMID:26396090
Gridless, pattern-driven point cloud completion and extension
NASA Astrophysics Data System (ADS)
Gravey, Mathieu; Mariethoz, Gregoire
2016-04-01
While satellites offer Earth observation with wide coverage, other remote sensing techniques such as terrestrial LiDAR can acquire very high-resolution data over an area that is limited in extent and often discontinuous due to shadow effects. Here we propose a numerical approach to merge these two types of information, thereby reconstructing high-resolution data over a continuous large area. It is based on a pattern matching process that completes the areas where only low-resolution data are available, using bootstrapped high-resolution patterns. Currently, the most common approach to pattern matching is to interpolate the point data on a grid. While this approach is computationally efficient, it presents major drawbacks for point-cloud processing because a significant part of the information is lost in the point-to-grid resampling, and a prohibitive amount of memory is needed to store large grids. To address these issues, we propose a gridless method that compares point cloud subsets without the need for a grid. On-the-fly interpolation involves a heavy computational load, which is met by using a highly optimized GPU implementation and a hierarchical pattern searching strategy. The method is illustrated using data from the Val d'Arolla, Swiss Alps, where high-resolution terrestrial LiDAR data are fused with lower-resolution Landsat and WorldView-3 acquisitions, such that the density of points is homogenized (data completion) and extended to a larger area (data extension).
van Dijk, Lisanne V; Brouwer, Charlotte L; van der Schaaf, Arjen; Burgerhof, Johannes G M; Beukinga, Roelof J; Langendijk, Johannes A; Sijtsema, Nanna M; Steenbakkers, Roel J H M
2017-02-01
Current models for the prediction of late patient-rated moderate-to-severe xerostomia (XER12m) and sticky saliva (STIC12m) after radiotherapy are based on dose-volume parameters and baseline xerostomia (XERbase) or sticky saliva (STICbase) scores. The purpose is to improve prediction of XER12m and STIC12m with patient-specific characteristics based on CT image biomarkers (IBMs). Planning CT scans and patient-rated outcome measures were prospectively collected for 249 head and neck cancer patients treated with definitive radiotherapy with or without systemic treatment. The potential IBMs represent geometric, CT intensity and textural characteristics of the parotid and submandibular glands. Lasso regularisation was used to create multivariable logistic regression models, which were internally validated by bootstrapping. The prediction of XER12m could be improved significantly by adding the IBM "Short Run Emphasis" (SRE), which quantifies heterogeneity of parotid tissue, to a model with mean contralateral parotid gland dose and XERbase. For STIC12m, the IBM maximum CT intensity of the submandibular gland was selected in addition to STICbase and mean dose to the submandibular glands. Prediction of XER12m and STIC12m was improved by including IBMs representing heterogeneity and density of the salivary glands, respectively. These IBMs could guide additional research into the patient-specific response of healthy tissue to radiation dose. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.
2013-12-01
Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to lacking information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers. We capture the dependence of model output on these multipliers with the expansion-based reduced model. We then combined the aPC with bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty.
The usually high computational costs of accurate filtering become affordable within our suggested aPC-based calibration framework. However, the power of aPC-based Bayesian updating strongly depends on the accuracy of prior information. In the current study, the prior assumptions on the model parameters were not satisfactory and strongly underestimated the reservoir pressure. Thus, the aPC-based response surface used in bootstrap filtering was fitted to a distant and poorly chosen region within the parameter space. Thanks to the iterative procedure suggested in [2], we overcome this drawback at small computational cost. The iteration successively improves the accuracy of the expansion around the current estimate of the posterior distribution. The final result is a calibrated model of the site that can be used for further studies, with an excellent match to the data. References: [1] Oladyshkin S. and Nowak W. Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion. Reliability Engineering and System Safety, 106:179-190, 2012. [2] Oladyshkin S., Class H., Nowak W. Bayesian updating via bootstrap filtering combined with data-driven polynomial chaos expansions: methodology and application to history matching for carbon dioxide storage in geological formations. Computational Geosciences, 17(4):671-687, 2013.
Lindholm, Daniel; Lindbäck, Johan; Armstrong, Paul W; Budaj, Andrzej; Cannon, Christopher P; Granger, Christopher B; Hagström, Emil; Held, Claes; Koenig, Wolfgang; Östlund, Ollie; Stewart, Ralph A H; Soffer, Joseph; White, Harvey D; de Winter, Robbert J; Steg, Philippe Gabriel; Siegbahn, Agneta; Kleber, Marcus E; Dressel, Alexander; Grammer, Tanja B; März, Winfried; Wallentin, Lars
2017-08-15
Currently, there is no generally accepted model to predict outcomes in stable coronary heart disease (CHD). This study evaluated and compared the prognostic value of biomarkers and clinical variables to develop a biomarker-based prediction model in patients with stable CHD. In a prospective, randomized trial cohort of 13,164 patients with stable CHD, we analyzed several candidate biomarkers and clinical variables and used multivariable Cox regression to develop a clinical prediction model based on the most important markers. The primary outcome was cardiovascular (CV) death, but model performance was also explored for other key outcomes. The model was internally validated by bootstrapping and externally validated in 1,547 patients in another study. During a median follow-up of 3.7 years, there were 591 cases of CV death. The 3 most important biomarkers were N-terminal pro-B-type natriuretic peptide (NT-proBNP), high-sensitivity cardiac troponin T (hs-cTnT), and low-density lipoprotein cholesterol, with NT-proBNP and hs-cTnT having greater prognostic value than any other biomarker or clinical variable. The final prediction model included age (A), biomarkers (B) (NT-proBNP, hs-cTnT, and low-density lipoprotein cholesterol), and clinical variables (C) (smoking, diabetes mellitus, and peripheral arterial disease). This "ABC-CHD" model had high discriminatory ability for CV death (c-index 0.81 in the derivation cohort, 0.78 in the validation cohort), with adequate calibration in both cohorts. This model provided a robust tool for the prediction of CV death in patients with stable CHD. As it is based on a small number of readily available biomarkers and clinical factors, it can be widely employed to complement clinical assessment and guide management based on CV risk. (The Stabilization of Atherosclerotic Plaque by Initiation of Darapladib Therapy Trial [STABILITY]; NCT00799903). Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Bootstrapping quarks and gluons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Li, Wen; Zhao, Li-Zhong; Ma, Dong-Wang; Wang, De-Zheng; Shi, Lei; Wang, Hong-Lei; Dong, Mo; Zhang, Shu-Yi; Cao, Lei; Zhang, Wei-Hua; Zhang, Xi-Peng; Zhang, Qing-Huai; Yu, Lin; Qin, Hai; Wang, Xi-Mo; Chen, Sam Li-Sheng
2018-05-01
We aimed to predict colorectal cancer (CRC) based on demographic features and clinical correlates of personal symptoms and signs from Tianjin community-based CRC screening data. A total of 891,199 residents who were aged 60 to 74 and were screened in 2012 were enrolled. The Lasso logistic regression model was used to identify predictors for CRC. Predictive validity was assessed by the receiver operating characteristic (ROC) curve, and a bootstrapping method was performed to validate the prediction model. CRC was best predicted by a model that included age, sex, education level, occupation, diarrhea, constipation, colon mucosa and bleeding, gallbladder disease, a stressful life event, family history of CRC, and a positive fecal immunochemical test (FIT). The area under the curve (AUC) for the questionnaire with a FIT was 84% (95% CI: 82%-86%), followed by 76% (95% CI: 74%-79%) for a FIT alone, and 73% (95% CI: 71%-76%) for the questionnaire alone. With 500 bootstrap replications, the estimated optimism (<0.005) shows good discrimination in validation of the prediction model. A risk prediction model for CRC based on a series of symptoms and signs related to enteric diseases in combination with a FIT was developed from the first round of screening. The results of the current study are useful for increasing the awareness of high-risk subjects and for individual-risk-guided invitations or strategies to achieve mass screening for CRC.
BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation
Kiefer, Christina; Fehlmann, Tobias; Backes, Christina
2017-01-01
Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498
Extensions to the visual predictive check to facilitate model performance evaluation.
Post, Teun M; Freijer, Jan I; Ploeger, Bart A; Danhof, Meindert
2008-04-01
The Visual Predictive Check (VPC) is a valuable and supportive instrument for evaluating model performance. However, in its most commonly applied form, the method largely depends on a subjective comparison of the distribution of the simulated data with the observed data, without explicitly quantifying and relating the information in both. Recent adaptations of the VPC address this drawback by presenting the observed and predicted data as percentiles. In addition, some of these adaptations represent the uncertainty in the predictions visually. However, it is not assessed whether the expected random distribution of the observations around the predicted median trend is realised in relation to the number of observations. Moreover, neither the influence of missing data at each time point nor the information residing in them is taken into consideration. Therefore, in this investigation the VPC is extended with two methods to support a less subjective and thereby more adequate evaluation of model performance: (i) the Quantified Visual Predictive Check (QVPC) and (ii) the Bootstrap Visual Predictive Check (BVPC). The QVPC presents the distribution of the observations as a percentage above and below the predicted median at each time point, regardless of the density of the data, while also visualising the percentage of unavailable data. The BVPC weighs the predicted median against the 5th, 50th and 95th percentiles resulting from a bootstrap of the observed data median at each time point, while accounting for the number and the theoretical position of unavailable data. The proposed extensions to the VPC are illustrated by a pharmacokinetic simulation example and applied to a pharmacodynamic disease progression example.
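The BVPC's per-time-point computation can be sketched as follows. This is an illustrative reconstruction of the basic idea (bootstrap the observed median at one time point, then take the 5th, 50th and 95th percentiles of the bootstrap medians), not the authors' implementation, and the observation values are invented.

```python
import random

def bootstrap_median_percentiles(observations, n_boot=1000, seed=1):
    """Bootstrap the median of the observations at one time point and
    return the 5th, 50th and 95th percentiles of the bootstrap medians."""
    rng = random.Random(seed)
    n = len(observations)
    medians = []
    for _ in range(n_boot):
        resample = sorted(rng.choice(observations) for _ in range(n))
        mid = n // 2
        med = resample[mid] if n % 2 else 0.5 * (resample[mid - 1] + resample[mid])
        medians.append(med)
    medians.sort()
    pct = lambda p: medians[min(n_boot - 1, int(p / 100 * n_boot))]
    return pct(5), pct(50), pct(95)

# Invented observations at a single sampling time
obs = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.6, 2.7, 3.2, 3.4]
p5, p50, p95 = bootstrap_median_percentiles(obs)
```

In the BVPC the model-predicted median at that time point would then be weighed against this bootstrap interval.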
Troped, Philip J; Tamura, Kosuke; McDonough, Meghan H; Starnes, Heather A; James, Peter; Ben-Joseph, Eran; Cromley, Ellen; Puett, Robin; Melly, Steven J; Laden, Francine
2017-04-01
The built environment predicts walking in older adults, but the degree to which associations between the objective built environment and walking for different purposes are mediated by environmental perceptions is unknown. We examined associations between the neighborhood built environment and leisure and utilitarian walking, and mediation by the perceived environment, among older women. Women (N = 2732, mean age = 72.8 ± 6.8 years) from Massachusetts, Pennsylvania, and California completed a neighborhood built environment and walking survey. Objective population and intersection density and density of stores and services variables were created within residential buffers. Perceived built environment variables included measures of land use mix, street connectivity, infrastructure for walking, esthetics, traffic safety, and personal safety. Regression and bootstrapping were used to test associations and indirect effects. Objective population, stores/services, and intersection density indirectly predicted leisure and utilitarian walking via perceived land use mix (odds ratios (ORs) = 1.01-1.08; 95% bias-corrected and accelerated confidence intervals do not include 1). Objective density of stores/services directly predicted ≥150 min of utilitarian walking (OR = 1.11; 95% CI = 1.02, 1.22). Perceived land use mix (ORs = 1.16-1.44) and esthetics (ORs = 1.24-1.61) significantly predicted leisure and utilitarian walking. In conclusion, the perceived built environment mediated associations between objective built environment variables and walking for leisure and utilitarian purposes. Interventions for older adults should take into account how objective built environment characteristics may influence environmental perceptions and walking.
Classifier performance prediction for computer-aided diagnosis using a limited dataset.
Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir
2008-04-01
In a practical classifier design problem, the true population is generally unknown and the available sample is finite-sized. A common approach is to use a resampling technique to estimate the performance of the classifier that will be trained with the available sample. We conducted a Monte Carlo simulation study to compare the ability of different resampling techniques to train the classifier and predict its performance under the constraint of a finite-sized sample. The true population for the two classes was assumed to be multivariate normal distributions with known covariance matrices. Finite sets of sample vectors were drawn from the population. The true performance of the classifier is defined as the area under the receiver operating characteristic curve (AUC) when the classifier designed with the specific sample is applied to the true population. We investigated methods based on the Fukunaga-Hayes and the leave-one-out techniques, as well as three different types of bootstrap methods, namely, the ordinary, 0.632, and 0.632+ bootstrap. Fisher's linear discriminant analysis was used as the classifier. The dimensionality of the feature space was varied from 3 to 15. The sample size n2 from the positive class was varied between 25 and 60, while the number of cases from the negative class was either equal to n2 or 3n2. Each experiment was performed with an independent dataset randomly drawn from the true population. Using a total of 1000 experiments for each simulation condition, we compared the bias, the variance, and the root-mean-squared error (RMSE) of the AUC estimated using the different resampling techniques relative to the true AUC (obtained from training on a finite dataset and testing on the population).
Our results indicated that, under the study conditions, there can be a large difference in the RMSE obtained using different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Under these conditions, the 0.632 and 0.632+ bootstrap methods have the lowest RMSE, indicating that the difference between the estimated and the true performance obtained using the 0.632 and 0.632+ bootstrap will be statistically smaller than that obtained using the other three resampling methods. Of the three bootstrap methods, the 0.632+ bootstrap provides the lowest bias. Although this investigation was performed under specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited dataset.
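The 0.632 bootstrap combines the optimistic apparent (resubstitution) error with the pessimistic out-of-bag error as 0.368 x apparent + 0.632 x out-of-bag. A minimal sketch of that estimator, using a toy nearest-mean classifier and misclassification error in place of the paper's Fisher linear discriminant and AUC; the data are invented:

```python
import random

def nearest_mean_fit(X, y):
    """Compute per-class feature means for a two-class nearest-mean classifier."""
    means = {}
    for c in (0, 1):
        pts = [x for x, lab in zip(X, y) if lab == c]
        means[c] = [sum(col) / len(pts) for col in zip(*pts)]
    return means

def nearest_mean_predict(means, x):
    """Assign x to the class with the nearest mean (squared Euclidean distance)."""
    dists = {c: sum((a - b) ** 2 for a, b in zip(x, m)) for c, m in means.items()}
    return min(dists, key=dists.get)

def bootstrap_632_error(X, y, n_boot=200, seed=2):
    """0.632 bootstrap error: 0.368 * apparent error + 0.632 * mean out-of-bag error."""
    rng = random.Random(seed)
    n = len(X)
    full_means = nearest_mean_fit(X, y)
    apparent = sum(nearest_mean_predict(full_means, x) != lab
                   for x, lab in zip(X, y)) / n
    oob_errors = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        in_bag = set(idx)
        oob = [i for i in range(n) if i not in in_bag]
        if not oob or len({y[i] for i in idx}) < 2:
            continue  # skip replicates with no test points or a missing class
        m = nearest_mean_fit([X[i] for i in idx], [y[i] for i in idx])
        oob_errors.append(sum(nearest_mean_predict(m, X[i]) != y[i]
                              for i in oob) / len(oob))
    return 0.368 * apparent + 0.632 * sum(oob_errors) / len(oob_errors)

# Two well-separated toy classes (invented data)
X = [(0.0, 0.1), (0.2, -0.1), (-0.1, 0.3), (0.1, 0.0), (0.3, 0.2),
     (1.0, 1.1), (0.9, 0.8), (1.2, 1.0), (0.8, 1.2), (1.1, 0.9)]
y = [0] * 5 + [1] * 5
err = bootstrap_632_error(X, y)
```

The 0.632+ variant studied in the paper additionally shrinks the weight towards the apparent error when overfitting is severe; that refinement is omitted here.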
Anode current density distribution in a cusped field thruster
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Huan, E-mail: wuhuan58@qq.com; Liu, Hui, E-mail: hlying@gmail.com; Meng, Yingchao
2015-12-15
The cusped field thruster is a new electric propulsion device that is expected to have a non-uniform radial current density at the anode. To further study the anode current density distribution, a multi-annulus anode is designed to directly measure the anode current density for the first time. The anode current density decreases sharply at larger radii; the magnitude of collected current density at the center is far higher compared with the outer annuli. The anode current density non-uniformity does not demonstrate a significant change with varying working conditions.
Fixed precision sampling plans for white apple leafhopper (Homoptera: Cicadellidae) on apple.
Beers, Elizabeth H; Jones, Vincent P
2004-10-01
Constant precision sampling plans for the white apple leafhopper, Typhlocyba pomaria McAtee, were developed so that it can serve as an indicator species for system stability as new integrated pest management programs without broad-spectrum pesticides are developed. Taylor's power law was used to model the relationship between the mean and the variance, and Green's constant precision sequential sampling equation was used to develop the sampling plans. Bootstrap simulations of the sampling plans showed greater precision (D = 0.25) than the desired precision (D0 = 0.3), particularly at low mean population densities. We found that by adjusting the D0 value in Green's equation to 0.4, we were able to reduce the average sample number by 25% while providing an average D = 0.31. The sampling plan described allows T. pomaria to be used as a reasonable indicator species of agroecosystem stability in Washington apple orchards.
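The required sample size in a Green-style plan follows directly from Taylor's power law s^2 = a * m^b: for a target precision D (standard error divided by the mean), n = a * m^(b-2) / D^2. A sketch with illustrative coefficients (the a and b values below are not the fitted values from this study):

```python
def greens_sample_size(mean_density, a, b, D):
    """Minimum sample size n = a * m**(b - 2) / D**2 for fixed precision D
    (standard error / mean), given Taylor's power law s**2 = a * m**b."""
    return a * mean_density ** (b - 2) / D ** 2

# Illustrative Taylor coefficients (NOT the fitted values from this study)
a, b = 2.5, 1.4
n_low = greens_sample_size(0.5, a, b, 0.3)      # sparse population: many samples needed
n_high = greens_sample_size(10.0, a, b, 0.3)    # dense population: fewer samples
n_relaxed = greens_sample_size(0.5, a, b, 0.4)  # relaxing D shrinks n by (0.3/0.4)**2
```

Because b < 2 for aggregated populations, required sample sizes grow at low densities, which is consistent with the bootstrap simulations showing the largest precision margin there.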
Maternal Depression and Trait Anger as Risk Factors for Escalated Physical Discipline
Shay, Nicole L.; Knutson, John F.
2008-01-01
To test the hypothesized anger-mediated relation between maternal depression and escalation of physical discipline, 122 economically disadvantaged mothers were assessed for current and lifetime diagnoses of depression using the Current Depressive Episode, Past Depression, and Dysthymia sections of the Structured Clinical Interview for DSM-IV (SCID) and a measure of current depressive symptoms, the Beck Depression Inventory–Second Edition (BDI-II). Escalation of physical discipline was assessed using a video analog parenting task; maternal anger not specific to discipline was assessed using the Spielberger Trait Anger Expression Inventory. Reports of anger were associated with the diagnosis of depression and depressive symptoms. Bootstrap analyses of indirect effects indicated that the link between depression and escalated discipline was mediated by anger. Parallel analyses based on BDI-II scores identified a marginally significant indirect effect of depression on discipline. Findings suggest that anger and irritability are central to the putative link between depression and harsh discipline. PMID:18174347
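The bootstrap test of an indirect effect resamples cases, recomputes the a (X to M) and b (M to Y) paths, and takes percentiles of the a*b products. A simplified sketch of the percentile version: the b path here is not adjusted for X as a full mediation model would require, and the data are invented.

```python
import random

def slope(xs, ys):
    """OLS slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

def indirect_effect_ci(x, m, y, n_boot=2000, seed=3):
    """Percentile bootstrap 95% CI for the indirect effect a*b:
    a = slope of M on X, b = slope of Y on M (simplified, no covariates)."""
    rng = random.Random(seed)
    n = len(x)
    effects = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb = [x[i] for i in idx]
        mb = [m[i] for i in idx]
        yb = [y[i] for i in idx]
        if len(set(xb)) < 2 or len(set(mb)) < 2:
            continue  # degenerate resample: slope undefined
        effects.append(slope(xb, mb) * slope(mb, yb))
    effects.sort()
    k = len(effects)
    return effects[int(0.025 * k)], effects[int(0.975 * k)]

# Invented example data with a positive X -> M -> Y chain
x = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
m = [0.1, 0.9, 1.7, 2.3, 3.4, 4.1, 4.6, 5.7, 6.3, 7.2]
y = [0.2, 0.5, 1.0, 1.1, 1.8, 2.2, 2.1, 3.0, 3.2, 3.5]
lo, hi = indirect_effect_ci(x, m, y)
```

A CI excluding zero, as here, is the usual evidence for mediation; the study's analyses used this logic with depression as X, anger as M, and escalated discipline as Y.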
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Xian-Qu; Zhang, Rui-Bin; Meng, Guo
2016-07-15
The destabilization of ideal internal kink modes by trapped fast particles in tokamak plasmas with a "shoulder"-like equilibrium current is investigated. It is found that the energetic-particle branch of the mode is unstable under the driving of fast-particle precession drifts and corresponds to a precessional fishbone. The mode has a low stability threshold and is more easily excited than the conventional precessional fishbone. This differs from earlier studies of the same equilibrium, in which the magnetohydrodynamic (MHD) branch of the mode is stable. Furthermore, the stability and characteristic frequency of the mode are analyzed by solving the dispersion relation and comparing with the conventional fishbone. The results suggest that an equilibrium with a locally flattened q-profile, which may be produced by localized current drive (or bootstrap current, etc.), is prone to the onset of the precessional fishbone branch of the mode.
Concept Innateness, Concept Continuity, and Bootstrapping
Carey, Susan
2011-01-01
The commentators raised issues relevant to all three important theses of The Origin of Concepts (TOOC). Some questioned the very existence of innate representational primitives, and others questioned my claims about their richness and whether they should be thought of as concepts. Some questioned the existence of conceptual discontinuity in the course of knowledge acquisition and others argued that discontinuity is much more common than portrayed in TOOC. Some raised issues with my characterization of Quinian bootstrapping, and others questioned the dual factor theory of concepts motivated by my picture of conceptual development. PMID:23264705
Crossing symmetry in alpha space
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; van Rees, Balt C.
2017-11-01
We initiate the study of the conformal bootstrap using Sturm-Liouville theory, specializing to four-point functions in one-dimensional CFTs. We do so by decomposing conformal correlators using a basis of eigenfunctions of the Casimir which are labeled by a complex number α. This leads to a systematic method for computing conformal block decompositions. Analyzing bootstrap equations in alpha space turns crossing symmetry into an eigenvalue problem for an integral operator K. The operator K is closely related to the Wilson transform, and some of its eigenfunctions can be found in closed form.
Direct measurement of fast transients by using boot-strapped waveform averaging
NASA Astrophysics Data System (ADS)
Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung
2018-03-01
An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and the signal-to-noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime of Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with known values.
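The folding step behind equivalent-time (coherent) sampling can be sketched as below: samples of a repetitive signal are assigned to phase bins spanning one signal period and averaged within each bin, which raises both the effective sampling rate and the SNR. This shows only the folding and averaging; the paper's digital-cavity method for finding the coherent-sampling condition is not reproduced, and all values are illustrative.

```python
import math

def coherent_average(samples, sample_period, signal_period, n_bins):
    """Fold samples of a repetitive signal into phase bins spanning one
    signal period and average within each bin (equivalent-time sampling)."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for k, v in enumerate(samples):
        phase = (k * sample_period) % signal_period
        b = min(n_bins - 1, int(phase / signal_period * n_bins))
        sums[b] += v
        counts[b] += 1
    # averaged waveform over one period; None marks any unfilled bin
    return [s / c if c else None for s, c in zip(sums, counts)]

# Repetitive 1 Hz test signal sampled at a rate incommensurate with its period,
# so successive periods fill in different phases of the waveform
dt, period = 0.013, 1.0
samples = [math.sin(2 * math.pi * (k * dt)) for k in range(2000)]
avg = coherent_average(samples, dt, period, 50)
```

With an incommensurate sampling interval the 50 reconstructed points trace one full period of the sine even though each individual period is sampled at far fewer points.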
NASA Astrophysics Data System (ADS)
Stroeve, Julienne C.; Jenouvrier, Stephanie; Campbell, G. Garrett; Barbraud, Christophe; Delord, Karine
2016-08-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role for phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent as well as seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the distribution of MIZ, consolidated pack ice and coastal polynyas in the total Antarctic sea ice cover may also help to shed light on the factors contributing towards recent expansion of the Antarctic ice cover in some regions and contraction in others. The long-term passive microwave satellite data record provides the longest and most consistent record for assessing the proportion of the sea ice cover that is covered by each of these ice categories. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, the NASA Team and Bootstrap, and applies the same thresholds to the sea ice concentrations to evaluate the distribution and variability in the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the seasonal cycle in the MIZ and pack ice is generally similar between both algorithms, yet the NASA Team algorithm yields on average twice the MIZ area and half the consolidated pack ice area of the Bootstrap algorithm. Trends also differ, with the Bootstrap algorithm suggesting statistically significant trends towards increased pack ice area and no statistically significant trends in the MIZ. The NASA Team algorithm on the other hand indicates statistically significant positive trends in the MIZ during spring. Potential coastal polynya area and amount of broken ice within the consolidated ice pack are also larger in the NASA Team algorithm. The timing of maximum polynya area may differ by as much as 5 months between algorithms.
These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
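The partition into open water, MIZ and pack ice reduces to applying concentration thresholds to each grid cell of the retrieved concentration field. A minimal sketch, assuming the widely used 15% and 80% cut-offs; this abstract does not state the exact thresholds applied, so the values here are illustrative:

```python
def classify_ice_cell(concentration, miz_threshold=0.15, pack_threshold=0.80):
    """Classify one grid cell by sea ice concentration (fraction, 0..1).
    The 15% / 80% cut-offs are common MIZ bounds and are illustrative only."""
    if concentration < miz_threshold:
        return "open water"
    if concentration < pack_threshold:
        return "MIZ"
    return "pack ice"

# Same thresholds applied to concentrations from either algorithm
cells = [0.05, 0.5, 0.9]
labels = [classify_ice_cell(c) for c in cells]
```

Because the two algorithms retrieve different concentrations for the same cell, identical thresholds can still yield very different MIZ and pack ice areas, which is the effect the study quantifies.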
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, J.; Tolson, B.
2017-12-01
The increasing complexity and runtime of environmental models lead to the current situation in which calibrating all model parameters or estimating all of their uncertainties is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important ones. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of sensitivity methods is usually not checked. At most, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive for large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA.
The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations and therefore enables the checking of already processed sensitivity results. This is one step towards reliable, transferable published sensitivity results.
Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas
2016-05-01
To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using a multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in Euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative (including surgical and anaesthetic time, overhead, disposable materials) and postoperative costs [including ward stay, high dependency unit (HDU) or intensive care unit (ICU) and variable costs associated with management of complications] were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total costs after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) <60% predicted value (P = 0.02, bootstrap 63%) and chronic obstructive pulmonary disease (COPD; P = 0.035, bootstrap 57%). The following model was developed to estimate the total costs: 10 523 + 1894 × COPD + 2376 × DLCO < 60%. The comparison between predicted and observed costs was repeated in 1000 bootstrapped samples to verify the stability of the model. The two values were not different (P > 0.05) in 86% of the samples. A hypothetical patient with COPD and DLCO less than 60% would cost €4270 more than a patient without COPD and with higher DLCO values (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. 
This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
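The cost model quoted in the abstract is a simple linear score; a minimal sketch of it follows (the function name is mine, the coefficients are the abstract's):

```python
def predicted_vats_cost(copd: bool, dlco_below_60: bool) -> int:
    """Predicted total cost in euros from the abstract's linear model:
    10 523 + 1894 x COPD + 2376 x (DLCO < 60% predicted)."""
    return 10523 + 1894 * int(copd) + 2376 * int(dlco_below_60)

print(predicted_vats_cost(False, False))  # 10523 (baseline patient)
print(predicted_vats_cost(True, True))    # 14793 (COPD and DLCO < 60%)
```

The €4270 difference reported in the abstract is exactly the sum of the two coefficients (1894 + 2376).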
Roche, Anne I; Kroska, Emily B; Miller, Michelle L; Kroska, Sydney K; O'Hara, Michael W
2018-03-22
Childhood trauma is associated with a variety of risky, unhealthy, or problem behaviors. The current study aimed to explore experiential avoidance and mindfulness processes as mechanisms through which childhood trauma and problem behavior are linked in a college sample. The sample consisted of college-aged young adults recruited November-December, 2016 (N = 414). Participants completed self-report measures of childhood trauma, current problem behavior, experiential avoidance, and mindfulness processes. Bootstrapped mediation analyses examined the mechanistic associations of interest. Mediation analyses indicated that experiential avoidance was a significant mediator of the association between childhood trauma and problem behavior. Additionally, multiple mediation analyses indicated that specific mindfulness facets-act with awareness and nonjudgment of inner experience-significantly mediated the same association. Interventions for college students who have experienced childhood trauma might profitably target mechanisms such as avoidance and mindfulness in order to minimize engagement in problem behavior.
Anomalous dimensions of spinning operators from conformal symmetry
NASA Astrophysics Data System (ADS)
Gliozzi, Ferdinando
2018-01-01
We compute, to the first non-trivial order in the ɛ-expansion of a perturbed scalar field theory, the anomalous dimensions of an infinite class of primary operators with arbitrary spin ℓ = 0, 1, . . . , including as a particular case the weakly broken higher-spin currents, using only constraints from conformal symmetry. Following the bootstrap philosophy, no reference is made to any Lagrangian, equations of motion or coupling constants. Even the space dimensions d are left free. The interaction is implicitly turned on through the local operators by letting them acquire anomalous dimensions. When matching certain four-point and five-point functions with the corresponding quantities of the free field theory in the ɛ → 0 limit, no free parameter remains. It turns out that only the expected discrete d values are permitted and the ensuing anomalous dimensions reproduce known results for the weakly broken higher-spin currents and provide new results for the other spinning operators.
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
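The parametric bootstrap idea discussed above can be sketched as follows. This is a simplified illustration in the spirit of the Krishnamoorthy, Lu, and Mathew approach, not their exact statistic: the observed weighted between-group statistic is compared against its distribution under normal resampling with the estimated group variances.

```python
import numpy as np

def pb_anova(groups, n_boot=4000, seed=0):
    """Parametric-bootstrap test of equal means under unequal variances.
    A simplified sketch, not the exact published procedure."""
    rng = np.random.default_rng(seed)
    ns = np.array([len(g) for g in groups])
    means = np.array([np.mean(g) for g in groups])
    vars_ = np.array([np.var(g, ddof=1) for g in groups])

    def statistic(m, v):
        w = ns / v
        grand = np.sum(w * m) / np.sum(w)
        return np.sum(w * (m - grand) ** 2)

    t_obs = statistic(means, vars_)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        # simulate each group under the null (common mean 0, estimated variance)
        sims = [rng.normal(0.0, np.sqrt(v), n) for v, n in zip(vars_, ns)]
        t_boot[b] = statistic(np.array([s.mean() for s in sims]),
                              np.array([s.var(ddof=1) for s in sims]))
    return np.mean(t_boot >= t_obs)  # bootstrap p-value
```

A trimmed-mean variant, as investigated in the paper, would replace the group means and variances with trimmed means and Winsorized variances.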
Frey, H Christopher; Zhao, Yuchao
2004-11-15
Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using MLE estimators for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges are as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
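The MLE-plus-parametric-bootstrap step described above can be sketched for the uncensored case (censoring of nondetects, which the paper handles explicitly, is ignored in this simplification; the data and units are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical emission-factor sample; a lognormal is a common model
# for inter-unit variability (units arbitrary).
data = rng.lognormal(mean=0.0, sigma=0.8, size=40)

# Closed-form lognormal MLE from the log-transformed data
logs = np.log(data)
mu_hat, sigma_hat = logs.mean(), logs.std(ddof=0)

# Parametric bootstrap: resample from the fitted distribution and build
# the sampling distribution of the mean emission factor.
boot_means = np.array([rng.lognormal(mu_hat, sigma_hat, data.size).mean()
                       for _ in range(5000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])  # 95% uncertainty range
```

With censored data, the closed-form MLE above would be replaced by a censored-data likelihood, as the paper describes.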
Bootstrapping non-commutative gauge theories from L∞ algebras
NASA Astrophysics Data System (ADS)
Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter
2018-05-01
Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, that governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on its appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to the second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is discussed, as well.
A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon
Drummond, J. M.; Papathanasiou, G.; Spradlin, M.
2015-03-16
Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4, 7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7 ∥ 6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n > 6.
NASA Technical Reports Server (NTRS)
Yoshikawa, H. H.; Madison, I. B.
1971-01-01
This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.
Dmitriev, Egor V; Khomenko, Georges; Chami, Malik; Sokolov, Anton A; Churilova, Tatyana Y; Korotaev, Gennady K
2009-03-01
The absorption of sunlight by oceanic constituents significantly contributes to the spectral distribution of the water-leaving radiance. Here it is shown that current parameterizations of absorption coefficients do not apply to the optically complex waters of the Crimea Peninsula. Based on in situ measurements, parameterizations of phytoplankton, nonalgal, and total particulate absorption coefficients are proposed. Their performance is evaluated using a log-log regression combined with a low-pass filter and the nonlinear least-square method. Statistical significance of the estimated parameters is verified using the bootstrap method. The parameterizations are relevant for chlorophyll a concentrations ranging from 0.45 up to 2 mg/m³.
Use of high order, periodic orbits in the PIES code
NASA Astrophysics Data System (ADS)
Monticello, Donald; Reiman, Allan
2010-11-01
We have implemented a version of the PIES code (Princeton Iterative Equilibrium Solver; A. Reiman et al 2007 Nucl. Fusion 47 572) that uses high order periodic orbits to select the surfaces on which straight magnetic field line coordinates will be calculated. The use of high order periodic orbits has increased the robustness and speed of the PIES code. We now have a more uniform treatment of in-phase and out-of-phase islands. This new version has better convergence properties and works well with a full Newton scheme. We now have the ability to shrink islands using a bootstrap-like current, and this includes the m=1 island in tokamaks.
Exploration of high harmonic fast wave heating on the National Spherical Torus Experiment
NASA Astrophysics Data System (ADS)
Wilson, J. R.; Bell, R. E.; Bernabei, S.; Bitter, M.; Bonoli, P.; Gates, D.; Hosea, J.; LeBlanc, B.; Mau, T. K.; Medley, S.; Menard, J.; Mueller, D.; Ono, M.; Phillips, C. K.; Pinsker, R. I.; Raman, R.; Rosenberg, A.; Ryan, P.; Sabbagh, S.; Stutman, D.; Swain, D.; Takase, Y.; Wilgen, J.
2003-05-01
High harmonic fast wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [M. Ono, S. M. Kaye, S. Neumeyer et al., in Proceedings of the 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. An rf heating system has been installed on the NSTX to explore the physics of HHFW heating, current drive via rf waves and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode discharges with a large fraction of bootstrap current and to control the plasma current profile during the early stages of the discharge.
Globally optimal superconducting magnets part I: minimum stored energy (MSE) current density map.
Tieng, Quang M; Vegh, Viktor; Brereton, Ian M
2009-01-01
An optimal current density map is crucial in magnet design to provide the initial values within search spaces in an optimization process for determining the final coil arrangement of the magnet. A strategy for obtaining globally optimal current density maps for the purpose of designing magnets with coaxial cylindrical coils in which the stored energy is minimized within a constrained domain is outlined. The current density maps obtained utilising the proposed method suggest that peak current densities occur around the perimeter of the magnet domain, where the adjacent peaks have alternating current directions for the most compact designs. As the dimensions of the domain are increased, the current density maps yield traditional magnet designs of positive current alone. These unique current density maps are obtained by minimizing the stored magnetic energy cost function and therefore suggest magnet coil designs of minimal system energy. Current density maps are provided for a number of different domain arrangements to illustrate the flexibility of the method and the quality of the achievable designs.
Two-dimensional relativistic space charge limited current flow in the drift space
NASA Astrophysics Data System (ADS)
Liu, Y. L.; Chen, S. H.; Koh, W. S.; Ang, L. K.
2014-04-01
Relativistic two-dimensional (2D) electrostatic (ES) formulations have been derived for studying the steady-state space charge limited (SCL) current flow of a finite width W in a drift space with a gap distance D. The theoretical analyses show that the 2D SCL current density, normalized to the 1D SCL current density, monotonically increases with D/W, and the theory recovers the 1D classical Child-Langmuir law in the drift space under the approximation of uniform charge density in the transverse direction. A 2D static model has also been constructed to study the dynamical behaviors of the current flow with current density exceeding the SCL current density, and the static theory for evaluating the transmitted current fraction and minimum potential position has been verified by using 2D ES particle-in-cell simulation. The results show that the 2D SCL current density is mainly determined by the geometrical effects, but the dynamical behaviors of the current flow are mainly determined by the relativistic effect at current densities exceeding the SCL current density.
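For reference, the classical one-dimensional Child-Langmuir law that the 2D theory recovers is the standard textbook result for a planar gap of voltage V and distance D (quoted here from general knowledge, not from the paper):

```latex
J_{\mathrm{CL}} \;=\; \frac{4\varepsilon_0}{9}\,\sqrt{\frac{2e}{m_e}}\;\frac{V^{3/2}}{D^{2}}
```

The 2D results above are expressed as a geometry-dependent enhancement of this 1D value.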
Improvement of Current Drive Efficiency in Projected FNSF Discharges
NASA Astrophysics Data System (ADS)
Prater, R.; Chan, V.; Garofalo, A.
2012-10-01
The Fusion Nuclear Science Facility - Advanced Tokamak (FNSF-AT) is envisioned as a facility that uses the tokamak approach to address the development of the AT path to fusion and fusion's energy objectives. It uses copper coils for a compact device with high βN and moderate power gain. The major radius is 2.7 m and central toroidal field is 5.44 T. Achieving the required confinement and stability at βN˜3.7 requires a current profile with negative central shear and qmin>1. Off-axis Electron Cyclotron Current Drive (ECCD), in addition to high bootstrap current fraction, can help support this current profile. Using the applied EC frequency and launch location as free parameters, a systematic study has been carried out to optimize the ECCD in the range ρ= 0.5-0.7. Using a top launch, making use of a large toroidal component to the launch direction, adjusting the vertical launch angle so that the rays propagate nearly parallel to the resonance, and adjusting the frequency for optimum total current give a high dimensionless efficiency of 0.44 for a broad ECCD profile peaked at ρ=0.7, and the driven current is 17 kA/MW for n20= 2.1 and Te= 10.3 keV locally.
NASA Astrophysics Data System (ADS)
Hasegawa, Chika; Nakayama, Yu
2018-03-01
In this paper, we solve for the two-point function of the lowest dimensional scalar operator in the critical ϕ⁴ theory on (4 − ε)-dimensional real projective space using three different methods. The first uses conventional perturbation theory, the second imposes the cross-cap bootstrap equation, and the third solves the Schwinger-Dyson equation under the assumption of conformal invariance. We find that the three methods lead to mutually consistent results, but each has its own advantage.
On critical exponents without Feynman diagrams
NASA Astrophysics Data System (ADS)
Sen, Kallol; Sinha, Aninda
2016-11-01
In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend Polyakov's pioneering 1974 work, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4-ɛ dimensions up to O(ɛ²). AS dedicates this work to the loving memory of his mother.
Iliesiu, Luca; Kos, Filip; Poland, David; ...
2016-03-17
We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. Finally, we also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
New Methods for Estimating Seasonal Potential Climate Predictability
NASA Astrophysics Data System (ADS)
Feng, Xia
This study develops two new statistical approaches to assess the seasonal potential predictability of the observed climate variables. One is the univariate analysis of covariance (ANOCOVA) model, a combination of an autoregressive (AR) model and analysis of variance (ANOVA). It has the advantage of taking into account the uncertainty of the estimated parameter due to sampling errors in statistical tests, which is often neglected in AR-based methods, and of accounting for daily autocorrelation that is not considered in traditional ANOVA. In the ANOCOVA model, the seasonal signals arising from external forcing are tested for being identical across years, to assess whether any interannual variability that may exist is potentially predictable. The bootstrap is an attractive alternative method that requires no hypothesized model and is available no matter how mathematically complicated the parameter estimator. This method builds up the empirical distribution of the interannual variance from resamplings drawn with replacement from the given sample, in which the only variability in seasonal means arises from weather noise. These two methods are applied to temperature and water cycle components, including precipitation and evaporation, to measure the extent to which the interannual variance of seasonal means exceeds the unpredictable weather noise, and are compared with the previous methods, including Leith-Shukla-Gutzler (LSG), Madden, and Katz. The potential predictability of temperature from the ANOCOVA model, bootstrap, LSG and Madden exhibits a pronounced tropical-extratropical contrast, with much larger predictability in the tropics dominated by El Niño/Southern Oscillation (ENSO) than in higher latitudes where strong internal variability lowers predictability. Bootstrap tends to display the highest predictability of the four methods, ANOCOVA lies in the middle, while LSG and Madden appear to generate lower predictability.
Seasonal precipitation from ANOCOVA, bootstrap, and Katz, resembling that for temperature, is more predictable over the tropical regions and less predictable in the extratropics. Bootstrap and ANOCOVA are in good agreement with each other, both methods generating larger predictability than Katz. The seasonal predictability of evaporation over land bears considerable similarity with that of temperature using ANOCOVA, bootstrap, LSG and Madden. The remote SST forcing and soil moisture reveal substantial seasonality in their relations with the potentially predictable seasonal signals. For selected regions, either SST or soil moisture or both shows significant relationships with predictable signals, hence providing indirect insight on slowly varying boundary processes involved to enable useful seasonal climate prediction. A multivariate analysis of covariance (MANOCOVA) model is established to identify distinctive predictable patterns, which are uncorrelated with each other. Generally speaking, the seasonal predictability from the multivariate model is consistent with that from ANOCOVA. Besides unveiling the spatial variability of predictability, the MANOCOVA model also reveals the temporal variability of each predictable pattern, which could be linked to the periodic oscillations.
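The bootstrap approach described above, building an empirical null distribution of the interannual variance in which only weather noise remains, can be sketched on synthetic daily data. All numbers below (season length, signal amplitude) are illustrative assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily anomalies: 30 seasons of 90 days, plus a small
# year-to-year signal (illustrative numbers only).
years, days = 30, 90
daily = rng.normal(size=(years, days)) + rng.normal(scale=0.3, size=(years, 1))

# Observed interannual variance of seasonal means
obs_var = daily.mean(axis=1).var(ddof=1)

# Bootstrap null: resample days with replacement from the pooled record,
# so any interannual variability left is weather noise alone.
pooled = daily.ravel()
null = np.array([
    rng.choice(pooled, size=(years, days)).mean(axis=1).var(ddof=1)
    for _ in range(2000)
])
p_value = np.mean(null >= obs_var)
# Small p-value: seasonal means vary more than weather noise can explain,
# i.e. a potentially predictable signal.
```

Resampling individual days destroys daily autocorrelation; a block bootstrap would preserve it, at the cost of the simplicity shown here.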
NASA Astrophysics Data System (ADS)
Lazzeretti, Paolo
2018-04-01
It is shown that nonsymmetric second-rank current density tensors, related to the current densities induced by magnetic fields and nuclear magnetic dipole moments, are fundamental properties of a molecule. Together with magnetizability, nuclear magnetic shielding, and nuclear spin-spin coupling, they completely characterize its response to magnetic perturbations. Gauge invariance, resolution into isotropic, deviatoric, and antisymmetric parts, and contributions of current density tensors to magnetic properties are discussed. The components of the second-rank tensor properties are rationalized via relationships explicitly connecting them to the direction of the induced current density vectors and to the components of the current density tensors. The contribution of the deviatoric part to the average value of magnetizability, nuclear shielding, and nuclear spin-spin coupling, uniquely determined by the antisymmetric part of current density tensors, vanishes identically. The physical meaning of isotropic and anisotropic invariants of current density tensors has been investigated, and the connection between anisotropy magnitude and electron delocalization has been discussed.
Schwinning, S.; Sandquist, D.R.; Miller, D.M.; Bedford, D.R.; Phillips, S.L.; Belnap, J.
2011-01-01
Drainage channels are among the most conspicuous surficial features of deserts, but little quantitative analysis of their influence on plant distributions is available. We analysed the effects of desert stream channels (‘washes’) on Larrea tridentata and Ambrosia dumosa density and cover on an alluvial piedmont in the Mojave Desert, based on a spatial analysis of transect data encompassing a total length of 2775 m surveyed in 5 cm increments. Significant deviations from average transect properties were identified by bootstrapping. Predictably, shrub cover and density were much reduced inside washes, and elevated above average levels adjacent to washes. Average Larrea and Ambrosia cover and density peaked 1·2–1·6 m and 0·5–1·0 m from wash edges, respectively. We compared wash effects in runon-depleted (−R) sections, where washes had been cut off from runon and were presumably inactive, with those in runon-supplemented (+R) sections downslope from railroad culverts to help identify mechanisms responsible for the facilitative effect of washes on adjacent shrubs. Shrub cover and density near washes peaked in both + R and − R sections, suggesting that improved water infiltration and storage alone can cause a facilitative effect on adjacent shrubs. However, washes of < 2 m width in + R sections had larger than average effects on peak cover, suggesting that plants also benefit from occasional resource supplementation. The data suggest that channel networks significantly contribute to structuring plant communities in the Mojave Desert and their disruption has notable effects on geomorphic and ecological processes far beyond the original disturbance sites.
Associations of serum adiponectin with skeletal muscle morphology and insulin sensitivity.
Ingelsson, Erik; Arnlöv, Johan; Zethelius, Björn; Vasan, Ramachandran S; Flyvbjerg, Allan; Frystyk, Jan; Berne, Christian; Hänni, Arvo; Lind, Lars; Sundström, Johan
2009-03-01
Skeletal muscle morphology and function are strongly associated with insulin sensitivity. The objective of the study was to test the hypothesis that circulating adiponectin is associated with skeletal muscle morphology and that adiponectin mediates the relation of muscle morphology to insulin sensitivity. This was a cross-sectional investigation of 461 men aged 71 yr, participants of the community-based Uppsala Longitudinal Study of Adult Men study. Measures included serum adiponectin, insulin sensitivity measured with euglycemic insulin clamp technique, and capillary density and muscle fiber composition determined from vastus lateralis muscle biopsies. In multivariable linear regression models (adjusting for age, physical activity, fasting glucose, and pharmacological treatment for diabetes), serum adiponectin levels rose with increasing capillary density (beta, 0.30 per 50 capillaries per square millimeter increase; P = 0.041) and higher proportion of type I muscle fibers (beta, 0.27 per 10% increase; P = 0.036) but declined with a higher proportion of type IIb fibers (beta, -0.39 per 10% increase; P = 0.014). Bootstrap methods were used to examine the potential role of adiponectin in associations between muscle morphology and insulin sensitivity; the associations of capillary density (beta difference, 0.041; 95% confidence interval 0.001, 0.085) and proportion of type IIb muscle fibers (beta difference, -0.053; 95% confidence interval -0.107, -0.002) with insulin sensitivity were significantly attenuated when adiponectin was included in the models. Circulating adiponectin concentrations were higher with increasing skeletal muscle capillary density and in individuals with higher proportion of slow oxidative muscle fibers. Furthermore, our results indicate that adiponectin could be a partial mediator of the relations between skeletal muscle morphology and insulin sensitivity.
Numerical investigation of split flows by gravity currents into two-layered stratified water bodies
NASA Astrophysics Data System (ADS)
Cortés, A.; Wells, M. G.; Fringer, O. B.; Arthur, R. S.; Rueda, F. J.
2015-07-01
The behavior of a two-dimensional (2-D) gravity current impinging upon a density step in a two-layered stratified basin is analyzed using a high-resolution Reynolds-Averaged Navier-Stokes model. The gravity current splits at the density step, and the portion of the buoyancy flux becoming an interflow is largely controlled by the vertical distribution of velocity and density within the gravity current and the magnitude of the density step between the two ambient layers. This is in agreement with recent laboratory observations. The strongest changes in the ambient density profiles occur as a result of the impingement of supercritical currents with strong density contrasts, for which a large portion of the gravity current detaches from the bottom and becomes an interflow. We characterize the current partition process in the simulated experiments using the densimetric Froude number of the current (Fr) across the density step (upstream and downstream). When underflows are formed, more supercritical currents are observed downstream of the density step compared to upstream (Fru < Frd), and thus, stronger mixing of the current with the ambient water downstream. However, when split flows and interflows are formed, smaller Fr values are identified after the current crosses the density step (Fru > Frd), which indicates lower mixing between the current and ambient water after the impingement due to the significant stripping of interfacial material at the density step.
NASA Astrophysics Data System (ADS)
Gebhardt, Katharina; Knebelsberger, Thomas
2015-09-01
We morphologically analyzed 79 cephalopod specimens from the North and Baltic Seas belonging to 13 separate species. Another 29 specimens showed morphological features of either Alloteuthis media or Alloteuthis subulata, or were found to be intermediate between them. Reliable identification features to distinguish between A. media and A. subulata are currently not available. The analysis of the DNA barcoding region of the COI gene revealed intraspecific distances (uncorrected p) ranging from 0 to 2.13% (average 0.1%) and interspecific distances between 3.31 and 22% (average 15.52%). All species formed monophyletic clusters in a neighbor-joining analysis and were supported by bootstrap values of ≥99%. All COI haplotypes belonging to the 29 Alloteuthis specimens were grouped in one cluster. Neither COI nor 18S rDNA sequences helped to distinguish between the different Alloteuthis morphotypes. For species identification purposes, we recommend the use of COI, as it showed higher bootstrap support of species clusters and less amplification and sequencing failure compared to 18S. Our data strongly support the assumption that the genus Alloteuthis is represented by only a single species, at least in the North Sea. It remained unclear whether this species is A. subulata or A. media. All COI sequences, including important metadata, were uploaded to the Barcode of Life Data Systems and can be used as a reference library for the molecular identification of more than 50% of the cephalopod fauna known from the North and Baltic Seas.
Benedetto, Umberto; Raja, Shahzad G
2014-11-01
The effectiveness of the routine retrosternal placement of a gentamicin-impregnated collagen sponge (GICS) implant before sternotomy closure is currently a matter of some controversy. We aimed to develop a scoring system to guide decision making on the use of GICS to prevent deep sternal wound infection. Fast backward elimination on predictors, including GICS, was performed using the Lawless and Singhal method. The scoring system was reported as a partial nomogram that can be used to manually obtain the predicted individual risk of deep sternal wound infection from the regression model. Bootstrapping validation of the regression models was performed. The final population consisted of 8750 adult patients undergoing cardiac surgery through full sternotomy during the study period. A total of 329 patients (3.8%) received a GICS implant. The overall incidence of deep sternal wound infection was lower among patients who received a GICS implant (0.6%) than among those who did not (2.01%) (P=.02). A nomogram to predict the individual risk of deep sternal wound infection was developed that included the use of GICS. Bootstrapping validation confirmed a good discriminative power of the models. The scoring system provides an impartial assessment of the decision-making process for clinicians to establish whether a GICS implant is effective in reducing the risk of deep sternal wound infection in individual patients undergoing cardiac surgery through full sternotomy. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Qian, J. P.; Garofalo, A. M.; Gong, X. Z.; Ren, Q. L.; Ding, S. Y.; Solomon, W. M.; Xu, G. S.; Grierson, B. A.; Guo, W. F.; Holcomb, C. T.; McClenaghan, J.; McKee, G. R.; Pan, C. K.; Huang, J.; Staebler, G. M.; Wan, B. N.
2017-05-01
Recent EAST/DIII-D joint experiments on the high poloidal beta (β_P) regime in DIII-D have extended operation with internal transport barriers (ITBs) and excellent energy confinement (H98y2 ~ 1.6) to higher plasma current, for lower q95 ⩽ 7.0, and more balanced neutral beam injection (NBI) (torque injection < 2 Nm), for lower plasma rotation than previous results (Garofalo et al, IAEA 2014; Gong et al 2014 IAEA Int. Conf. on Fusion Energy). Transport analysis and experimental measurements at low toroidal rotation suggest that the E × B shear effect is not key to the ITB formation in these high-β_P discharges. Experiments and TGLF modeling show that the Shafranov shift has a key stabilizing effect on turbulence. Extrapolation of the DIII-D results using a 0D model shows that with the improved confinement, the high bootstrap fraction regime could achieve fusion gain Q = 5 in ITER at β_N ~ 2.9 and q95 ~ 7. With the optimization of q(0), the required improved confinement is achievable when using 1.5D TGLF-SAT1 for transport simulations. Results reported in this paper suggest that the DIII-D high-β_P scenario could be a candidate for ITER steady-state operation.
NASA Astrophysics Data System (ADS)
Stroeve, Julienne; Jenouvrier, Stephanie
2016-04-01
Sea ice variability within the marginal ice zone (MIZ) and polynyas plays an important role in phytoplankton productivity and krill abundance. Therefore, mapping their spatial extent and their seasonal and interannual variability is essential for understanding how current and future changes in these biologically active regions may impact the Antarctic marine ecosystem. Knowledge of the contribution of different ice types to the total Antarctic sea ice cover may also help shed light on the factors contributing to the recent expansion of the Antarctic ice cover in some regions and its contraction in others. The long-term passive microwave satellite record provides the longest and most consistent data record for assessing different ice types. However, estimates of the amount of MIZ, consolidated pack ice and polynyas depend strongly on which sea ice algorithm is used. This study uses two popular passive microwave sea ice algorithms, NASA Team and Bootstrap, to evaluate the distribution and variability of the MIZ, the consolidated pack ice and coastal polynyas. Results reveal that the NASA Team algorithm has on average twice the MIZ and half the consolidated pack ice area of the Bootstrap algorithm. Polynya area is also larger in the NASA Team algorithm, and the timing of maximum polynya area may differ by as much as 5 months between algorithms. These differences lead to different relationships between sea ice characteristics and biological processes, as illustrated here with the breeding success of an Antarctic seabird.
2015-01-01
Abstract The basic properties of the near‐Earth current sheet from 8 RE to 12 RE were determined based on Time History of Events and Macroscale Interactions during Substorms (THEMIS) observations from 2007 to 2013. Ampere's law was used to estimate the current density when the locations of two spacecraft were suitable for the calculation. A total of 3838 current density observations were obtained to study the vertical profile. For typical solar wind conditions, the current density near (off) the central plane of the current sheet ranged from 1 to 2 nA/m² (1 to 8 nA/m²). All the high current densities appeared off the central plane of the current sheet, indicating the formation of a bifurcated current sheet structure when the current density increased above 2 nA/m². The median profile also showed a bifurcated structure, in which the half thickness was about 3 RE. The distance between the peak of the current density and the central plane of the current sheet was 0.5 to 1 RE. High current densities above 4 nA/m² were observed in some cases that occurred preferentially during substorms, but they also occurred in quiet times. In contrast to the commonly accepted picture, these high current densities can form without a high solar wind dynamic pressure. In addition, these high current densities can appear in two magnetic configurations: tail‐like and dipolar structures. At least two mechanisms, magnetic flux depletion and new current system formation during the expansion phase, other than plasma sheet compression are responsible for the formation of the bifurcated current sheets. PMID:27722039
Hong, Wandong; Lin, Suhan; Zippi, Maddalena; Geng, Wujun; Stock, Simon; Zimmer, Vincent; Xu, Chunfang; Zhou, Mengtao
2017-01-01
Early prediction of the disease severity of acute pancreatitis (AP) would be helpful for triaging patients to the appropriate level of care and intervention. The aim of the study was to develop a model able to predict Severe Acute Pancreatitis (SAP). A total of 647 patients with AP were enrolled. The demographic data, hematocrit and High-Density Lipoprotein Cholesterol (HDL-C) determined at the time of admission, and Blood Urea Nitrogen (BUN) and serum creatinine (Scr) determined at the time of admission and 24 hours (hrs) after hospitalization were collected and analyzed statistically. Multivariate logistic regression indicated that HDL-C at admission and BUN and Scr at 24 hrs were independently associated with SAP. A logistic regression function (LR model) was developed to predict SAP as follows: -2.25 - 0.06 × HDL-C (mg/dl) at admission + 0.06 × BUN (mg/dl) at 24 hrs + 0.66 × Scr (mg/dl) at 24 hrs. The optimism-corrected c-index for the LR model was 0.832 after bootstrap validation. The area under the receiver operating characteristic curve of the LR model for the prediction of SAP was 0.84. The LR model, consisting of HDL-C at admission and BUN and Scr at 24 hrs, represents an additional tool to stratify patients at risk of SAP.
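The reported LR model maps directly to a predicted probability of SAP through the logistic link. A minimal sketch follows; only the coefficients come from the abstract, while the function names and the example patient values are hypothetical:

```python
import math

def sap_linear_predictor(hdl_c_admission, bun_24h, scr_24h):
    """Linear predictor of the reported LR model; all inputs in mg/dl."""
    return -2.25 - 0.06 * hdl_c_admission + 0.06 * bun_24h + 0.66 * scr_24h

def sap_probability(hdl_c_admission, bun_24h, scr_24h):
    """Predicted probability of severe acute pancreatitis (logistic link)."""
    z = sap_linear_predictor(hdl_c_admission, bun_24h, scr_24h)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical patients: low HDL-C and elevated BUN/Scr raise the predicted risk.
p_low_risk = sap_probability(hdl_c_admission=60, bun_24h=12, scr_24h=0.9)
p_high_risk = sap_probability(hdl_c_admission=25, bun_24h=40, scr_24h=2.0)
```

The signs match the abstract's finding: higher admission HDL-C is protective, while higher 24-hr BUN and Scr increase the predicted risk.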
DuPre, Natalie C; Hart, Jaime E; Bertrand, Kimberly A; Kraft, Peter; Laden, Francine; Tamimi, Rulla M
2017-11-23
High mammographic density is a strong, well-established breast cancer risk factor. Three studies conducted in various smaller geographic settings reported inconsistent findings between air pollution and mammographic density. We assessed whether particulate matter (PM) exposures (PM2.5, PM2.5-10, and PM10) and distance to roadways were associated with mammographic density among women residing across the United States. The Nurses' Health Studies are prospective cohorts for whom a subset has screening mammograms from the 1990s (interquartile range 1990-1999). PM was estimated using spatio-temporal models linked to residential addresses. Among 3258 women (average age at mammogram 52.7 years), we performed multivariable linear regression to assess associations between square-root-transformed mammographic density and PM within 1 and 3 years before the mammogram. For linear regression estimates of PM in relation to untransformed mammographic density outcomes, bootstrapped robust standard errors were used to calculate 95% confidence intervals (CIs). Analyses were stratified by menopausal status and region of residence. Recent PM and distance to roadways were not associated with mammographic density in premenopausal women (PM2.5 within 3 years before mammogram β = 0.05, 95% CI -0.16, 0.27; PM2.5-10 β = 0.00, 95% CI -0.15, 0.16; PM10 β = 0.02, 95% CI -0.10, 0.13) or in postmenopausal women (PM2.5 within 3 years before mammogram β = -0.05, 95% CI -0.27, 0.17; PM2.5-10 β = -0.01, 95% CI -0.16, 0.14; PM10 β = -0.02, 95% CI -0.13, 0.09). Largely null associations were observed within regions. Suggestive associations were observed among postmenopausal women in the Northeast (n = 745), where a 10-μg/m³ increase in PM2.5 within 3 years before the mammogram was associated with 3.4 percentage points higher percent mammographic density (95% CI -0.5, 7.3). These findings do not support that recent PM or roadway exposures influence mammographic density.
Although PM was largely not associated with mammographic density, we cannot rule out the role of PM during earlier exposure time windows and possible associations among northeastern postmenopausal women.
Patient satisfaction after pulmonary resection for lung cancer: a multicenter comparative analysis.
Pompili, Cecilia; Brunelli, Alessandro; Rocco, Gaetano; Salvi, Rosario; Xiumé, Francesco; La Rocca, Antonello; Sabbatini, Armando; Martucci, Nicola
2013-01-01
Patient satisfaction reflects the perception of the customer about the level of quality of care received during the episode of hospitalization. To compare the levels of satisfaction of patients submitted to lung resection in two different thoracic surgical units. Prospective analysis of 280 consecutive patients submitted to pulmonary resection for neoplastic disease in two centers (center A: 139 patients; center B: 141 patients; 2009-2010). Patients' satisfaction was assessed at discharge through the EORTC-InPatSat32 module, a 32-item, multi-scale self-administered anonymous questionnaire. Each scale (ranging from 0 to 100 in score) was compared between the two units. Multivariable regression and bootstrap were used to verify factors associated with the patients' general satisfaction (dependent variable). Patients from unit B reported a higher general satisfaction (91.5 vs. 88.3, p = 0.04), mainly due to a significantly higher satisfaction in the doctor-related scales (doctors' technical skill: p = 0.001; doctors' interpersonal skill: p = 0.008; doctors' availability: p = 0.005, and doctors' information provision: p = 0.0006). Multivariable regression analysis and bootstrap confirmed that level of care in unit B (p = 0.006, bootstrap frequency 60%) along with lower level of education of the patient population (p = 0.02, bootstrap frequency 62%) were independent factors associated with a higher general patient satisfaction. We were able to show a different level of patient satisfaction in patients operated on in two different thoracic surgery units. A reduced level of patient satisfaction may trigger changes in the management policy of individual units in order to meet patients' expectations and improve organizational efficiency. Copyright © 2012 S. Karger AG, Basel.
Wilcox, Thomas P; Zwickl, Derrick J; Heath, Tracy A; Hillis, David M
2002-11-01
Four New World genera of dwarf boas (Exiliboa, Trachyboa, Tropidophis, and Ungaliophis) have been placed by many systematists in a single group (traditionally called Tropidophiidae). However, the monophyly of this group has been questioned in several studies. Moreover, the overall relationships among basal snake lineages, including the placement of the dwarf boas, are poorly understood. We obtained mtDNA sequence data for 12S, 16S, and intervening tRNA-val genes from 23 species of snakes representing most major snake lineages, including all four genera of New World dwarf boas. We then examined the phylogenetic position of these species by estimating the phylogeny of the basal snakes. Our phylogenetic analysis suggests that New World dwarf boas are not monophyletic. Instead, we find Exiliboa and Ungaliophis to be most closely related to sand boas (Erycinae), boas (Boinae), and advanced snakes (Caenophidia), whereas Tropidophis and Trachyboa form an independent clade that separated relatively early in the snake radiation. Our estimate of snake phylogeny differs significantly in other ways from some previous estimates of snake phylogeny. For instance, pythons do not cluster with boas and sand boas, but instead show a strong relationship with Loxocemus and Xenopeltis. Additionally, uropeltids cluster strongly with Cylindrophis, and together are embedded in what has previously been considered the macrostomatan radiation. These relationships are supported by both bootstrapping (parametric and nonparametric approaches) and Bayesian analysis, although Bayesian support values are consistently higher than those obtained from nonparametric bootstrapping. Simulations show that Bayesian support values represent much better estimates of phylogenetic accuracy than do nonparametric bootstrap support values, at least under the conditions of our study. Copyright 2002 Elsevier Science (USA)
Explanation of Two Anomalous Results in Statistical Mediation Analysis.
Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P
2012-01-01
Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M, a, increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y, b, was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a. Implications of these findings are discussed.
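The bias-corrected bootstrap test at the center of these results can be sketched on simulated X → M → Y data. This is a minimal illustration, not the authors' simulation code: the sample size, effect sizes, plain OLS fits, and 2000 resamples are all my assumptions.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Simulate a simple mediation chain X -> M -> Y with true a = b = 0.5,
# so the true mediated effect is a*b = 0.25.
n, a, b = 200, 0.5, 0.5
x = rng.normal(size=n)
m = a * x + rng.normal(size=n)
y = b * m + rng.normal(size=n)

def ab_estimate(x, m, y):
    """Product-of-coefficients estimate of the mediated effect a*b."""
    a_hat = np.polyfit(x, m, 1)[0]                     # slope of M on X
    design = np.column_stack([m, x, np.ones_like(x)])  # Y on M, controlling for X
    b_hat = np.linalg.lstsq(design, y, rcond=None)[0][0]
    return a_hat * b_hat

ab_hat = ab_estimate(x, m, y)

# Nonparametric bootstrap of the mediated effect.
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = ab_estimate(x[idx], m[idx], y[idx])

# Bias correction: z0 measures how far the point estimate sits from the
# bootstrap median; the percentile endpoints are shifted accordingly.
nd = NormalDist()
z0 = nd.inv_cdf(float(np.mean(boot < ab_hat)))
lo_q = nd.cdf(2 * z0 + nd.inv_cdf(0.025))
hi_q = nd.cdf(2 * z0 + nd.inv_cdf(0.975))
ci = (np.quantile(boot, lo_q), np.quantile(boot, hi_q))
```

The accelerated variant studied alongside it additionally estimates an acceleration constant (typically via the jackknife) and shifts the percentile endpoints by both quantities.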
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulation under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the most commonly encountered scenarios in practice have also been published for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrapping method have superior performance to Sobel's method; the distribution of the product method is recommended in practice because of its lower computational cost compared with bootstrapping. An R package has been developed for the distribution of the product method of sample size determination in longitudinal mediation study design.
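Sobel's method referenced above tests the product a·b with a first-order delta-method standard error; its low power relative to the other two tests stems from assuming normality of that product. A minimal sketch, with hypothetical path estimates and standard errors:

```python
import math

def sobel_z(a_hat, se_a, b_hat, se_b):
    """Sobel z statistic for the mediated effect a*b.

    Uses the first-order delta-method standard error
    sqrt(a^2 * se_b^2 + b^2 * se_a^2).
    """
    se_ab = math.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
    return a_hat * b_hat / se_ab

# Hypothetical path estimates (not from the article):
z = sobel_z(a_hat=0.40, se_a=0.10, b_hat=0.30, se_b=0.12)
significant = abs(z) > 1.96  # two-sided test at the 5% level
```

The distribution of the product method replaces the normal reference distribution with the exact distribution of the product of two normal variables, which is why it achieves the bootstrap's power at a fraction of the computational cost.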
Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe
2015-07-15
Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. 
All rights reserved.
Haider, S; Hrbek, A; Xu, Y
2008-06-01
This report primarily outlines our investigation of using magneto-acousto-electrical tomography (MAET) to image the lead field current density in volume conductors. A lead field current density distribution is obtained when a current/voltage source is applied to a sample via a pair of electrodes. This is the first time a high-spatial-resolution image of current density has been presented using MAET. We also compare an experimental image of current density in a sample with its corresponding numerical simulation. To image the lead field current density, rather than applying a current/voltage source directly to the sample, we place the sample in a static magnetic field and focus an ultrasonic pulse on the sample to simulate a point-like current dipole source at the focal point. Then, using electrodes, we measure the voltage/current signal which, based on the reciprocity theorem, is proportional to a component of the lead field current density. In the theory section, we derive the equation relating the measured voltage to the lead field current density and the displacement velocity caused by the ultrasound. The experimental data include the MAET signal and an image of the lead field current density for a thin sample. In addition, we discuss potential improvements for MAET, especially to overcome the limitation created by the observation that no signal was detected from the interior of a region having a uniform conductivity. As an auxiliary result, we offer a mathematical formula whereby the lead field current density may be used to reconstruct the distribution of the electrical impedance in a piecewise smooth object.
High current density cathode for electrorefining in molten electrolyte
Li, Shelly X.
2010-06-29
A high current density cathode for electrorefining in a molten electrolyte for the continuous production and collection of loose dendritic or powdery deposits. The high current density cathode eliminates the requirement for mechanical scraping and electrochemical stripping of the deposits from the cathode in an anode/cathode module. The high current density cathode comprises a perforated, electrically insulating material coating such that the current density is up to 3 A/cm².
Gyrokinetic Particle Simulations of Neoclassical Transport
NASA Astrophysics Data System (ADS)
Lin, Zhihong
A time varying weighting (delta f) scheme based on the small gyro-radius ordering is developed and applied to a steady state, multi-species gyrokinetic particle simulation of neoclassical transport. Accurate collision operators conserving momentum and energy are developed and implemented. Benchmark simulation results using these operators are found to agree very well with neoclassical theory. For example, it is dynamically demonstrated that like-particle collisions produce no particle flux and that the neoclassical fluxes are ambipolar for an ion-electron plasma. An important physics feature of the present scheme is the introduction of toroidal flow to the simulations. In agreement with the existing analytical neoclassical theory, ion energy flux is enhanced by the toroidal mass flow and the neoclassical viscosity is a Pfirsch-Schlüter factor times the classical viscosity in the banana regime. In addition, the poloidal electric field associated with toroidal mass flow is found to enhance density gradient driven electron particle flux and the bootstrap current while reducing temperature gradient driven flux and current. Modifications of the neoclassical transport by the orbit squeezing effects due to the radial electric field associated with sheared toroidal flow are studied. Simulation results indicate a reduction of both ion thermal flux and neoclassical toroidal rotation. Neoclassical theory in the steep gradient profile regime, where conventional neoclassical theory fails, is examined by taking into account finite banana width effects. The relevance of these studies to interesting experimental conditions in tokamaks is discussed. Finally, the present numerical scheme is extended to general geometry equilibrium. This new formulation will be valuable for the development of new capabilities to address complex equilibria such as advanced stellarator configurations and possibly other alternate concepts for the magnetic confinement of plasmas. 
In general, the present work demonstrates a valuable new capability for studying important aspects of neoclassical transport inaccessible by conventional analytical calculation processes.
NASA Technical Reports Server (NTRS)
Dunning, J. W., Jr.; Lancashire, R. B.; Manista, E. J.
1976-01-01
Measurements have been conducted of the effect of the convection of ions and electrons on the discharge characteristics in a large scale laser. The results are presented for one particular distribution of ballast resistance. Values of electric field, current density, input power density, ratio of electric field to neutral gas density (E/N), and electron number density were calculated on the basis of measurements of the discharge properties. In a number of graphs, the E/N ratio, current density, power density, and electron density are plotted as a function of row number (downstream position) with total discharge current and gas velocity as parameters. From the dependence of the current distribution on the total current, it appears that the electron production in the first two rows significantly affects the current flowing in the succeeding rows.
Analysis of recoverable current from one component of magnetic flux density in MREIT and MRCDI.
Park, Chunjae; Lee, Byung Il; Kwon, Oh In
2007-06-07
Magnetic resonance current density imaging (MRCDI) provides a current density image by measuring the induced magnetic flux density within the subject with a magnetic resonance imaging (MRI) scanner. Magnetic resonance electrical impedance tomography (MREIT) has focused on extracting useful information about the current density and conductivity distribution in the subject Ω using measured B_z, one component of the magnetic flux density B. In this paper, we analyze the map τ from the current density vector field J to one component of the magnetic flux density, B_z, without any assumption on the conductivity. The map τ provides an orthogonal decomposition J = J_P + J_N of the current J, where J_N belongs to the null space of the map τ. We explicitly describe the projected current density J_P from measured B_z. Based on the decomposition, we prove that B_z data due to one injection current guarantee a unique determination of the isotropic conductivity under the assumptions that the current is two-dimensional and the conductivity value on the surface is known. For a two-dimensional dominating current case, the projected current density J_P provides a good approximation of the true current J without accumulating noise effects. Numerical simulations show that J_P from measured B_z is quite similar to the target J. Biological tissue phantom experiments compare J_P with the reconstructed J via the reconstructed isotropic conductivity using the harmonic B_z algorithm.
In vivo mapping of current density distribution in brain tissues during deep brain stimulation (DBS)
NASA Astrophysics Data System (ADS)
Sajib, Saurav Z. K.; Oh, Tong In; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je
2017-01-01
New methods for in vivo mapping of brain responses during deep brain stimulation (DBS) are indispensable to secure clinical applications. Assessment of current density distribution, induced by internally injected currents, may provide an alternative method for understanding the therapeutic effects of electrical stimulation. The current flow and pathway are affected by internal conductivity, and can be imaged using magnetic resonance-based conductivity imaging methods. Magnetic resonance electrical impedance tomography (MREIT) is an imaging method that can enable highly resolved mapping of electromagnetic tissue properties such as current density and conductivity of living tissues. In the current study, we experimentally imaged current density distribution of in vivo canine brains by applying MREIT to electrical stimulation. The current density maps of three canine brains were calculated from the measured magnetic flux density data. The absolute current density values of brain tissues, including gray matter, white matter, and cerebrospinal fluid were compared to assess the active regions during DBS. The resulting current density in different tissue types may provide useful information about current pathways and volume activation for adjusting surgical planning and understanding the therapeutic effects of DBS.
Singh, Kunwar Pal; Guo, Chunlei
2017-06-21
The nanochannel diameter and surface charge density have a significant impact on current-voltage characteristics in a nanofluidic transistor. We have simulated the effect of the channel diameter and surface charge density on current-voltage characteristics of a fluidic nanochannel with positive surface charge on its walls and a gate electrode on its surface. Anion depletion/enrichment leads to a decrease/increase in ion current with gate potential. The ion current tends to increase linearly with gate potential for narrow channels at high surface charge densities and narrow channels are more effective to control the ion current at high surface charge densities. The current-voltage characteristics are highly nonlinear for wide channels at low surface charge densities and they show different regions of current change with gate potential. The ion current decreases with gate potential after attaining a peak value for wide channels at low values of surface charge densities. At low surface charge densities, the ion current can be controlled by a narrow range of gate potentials for wide channels. The current change with source drain voltage shows ohmic, limiting and overlimiting regions.
Surface currents associated with external kink modes in tokamak plasmas during a major disruption
NASA Astrophysics Data System (ADS)
Ng, C. S.; Bhattacharjee, A.
2017-10-01
The surface current on the plasma-vacuum interface during a disruption event involving kink instability can play an important role in driving current into the vacuum vessel. However, there have been disagreements over the nature or even the sign of the surface current in recent theoretical calculations based on idealized step-function background plasma profiles. We revisit such calculations by replacing step-function profiles with more realistic profiles characterized by a strong but finite gradient along the radial direction. It is shown that the resulting surface current is no longer a delta-function current density, but a finite and smooth current density profile with an internal structure, concentrated within the region with a strong plasma pressure gradient. Moreover, this current density profile has peaks of both signs, unlike the delta-function case with a sign opposite to, or the same as the plasma current. We show analytically and numerically that such current density can be separated into two parts, with one of them, called the convective current density, describing the transport of the background plasma density by the displacement, and the other part that remains, called the residual current density. It is argued that consideration of both types of current density is important and can resolve past controversies.
Why do workaholics experience depression? A study with Chinese University teachers.
Nie, Yingzhi; Sun, Haitao
2016-10-01
This study focuses on the relationships of workaholism to job burnout and depression among university teachers. The direct and indirect (via job burnout) effects of workaholism on depression were investigated in 412 Chinese university teachers. Structural equation modeling and the bootstrap method were used. Results revealed that workaholism, job burnout, and depression significantly correlated with each other. Structural equation modeling and the bootstrap test indicated a partial mediation role of job burnout in the relationship between workaholism and depression. The findings shed some light on how workaholism influences depression and provide valuable evidence for the prevention of depression at work. © The Author(s) 2015.
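The bootstrap test of an indirect effect used in studies like this one can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions (made-up variable names, and the mediator-to-outcome slope is not adjusted for the predictor, unlike a full structural equation model); it is not the authors' actual analysis:

```python
import random

def bootstrap_indirect_effect(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in a simple
    mediation model x -> m -> y, using ordinary least-squares slopes.
    Simplified: the m -> y slope is not adjusted for x."""
    def slope(u, v):
        n = len(u)
        mu, mv = sum(u) / n, sum(v) / n
        num = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
        den = sum((ui - mu) ** 2 for ui in u)
        return num / den

    rng = random.Random(seed)
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]  # resample cases with replacement
        xs = [x[i] for i in s]
        ms = [m[i] for i in s]
        ys = [y[i] for i in s]
        estimates.append(slope(xs, ms) * slope(ms, ys))  # a*b for this resample
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]
```

If the resulting 95% interval excludes zero, the indirect (mediated) effect is deemed significant, which is the logic behind the bootstrap mediation test reported above.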
Blank, Jos L T; van Hulst, Bart Laurents
2011-10-01
This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on explaining hospitals' cost inefficiency in terms of corporate governance. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that higher remuneration of the board, as well as higher remuneration of the supervisory board, does not imply better performance.
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
NASA Astrophysics Data System (ADS)
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-month) and mid-term (six-month) drought observations. The effects of the North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from the Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
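A residual-based bootstrap prediction interval of the kind proposed here can be sketched as follows. A plain AR(1) fit stands in for the actual PDSI forecasting model, which the abstract does not specify; the idea is the same: fit the model, resample its residuals with replacement, and read the interval off the empirical distribution of simulated next values:

```python
import random

def ar1_bootstrap_pi(series, n_boot=1000, alpha=0.05, seed=0):
    """Residual-based bootstrap prediction interval for the next value of a
    series, using a simple AR(1) fit as a stand-in for the drought model."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))       # AR(1) coefficient by OLS
    c = my - phi * mx                             # intercept
    resid = [b - (c + phi * a) for a, b in zip(x, y)]  # in-sample residuals
    point = c + phi * series[-1]                  # point forecast
    rng = random.Random(seed)
    # simulate the next value by adding resampled residuals to the forecast
    draws = sorted(point + rng.choice(resid) for _ in range(n_boot))
    return point, (draws[int(alpha / 2 * n_boot)],
                   draws[int((1 - alpha / 2) * n_boot)])
```

The interval is "valid" in the sense of the abstract when its empirical coverage matches the nominal 1 - alpha level, which the resampled residuals approximate without any normality assumption.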
Spotorno O, Angel E; Córdova, Luis; Solari I, Aldo
2008-12-01
To identify and characterize Chilean samples of Trypanosoma cruzi and their association with hosts, the first 516 bp of the mitochondrial cytochrome b gene were sequenced from eight biological samples and phylogenetically compared with 20 other known American sequences. The molecular characterization of these 28 sequences in a maximum likelihood phylogram (-lnL = 1255.12, tree length = 180, consistency index = 0.79) allowed the robust identification (bootstrap % > 99) of three previously known discrete typing units (DTU): DTU IIb, IIa, and I. An apparently undescribed new sequence found in four new Chilean samples was detected and designated as DTU Ib; these were separated by 24.7 differences, but robustly related (bootstrap % = 97 in 500 replicates) to those of DTU I by sharing 12 substitutions, four of which were nonsynonymous. The new DTU Ib was also robust (bootstrap % = 100) and characterized by 10 unambiguous substitutions, with a single nonsynonymous G to T change at site 409. The fact that two of these new sequences were found in parasites from a Chilean endemic caviomorph rodent, Octodon degus, and that they were closely related to the ancient DTU I, suggests old origins and a long association with caviomorph hosts.
Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico
2018-02-01
To show how to generate a consensus sequence from the information of massively parallel sequencing data obtained from routine HIV anti-retroviral resistance studies, and that may be suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GS Junior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, using 10%, 15%, and 20% thresholds. Molecular Evolutionary Genetics Analysis (MEGA) was used for phylogenetic studies. At a 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap value of 88% (IQR 83.5-95.5). Association increased to 36/62 sequences, with a median bootstrap value of 94% (IQR 85.5-98), using a 15% threshold. Maximum association was at the 20% threshold, with 61/62 sequences associated and a median bootstrap value of 99% (IQR 98-100). A safe method is presented to generate consensus sequences from HIV NGS data at a 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Highton, R
1993-12-01
An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets: Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. The statistical support for the topology at each node of the UPGMA trees was determined by both the bootstrap and jackknife methods as the number of loci was increased. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes that have groupings that reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
Comparison of mode estimation methods and application in molecular clock analysis
NASA Technical Reports Server (NTRS)
Hedges, S. Blair; Shah, Prachi
2003-01-01
BACKGROUND: Distributions of time estimates in molecular clock studies are sometimes skewed or contain outliers. In those cases, the mode is a better estimator of the overall time of divergence than the mean or median. However, different methods are available for estimating the mode. We compared these methods in simulations to determine their strengths and weaknesses and further assessed their performance when applied to real data sets from a molecular clock study. RESULTS: We found that the half-range mode and robust parametric mode methods have a lower bias than other mode methods under a diversity of conditions. However, the half-range mode suffers from a relatively high variance and the robust parametric mode is more susceptible to bias by outliers. We determined that bootstrapping reduces the variance of both mode estimators. Application of the different methods to real data sets yielded results that were concordant with the simulations. CONCLUSION: Because the half-range mode is a simple and fast method, and produced less bias overall in our simulations, we recommend the bootstrapped version of it as a general-purpose mode estimator and suggest a bootstrap method for obtaining the standard error and 95% confidence interval of the mode.
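The two pieces recommended in the conclusion, a half-range mode estimator and a bootstrap for its standard error, can be sketched roughly as below. This is an illustrative implementation under stated simplifications (the densest window is anchored at data points, a common variant of the half-range idea), not the authors' code:

```python
import random

def half_range_mode(data):
    """Half-range mode: repeatedly keep the subinterval of half the current
    range that contains the most observations, until few points remain."""
    xs = sorted(data)
    while len(xs) > 2:
        w = (xs[-1] - xs[0]) / 2.0
        best, best_count = 0, -1
        for i, x in enumerate(xs):           # densest window of width w
            j = i
            while j < len(xs) and xs[j] <= x + w:
                j += 1
            if j - i > best_count:
                best, best_count = i, j - i
        new = xs[best:best + best_count]
        if len(new) == len(xs):              # no progress (e.g. all ties)
            break
        xs = new
    return sum(xs) / len(xs)

def bootstrap_se(data, estimator, n_boot=200, seed=0):
    """Bootstrap standard error of any estimator, by resampling the data
    with replacement and taking the standard deviation of the replicates."""
    rng = random.Random(seed)
    reps = [estimator([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    m = sum(reps) / n_boot
    return (sum((r - m) ** 2 for r in reps) / (n_boot - 1)) ** 0.5
```

On skewed data with outliers, the half-range mode homes in on the densest cluster rather than being pulled toward the tail, which is why the authors prefer it to the mean or median for divergence-time distributions.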
NASA Astrophysics Data System (ADS)
Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore
2018-01-01
In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and the accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both good redundancy and low contamination to be effective. In this paper a resampling strategy based on the bootstrap is proposed as an alternative to RAIM, in order to estimate position accurately in the case of low redundancy and multiple blunders: starting from the pseudorange measurement model, at each epoch the available measurements are bootstrapped, that is, randomly sampled with replacement, and the generated a posteriori empirical distribution is exploited to derive the final position. Compared to the standard bootstrap, in this paper the sampling probabilities are not uniform, but vary according to an indicator of the measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
2009-01-01
Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and Bias-corrected accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to Confidence Intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence Intervals for ERR are comparable using Bootstrap and Likelihood Ratio Test methods and BCa 95% Confidence Intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and Likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. 
Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0 - 20 mSv and 5 - 500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
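The percentile bootstrap interval used above, the base on which the BCa method adds bias and skewness corrections, can be sketched generically. This is not the paper's ERR computation, just the resampling scheme, with an arbitrary statistic supplied by the caller:

```python
import random

def percentile_ci(data, statistic, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic.
    The BCa variant adjusts these quantiles for bias and acceleration."""
    rng = random.Random(seed)
    reps = sorted(
        statistic([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot)]
```

An interval that is "strictly positive", as reported for the five cancers, simply means the lower percentile endpoint stays above zero across the latency range.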
Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh
2014-10-01
Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas
NASA Astrophysics Data System (ADS)
Callen, J. D.; Hegna, C. C.; Beidler, M. T.
2017-10-01
Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.
Improved Design of Stellarator Coils for Current Carrying Plasmas
NASA Astrophysics Data System (ADS)
Drevlak, M.; Strumberger, E.; Hirshman, S.; Boozer, A.; Brooks, A.; Valanju, P.
1998-11-01
The method of automatic optimization (P. Merkel, Nucl. Fusion 27 (1987) 867; P. Merkel, M. Drevlak, Proc. 25th EPS Conf. on Contr. Fusion and Plasma Phys., Prague, in print) for the design of stellarator coils consists essentially of determining filaments such that the average relative field error ∫ dS [(B_coil + B_j) · n]^2 / B_coil^2 is minimized on the prescribed plasma boundary. B_j is the magnetic field produced by the plasma currents of the given finite-β fixed-boundary equilibrium. For equilibria of the W7-X type, B_j can be neglected because of the reduced parallel plasma currents. This is not true for quasi-axisymmetric stellarator (QAS) configurations (A. Reiman, et al., to be published) with large equilibrium and net plasma (bootstrap) currents. Although the coils for QAS exhibit low values of the field error, free-boundary calculations indicate that the shape of the plasma is usually not accurately reproduced, particularly when saddle coils are used. We investigate whether the surface reconstruction can be improved by introducing a modified measure of the field error based on a measure of the resonant components of the normal field.
Variation of magnetoimpedance of electrodeposited NiFe/Cu with deposition current density
NASA Astrophysics Data System (ADS)
Mishra, A. C.; Jha, A. K.
2017-12-01
An investigation of the influence of deposition current density on electrodeposited magnetic films is reported in this paper. Ferromagnetic NiFe thin films were electrodeposited on copper wires of 100 μm diameter at electrodeposition current densities ranging from 10 to 60 mA/cm2, maintaining equal thickness in all films. The composition of the deposited film varied with deposition current density; in particular, a composition of Ni79Fe21 was achieved at a current density of 20 mA/cm2. The surface microstructure of the film deposited at 20 mA/cm2 was found to have excellent smoothness. This film showed the lowest coercivity, and the highest value of magnetoimpedance was measured for it. The influence of current density on film composition, and hence on magnetic properties, was attributed to the change of deposition mechanism.
Andres, Fanny; Castanier, Carole; Le Scanff, Christine
2014-02-01
The present study aims to explore the mediating effects of conscientiousness and alexithymia in the relationship between parental attachment style and alcohol use in a large sample of athletic young people. Participants included 434 French sport sciences students. Alcohol use, parental attachment style, conscientiousness and alexithymia were assessed. The hypotheses were tested by using regression and bootstrapping mediation analyses. Maternal insecure attachment style is positively associated with alcohol use. The current study highlights a multiple pathway in this relationship. The results reveal the mediating effect of low conscientiousness and alexithymia between maternal insecure attachment and alcohol use. Athletes' alcohol use seems to be the result of a complex association of underlying psychological factors. © 2013.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilbanks, Matt C.; Yuter, S. E.; de Szoeke, S.
2015-09-01
Density currents (i.e. cold pools or outflows) beneath marine stratocumulus clouds are characterized using a 30-d data set of ship-based observations obtained during the 2008 Variability of American Monsoon Systems (VAMOS) Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx) in the southeast Pacific. An objective method identifies 71 density current fronts using an air density criterion and isolates each density current’s core (peak density) and tail (dissipating) zone. Compared to front and core zones, most density current tails exhibited weaker density gradients and wind anomalies elongated about the axis of the mean wind. The mean cloud-level advection relative to the surface layer wind (1.9 m s-1) nearly matches the mean density current propagation speed (1.8 m s-1). The similarity in speeds allows drizzle cells to deposit tails in their wakes. Based on high-resolution scanning Doppler lidar data, prefrontal updrafts had a mean intensity of 0.91 m s-1, reached an average altitude of 800 m, and were often surmounted by low-lying shelf clouds not connected to the overlying stratocumulus cloud. Nearly 90% of density currents were identified when C-band radar estimated 30-km diameter areal average rain rates exceeded 1 mm d-1. Rather than peaking when rain rates are highest overnight, density current occurrence peaks between 0600 and 0800 local solar time when enhanced local drizzle co-occurs with shallow subcloud dry and stable layers. The dry layers may contribute to density current formation by enhancing subcloud evaporation of drizzle. Density currents preferentially occur in regions of open cells but also occur in regions of closed cells.
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, Juliane; Tolson, Bryan
2017-04-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence testing method, we applied it to three widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al., 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010) and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned three methods are known.
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on model-independency by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations and therefore enables checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable published sensitivity results.
A line transect model for aerial surveys
Quang, Pham Xuan; Lanctot, Richard B.
1991-01-01
We employ a line transect method to estimate the density of the common and Pacific loon in the Yukon Flats National Wildlife Refuge from aerial survey data. Line transect methods have the advantage of automatically taking into account “visibility bias” due to detectability differences of animals at different distances from the transect line. However, when applied to aerial surveys, line transect methods must overcome two difficulties: a blind strip beneath the aircraft, and inaccurate recording of sighting distances due to high travel speeds, so that in practice only a few reliable distance-class counts are available. We propose a unimodal detection function that provides an estimate of the effective area lost due to the blind strip, under the assumption that a line of perfect detection exists parallel to the transect line. The unimodal detection function can also be applied when a blind strip is absent, and in certain instances when the maximum probability of detection is less than 100%. A simple bootstrap procedure to estimate standard error is illustrated. Finally, we present results from a small set of Monte Carlo experiments.
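The "simple bootstrap procedure" is not spelled out in the abstract; a common variant for line transect surveys resamples whole transects as the independent sampling units. The sketch below makes that assumption and treats each transect's estimated density as the datum:

```python
import random

def density_and_bootstrap_se(transect_densities, n_boot=1000, seed=0):
    """Overall density estimate (mean across transects) with a bootstrap
    standard error obtained by resampling whole transects with replacement,
    treating transects as the independent sampling units."""
    rng = random.Random(seed)
    n = len(transect_densities)
    estimate = sum(transect_densities) / n
    reps = []
    for _ in range(n_boot):
        sample = [rng.choice(transect_densities) for _ in range(n)]
        reps.append(sum(sample) / n)
    mean_rep = sum(reps) / n_boot
    se = (sum((r - mean_rep) ** 2 for r in reps) / (n_boot - 1)) ** 0.5
    return estimate, se
```

Resampling transects rather than individual sightings preserves the within-transect correlation of detections, which is why it is the usual unit of resampling in distance sampling.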
Heating and current drive requirements towards steady state operation in ITER
NASA Astrophysics Data System (ADS)
Poli, F. M.; Bonoli, P. T.; Kessel, C. E.; Batchelor, D. B.; Gorelenkova, M.; Harvey, B.; Petrov, Y.
2014-02-01
Steady state scenarios envisaged for ITER aim at optimizing the bootstrap current, while maintaining sufficient confinement and stability to provide the necessary fusion yield. Non-inductive scenarios will need to operate with Internal Transport Barriers (ITBs) in order to reach adequate fusion gain at typical currents of 9 MA. However, the large pressure gradients associated with ITBs in regions of weak or negative magnetic shear can be conducive to ideal MHD instabilities, reducing the no-wall limit. The E × B flow shear from toroidal plasma rotation is expected to be low in ITER, with a major role in the ITB dynamics being played by magnetic geometry. Combinations of H/CD sources that maintain weakly reversed magnetic shear profiles throughout the discharge are the focus of this work. Time-dependent transport simulations indicate that, with a trade-off of the EC equatorial and upper launcher, the formation and sustainment of quasi-steady state ITBs could be demonstrated in ITER with the baseline heating configuration. However, with proper constraints from peeling-ballooning theory on the pedestal width and height, the fusion gain and the maximum non-inductive current are below the ITER target. Upgrades of the heating and current drive system in ITER, like the use of Lower Hybrid current drive, could overcome these limitations, sustaining higher non-inductive current and confinement, more expanded ITBs which are ideal MHD stable.
Comparison of fusion alpha performance in JET advanced scenario and H-mode plasmas
NASA Astrophysics Data System (ADS)
Asunta, O.; Kurki-Suonio, T.; Tala, T.; Sipilä, S.; Salomaa, R.; contributors, JET-EFDA
2008-12-01
Currently, plasmas with internal transport barriers (ITBs) appear to be the most likely candidates for steady-state scenarios in future fusion reactors. In such plasmas, the broad hot and dense region in the plasma core leads to high fusion gain, while the cool edge protects the integrity of the first wall. The economically desirable large bootstrap current fraction and low inductive current drive may, however, lead to degraded fast ion confinement. In this work the confinement and heating profile of fusion alphas were compared between H-mode and ITB plasmas in realistic JET geometry. The work was carried out using the Monte Carlo-based guiding-center-following code ASCOT. For the same plasma current, the ITB discharges were found to produce four to eight times more fusion power than a comparable ELMy H-mode discharge. Unfortunately, the alpha particle losses were also larger (~16%) than in the H-mode discharge (7%). In the H-mode discharges, alpha power was deposited into the plasma symmetrically around the magnetic axis, whereas in the current-hole discharge, the power was spread out over a larger volume in the plasma center. This was due to wider particle orbits, and to the magnetic structure allowing for a broader hot region in the centre.
Exploration of High Harmonic Fast Wave Heating on the National Spherical Torus Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.R. Wilson; R.E. Bell; S. Bernabei
2003-02-11
High Harmonic Fast Wave (HHFW) heating has been proposed as a particularly attractive means for plasma heating and current drive in the high-beta plasmas that are achievable in spherical torus (ST) devices. The National Spherical Torus Experiment (NSTX) [Ono, M., Kaye, S.M., Neumeyer, S., et al., Proceedings, 18th IEEE/NPSS Symposium on Fusion Engineering, Albuquerque, 1999 (IEEE, Piscataway, NJ, 1999), p. 53] is such a device. A radio-frequency (rf) heating system has been installed on NSTX to explore the physics of HHFW heating and current drive via rf waves, and for use as a tool to demonstrate the attractiveness of the ST concept as a fusion device. To date, experiments have demonstrated many of the theoretical predictions for HHFW. In particular, strong wave absorption on electrons over a wide range of plasma parameters and wave parallel phase velocities, wave acceleration of energetic ions, and indications of current drive for directed wave spectra have been observed. In addition, HHFW heating has been used to explore the energy transport properties of NSTX plasmas, to create H-mode (high-confinement mode) discharges with a large fraction of bootstrap current, and to control the plasma current profile during the early stages of the discharge.
Particle Image Velocimetry Study of Density Current Fronts
ERIC Educational Resources Information Center
Martin, Juan Ezequiel
2009-01-01
Gravity currents are flows that occur when a horizontal density difference causes fluid to move under the action of gravity; density currents are a particular case, for which the scalar causing the density difference is conserved. Flows with a strong effect of the horizontal density difference, even if only partially driven by it--such as the…
Correcting magnetic probe perturbations on current density measurements of current carrying plasmas.
Knoblauch, P; Raspa, V; Di Lorenzo, F; Lazarte, A; Clausse, A; Moreno, C
2010-09-01
A method to infer the current density distribution in the current sheath of a plasma focus discharge from a magnetic probe is formulated and then applied to experimental data obtained in a 1.1 kJ device. Distortions of the magnetic probe signal caused by current redistribution and by a time-dependent total discharge current are considered simultaneously, leading to an integral equation for the current density. Two distinct, easy-to-implement numerical procedures are given to solve this equation. Experimental results show the coexistence of at least two maxima in the current density structure of a nitrogen sheath.
Impact of Te and ne on edge current density profiles in ELM mitigated regimes on ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Dunne, M. G.; Rathgeber, S.; Burckhart, A.; Fischer, R.; Giannone, L.; McCarthy, P. J.; Schneider, P. A.; Wolfrum, E.; the ASDEX Upgrade Team
2015-01-01
ELM resolved edge current density profiles are reconstructed using the CLISTE equilibrium code. As input, highly spatially and temporally resolved edge electron temperature and density profiles are used in addition to data from the extensive set of external poloidal field measurements available at ASDEX Upgrade, flux loop difference measurements, and current measurements in the scrape-off layer. Both the local and flux surface averaged current density profiles are analysed for several ELM mitigation regimes. The focus throughout is on the impact of altered temperature and density profiles on the current density. In particular, many ELM mitigation regimes rely on operation at high density. Two reference plasmas with type-I ELMs are analysed, one with a deuterium gas puff and one without, in order to provide a reference for the behaviour in type-II ELMy regimes and high density ELM mitigation with external magnetic perturbations at ASDEX Upgrade. For type-II ELMs it is found that while a similar pedestal top pressure is sustained at the higher density, the temperature gradient decreases in the pedestal. This results in lower local and flux surface averaged current densities in these phases, which reduces the drive for the peeling mode. No significant differences between the current density measured in the type-I phase and the ELM mitigated phase are seen when external perturbations are applied, though the pedestal top density was increased. Finally, ELMs during the nitrogen seeded phase of a high performance discharge are analysed and compared to ELMs in the reference phase. An increased pedestal pressure gradient, which is the source of confinement improvement in impurity seeded discharges, causes a local current density increase. However, the increased Zeff in the pedestal acts to reduce the flux surface averaged current density.
This dichotomy, which is not observed in other mitigation regimes, could act to stabilize both the ballooning mode and the peeling mode at the same time.
Silicon-Based Lithium-Ion Capacitor for High Energy and High Power Application
NASA Technical Reports Server (NTRS)
Wu, James J.; Demattia, Brianne; Loyselle, Patricia; Reid, Concha; Kohout, Lisa
2017-01-01
A Si-based Li-ion capacitor has been developed and demonstrated. The results show it is feasible to improve both power density and energy density in this configuration. The applied current density affects both: low current favors energy density, while high current favors power density. Activated carbon has a better rate capability than Si. Next steps and future directions: the Si electrode needs to be further studied and improved, along with further optimization of the Si/AC ratio and evaluation of its impact on energy density and power density.
Ider, Yusuf Ziya; Birgul, Ozlem; Oran, Omer Faruk; Arikan, Orhan; Hamamura, Mark J; Muftuler, L Tugan
2010-06-07
Fourier transform (FT)-based algorithms for magnetic resonance current density imaging (MRCDI) from one component of magnetic flux density have been developed for 2D and 3D problems. For 2D problems, where current is confined to the xy-plane and z-component of the magnetic flux density is measured also on the xy-plane inside the object, an iterative FT-MRCDI algorithm is developed by which both the current distribution inside the object and the z-component of the magnetic flux density on the xy-plane outside the object are reconstructed. The method is applied to simulated as well as actual data from phantoms. The effect of measurement error on the spatial resolution of the current density reconstruction is also investigated. For 3D objects an iterative FT-based algorithm is developed whereby the projected current is reconstructed on any slice using as data the Laplacian of the z-component of magnetic flux density measured for that slice. In an injected current MRCDI scenario, the current is not divergence free on the boundary of the object. The method developed in this study also handles this situation.
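For the 2D case above, with current confined to the xy-plane, Ampère's law reduces to differentiating the measured z-component of the magnetic flux density. A minimal numpy sketch of that relation (not the iterative FT algorithm of the paper; the grid, field, and function names are illustrative, and z-derivative terms are assumed negligible):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def current_from_bz(bz, dx, dy):
    """Recover in-plane current density from the z-component of B.

    For a 2D problem (current in the xy-plane, z-derivatives of the
    in-plane field neglected), Ampere's law gives
    Jx = (1/mu0) dBz/dy and Jy = -(1/mu0) dBz/dx.
    """
    dbz_dy, dbz_dx = np.gradient(bz, dy, dx)  # per-axis spacings
    jx = dbz_dy / MU0
    jy = -dbz_dx / MU0
    return jx, jy

# Synthetic check: Bz = mu0 * J0 * y corresponds to a uniform Jx = J0.
y, x = np.mgrid[0:64, 0:64] * 1e-3            # 1 mm grid
bz = MU0 * 100.0 * y                          # J0 = 100 A/m^2
jx, jy = current_from_bz(bz, 1e-3, 1e-3)
print(np.allclose(jx, 100.0), np.allclose(jy, 0.0))
```

The synthetic field is linear in y, so the finite-difference gradient recovers the uniform current density exactly, including at the grid edges.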
DeMonte, Tim P; Wang, Dinghui; Ma, Weijing; Gao, Jia-Hong; Joy, Michael L G
2009-01-01
Current density imaging (CDI) is a magnetic resonance imaging (MRI) technique used to quantitatively measure current density vectors throughout the volume of an object/subject placed in the MRI system. Electrical current pulses are applied externally to the object/subject and are synchronized with the MRI sequence. In this work, CDI is used to measure average current density magnitude in the torso region of an in-vivo piglet for applied current pulse amplitudes ranging from 10 mA to 110 mA. The relationship between applied current amplitude and current density magnitude is linear in simple electronic elements such as wires and resistors; however, this relationship may not be linear in living tissue. An understanding of this relationship is useful for research in defibrillation, human electro-muscular incapacitation (e.g. TASER(R)) and other bioelectric stimulation devices. This work will show that the current amplitude to current density magnitude relationship is slightly nonlinear in living tissue in the range of 10 mA to 110 mA.
Space-charge-limited currents for cathodes with electric field enhanced geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Dingguo, E-mail: laidingguo@nint.ac.cn; Qiu, Mengtong; Xu, Qifu
This paper presents approximate analytic solutions of the current density for annulus and circle cathodes. The current densities of annulus and circle cathodes are derived approximately from first principles and are in agreement with simulation results. The resulting scaling laws can predict the current densities of high-current vacuum diodes with annulus and circle cathodes in practical applications. To relate the current density to the electric field on the cathode surface, the existing analytical solutions for concentric-cylinder and sphere diodes are fitted in terms of electric field enhancement factors. It is found that the space-charge-limited current density for a cathode with field-enhanced geometry can be written in the general form J = g·β_E²·J₀, where J₀ is the classical (1D) Child-Langmuir current density, β_E is the electric field enhancement factor, and g is a geometrical correction factor depending on the cathode geometry.
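The general form quoted above builds on the classical 1D Child-Langmuir density, J₀ = (4ε₀/9)√(2e/mₑ)·V^(3/2)/d². A minimal sketch of that relation, with β_E and g treated as given inputs rather than derived from the diode geometry as in the paper:

```python
import math

EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # electron charge, C
M_E = 9.109e-31       # electron mass, kg

def child_langmuir_1d(voltage, gap):
    """Classical 1D Child-Langmuir current density J0 in A/m^2
    for a planar diode with voltage V and gap d."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage ** 1.5 / gap ** 2

def enhanced_geometry_current(voltage, gap, beta_e, g):
    """Space-charge-limited density in the paper's general form
    J = g * beta_E**2 * J0 (beta_E and g supplied by the caller)."""
    return g * beta_e ** 2 * child_langmuir_1d(voltage, gap)

# Planar limit: beta_E = 1 and g = 1 recover J0.
j0 = child_langmuir_1d(500e3, 0.01)   # 500 kV across a 1 cm gap
print(j0 == enhanced_geometry_current(500e3, 0.01, 1.0, 1.0))
```

The 500 kV / 1 cm numbers are illustrative only; for those values J₀ comes out near 8 × 10⁶ A/m².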
Serša, Igor; Kranjc, Matej; Miklavčič, Damijan
2015-01-01
Electroporation is gaining importance in everyday clinical practice of cancer treatment. For its success it is extremely important that the electric field coverage of the target tissue, i.e. the treated tumor, is within the specified range. Therefore, an efficient tool for monitoring the electric field in the tumor during delivery of electroporation pulses is needed. The electric field can be reconstructed by the magnetic resonance electric impedance tomography method from current density distribution data. In this study, the use of current density imaging with MRI for monitoring current density distribution during delivery of irreversible electroporation pulses was demonstrated. Using a modified single-shot RARE sequence, where four 3000 V, 100 μs long pulses were included at the start, the current distribution between a pair of electrodes inserted in a liver tissue sample was imaged. Two repetitions of the sequence, with phases of the refocusing radiofrequency pulses 90° apart, were needed to acquire one current density image. For each sample, 45 current density images in total were acquired to follow a standard protocol for irreversible electroporation in which 90 electric pulses are delivered at 1 Hz. The acquired current density images showed that the current density in the middle of the sample increased by 60% from the first to the last electric pulses, i.e. from 8 kA/m² to 13 kA/m², and that the direction of the current path did not change significantly with repeated pulses. The presented single-shot RARE-based current density imaging sequence was used successfully to image current distribution during delivery of short high-voltage electric pulses. The method has the potential to enable monitoring of tumor coverage by the electric field during irreversible electroporation tissue ablation.
Chang, Ling-Yin; Wu, Wen-Chi; Wu, Chi-Chen; Lin, Linen Nymphas; Yen, Lee-Lan; Chang, Hsing-Yi
2017-01-01
Peer victimization in children and adolescents is a serious public health concern. Growing evidence exists for negative consequences of peer victimization, but research has mostly been short term and little is known about the mechanisms that moderate and mediate the impacts of peer victimization on subsequent antisocial behavior. The current study intended to examine the longitudinal relationship between peer victimization in adolescence and antisocial behavior in young adulthood and to determine whether sleep problems influence this relationship. In total, 2006 adolescents participated in a prospective study from 2009 to 2013. The moderating role of sleep problems was examined by testing the significance of the interaction between peer victimization and sleep problems. The mediating role of sleep problems was tested by using bootstrapping mediational analyses. All analyses were conducted using SAS 9.3 software. We found that peer victimization during adolescence was positively and significantly associated with antisocial behavior in young adulthood (β = 0.10, p < 0.0001). This association was mediated, but not moderated by sleep problems. Specifically, peer victimization first increased levels of sleep problems, which in turn elevated the risk of antisocial behavior (indirect effect: 0.01, 95% bootstrap confidence interval: 0.004, 0.021). These findings imply that sleep problems may operate as a potential mechanism through which peer victimization during adolescence leads to increases in antisocial behavior in young adulthood. Prevention and intervention programs that target sleep problems may yield benefits for decreasing antisocial behavior in adolescents who have been victimized by peers. Copyright © 2016 Elsevier Ltd. All rights reserved.
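The bootstrapping mediational analysis described above (an indirect effect with a percentile bootstrap confidence interval) can be sketched as follows. The data are simulated and the regression helpers are illustrative, not the SAS 9.3 procedure the study used:

```python
import numpy as np

rng = np.random.default_rng(0)

def boot_indirect(x, m, y, n_boot=2000):
    """Percentile bootstrap CI for the indirect effect a*b in a
    simple X -> M -> Y mediation model.
    a: slope of M on X;  b: slope of Y on M controlling for X."""
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample cases
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]
        design = np.column_stack([mb, xb, np.ones(n)])
        b = np.linalg.lstsq(design, yb, rcond=None)[0][0]
        est.append(a * b)
    lo, hi = np.percentile(est, [2.5, 97.5])
    return lo, hi

# Simulated data with a true indirect effect of 0.5 * 0.5 = 0.25.
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=0.5, size=n)
y = 0.5 * m + 0.2 * x + rng.normal(scale=0.5, size=n)
lo, hi = boot_indirect(x, m, y)
print(lo > 0)   # CI excluding zero indicates significant mediation
```

As in the study, mediation is inferred when the bootstrap confidence interval for the indirect effect excludes zero.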
Efficient statistical tests to compare Youden index: accounting for contingency correlation.
Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan
2015-04-30
Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both one and two independent sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. Moreover, a paired sample test on the Youden index has been unavailable. This article develops efficient statistical inference procedures for one sample, independent, and paired sample tests on the Youden index by accounting for contingency correlation, namely associations between sensitivity and specificity and paired samples typically represented in contingency tables. For one and two independent sample tests, the variances are estimated by the Delta method, and the statistical inference is based on the central limit theorem; these are then verified by bootstrap estimates. For the paired samples test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
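For reference, the Youden index itself is a simple function of a 2x2 contingency table; the counts below are hypothetical:

```python
def youden_index(tp, fn, tn, fp):
    """Youden's J = sensitivity + specificity - 1, computed from the
    four cells of a diagnostic test's 2x2 contingency table."""
    sens = tp / (tp + fn)   # true positive rate
    spec = tn / (tn + fp)   # true negative rate
    return sens + spec - 1.0

# Hypothetical table: 90/100 diseased test positive, 80/100 healthy negative.
j = youden_index(tp=90, fn=10, tn=80, fp=20)
print(round(j, 3))  # 0.9 + 0.8 - 1 = 0.7
```

The tests developed in the article concern the sampling variance of this statistic, which involves the sensitivity-specificity association the simple point estimate ignores.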
Understanding London's Water Supply Tradeoffs When Scheduling Interventions Under Deep Uncertainty
NASA Astrophysics Data System (ADS)
Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.
2015-12-01
Water supply planning in many major world cities faces several challenges associated with but not limited to climate change, population growth and insufficient land availability for infrastructure development. Long-term plans to maintain supply-demand balance and ecosystem services require careful consideration of uncertainties associated with future conditions. The current approach for London's water supply planning utilizes least cost optimization of future intervention schedules with limited uncertainty consideration. Recently, the focus of the long-term plans has shifted from solely least cost performance to robustness and resilience of the system. Identifying robust scheduling of interventions requires optimizing over a statistically representative sample of stochastic inputs which may be computationally difficult to achieve. In this study we optimize schedules using an ensemble of plausible scenarios and assess how manipulating that ensemble influences the different Pareto-approximate intervention schedules. We investigate how a major stress event's location in time as well as the optimization problem formulation influence the Pareto-approximate schedules. A bootstrapping method that respects the non-stationary trend of climate change scenarios and ensures the even distribution of the major stress event in the scenario ensemble is proposed. Different bootstrapped hydrological scenario ensembles are assessed using many-objective scenario optimization of London's future water supply and demand intervention scheduling. However, such a "fixed" scheduling of interventions approach does not aim to embed flexibility or adapt effectively as the future unfolds. Alternatively, making decisions based on the observations of occurred conditions could help planners who prefer adaptive planning. We will show how rules to guide the implementation of interventions based on observations may result in more flexible strategies.
Jones, Adam G
2015-11-01
Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (I_s, the variance in relative mating success), the opportunity for selection (I, the variance in relative reproductive success) and the Bateman gradient (β_ss, the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach to the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate; BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
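The three Bateman variables defined above can be computed directly for a fully sampled population. This sketch is not BATEMANATER (which additionally handles incomplete sampling via maximum likelihood and bootstraps confidence intervals), and the toy counts are hypothetical:

```python
import numpy as np

def bateman_metrics(mates, offspring):
    """Opportunity for sexual selection I_s (variance in relative
    mating success), opportunity for selection I (variance in
    relative reproductive success), and Bateman gradient beta_ss
    (least-squares slope of relative reproductive success on
    relative mating success)."""
    mates = np.asarray(mates, float)
    offspring = np.asarray(offspring, float)
    rel_m = mates / mates.mean()
    rel_o = offspring / offspring.mean()
    i_s = rel_m.var()
    i = rel_o.var()
    beta_ss, _ = np.polyfit(rel_m, rel_o, 1)
    return i_s, i, beta_ss

# Toy population of five males (hypothetical mate and offspring counts).
i_s, i, beta_ss = bateman_metrics([0, 1, 1, 2, 4], [0, 2, 3, 4, 9])
print(i_s > 0 and i > 0 and beta_ss > 0)
```

A positive β_ss, as here, indicates that additional matings translate into additional offspring, the classic signature of sexual selection.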
HSX as an example of a resilient non-resonant divertor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, A.; Boozer, A. H.; Hegna, C. C.
This study provides an initial description of the resilient divertor properties of quasi-symmetric (QS) stellarators, using the HSX (Helically Symmetric eXperiment) configuration as a test case. Divertors in high-performance QS stellarators will need to be resilient to changes in plasma configuration that arise from the evolution of plasma pressure profiles and bootstrap currents. Resiliency is tested by examining the changes in strike point patterns from field-line following that arise from configurational changes: low strike point variation under large configuration changes corresponds to high resiliency. The HSX edge displays resilient properties with configuration changes arising from (1) the wall position, (2) the plasma current, and (3) the external coils. The resilient behavior is lost if large edge islands intersect the wall structure. The resilient edge properties are corroborated by heat flux calculations from fully 3-D plasma simulations using EMC3-EIRENE. Additionally, the strike point patterns are found to correspond to high-curvature regions of magnetic flux surfaces.
Exploration of spherical torus physics in the NSTX device
NASA Astrophysics Data System (ADS)
Ono, M.; Kaye, S. M.; Peng, Y.-K. M.; Barnes, G.; Blanchard, W.; Carter, M. D.; Chrzanowski, J.; Dudek, L.; Ewig, R.; Gates, D.; Hatcher, R. E.; Jarboe, T.; Jardin, S. C.; Johnson, D.; Kaita, R.; Kalish, M.; Kessel, C. E.; Kugel, H. W.; Maingi, R.; Majeski, R.; Manickam, J.; McCormack, B.; Menard, J.; Mueller, D.; Nelson, B. A.; Nelson, B. E.; Neumeyer, C.; Oliaro, G.; Paoletti, F.; Parsells, R.; Perry, E.; Pomphrey, N.; Ramakrishnan, S.; Raman, R.; Rewoldt, G.; Robinson, J.; Roquemore, A. L.; Ryan, P.; Sabbagh, S.; Swain, D.; Synakowski, E. J.; Viola, M.; Williams, M.; Wilson, J. R.; NSTX Team
2000-03-01
The National Spherical Torus Experiment (NSTX) is being built at Princeton Plasma Physics Laboratory to test the fusion physics principles for the spherical torus concept at the MA level. The NSTX nominal plasma parameters are R0 = 85 cm, a = 67 cm, R/a >= 1.26, Bt = 3 kG, Ip = 1 MA, q95 = 14, elongation κ <= 2.2, triangularity δ <= 0.5 and a plasma pulse length of up to 5 s. The plasma heating/current drive tools are high harmonic fast wave (6 MW, 5 s), neutral beam injection (5 MW, 80 keV, 5 s) and coaxial helicity injection. Theoretical calculations predict that NSTX should provide exciting possibilities for exploring a number of important new physics regimes, including very high plasma β, naturally high plasma elongation, high bootstrap current fraction, absolute magnetic well and high pressure driven sheared flow. In addition, the NSTX programme plans to explore fully non-inductive plasma startup as well as a dispersive scrape-off layer for heat and particle flux handling.
A high-efficiency low-voltage CMOS rectifier for harvesting energy in implantable devices.
Hashemi, S Saeid; Sawan, Mohamad; Savaria, Yvon
2012-08-01
We present, in this paper, a new full-wave CMOS rectifier dedicated to wirelessly powered low-voltage biomedical implants. It uses bootstrapped capacitors to reduce the effective threshold voltage of selected MOS switches, achieving a significant increase in overall power efficiency and a low voltage drop. The rectifier is therefore well suited to applications with low-voltage power supplies and large load currents. The rectifier topology does not require complex circuit design. The highest voltages available in the circuit are used to drive the gates of selected transistors in order to reduce leakage current and lower their channel on-resistance while maintaining high transconductance. The proposed rectifier was fabricated using the standard TSMC 0.18 μm CMOS process. When connected to a sinusoidal source of 3.3 V peak amplitude, it improves the overall power efficiency by 11% compared with the best recently published results, obtained with a gate cross-coupled structure.
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Ashley, Paul R.; Abushagur, Mustafa
2004-01-01
A charge density and current density model of a waveguide system has been developed to explore the effects of electric field electrode poling. An optical waveguide may be modeled during poling by considering the dielectric charge distribution, polarization charge distribution, and conduction charge generated by the poling field. These charge distributions are the source of the poling current densities. The model shows that boundary charge current density and polarization current density are the major sources of the currents measured during poling and thermally stimulated discharge. These charge distributions provide insight into the poling mechanisms and are directly related to E_A and α_r. Initial comparisons with experimental data show excellent correlation with the model results.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
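The core idea above, estimating power as the proportion of simulated replications whose bootstrap confidence interval for the indirect effect excludes zero, can be sketched in Python. The bmem package itself is in R; this simplified version omits its latent-variable, multiple-group, and nonnormal-data options, and the effect sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def mediation_power(n, a=0.3, b=0.3, n_rep=100, n_boot=200):
    """Monte Carlo power estimate for detecting the indirect effect
    a*b: simulate data, build a percentile-bootstrap CI per
    replication, and count how often the CI excludes zero."""
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        ab = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)
            ahat = np.polyfit(x[idx], m[idx], 1)[0]
            bhat = np.polyfit(m[idx], y[idx], 1)[0]
            ab.append(ahat * bhat)
        lo, hi = np.percentile(ab, [2.5, 97.5])
        hits += (lo > 0) or (hi < 0)
    return hits / n_rep

print(mediation_power(200) > 0.5)   # moderate paths, n = 200
```

With both paths at 0.3 and n = 200, the estimated power is high; shrinking n or the path coefficients drives it down, which is exactly the trade-off a power analysis is meant to map out.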
SOCIAL COMPETENCE AND PSYCHOLOGICAL VULNERABILITY: THE MEDIATING ROLE OF FLOURISHING.
Uysal, Recep
2015-10-01
This study examined whether flourishing mediated the relationship between social competence and psychological vulnerability. Participants were 259 university students (147 women, 112 men; M age = 21.3 yr., SD = 1.7) who completed the Turkish versions of the Perceived Social Competence Scale, the Flourishing Scale, and the Psychological Vulnerability Scale. Mediation models were tested using the bootstrapping method to examine indirect effects. Consistent with the hypotheses, the results indicated a positive relationship between social competence and flourishing, and a negative relationship between social competence and psychological vulnerability. Results of the bootstrapping method revealed that flourishing significantly mediated the relationship between social competence and psychological vulnerability. The significance and limitations of the results are discussed.
Yang, Yi-Feng
2014-02-01
This paper discusses the effects of transformational leadership on cooperative conflict resolution (management) by evaluating several alternative models related to the mediating role of job satisfaction and change commitment. Samples of data from customer service personnel in Taiwan were analyzed. Based on the bootstrap sample technique, an empirical study was carried out to yield the best fitting model. The procedure of hierarchical nested model analysis was used, incorporating the methods of bootstrapping mediation, PRODCLIN2, and structural equation modeling (SEM) comparison. The analysis suggests that leadership that promotes integration (change commitment) and provides inspiration and motivation (job satisfaction), in the proper order, creates the means for cooperative conflict resolution.
Yang, Yi-Feng
2016-08-01
This study discusses the influence of transformational leadership on job satisfaction through assessing six alternative models related to the mediators of leadership trust and change commitment utilizing a data sample (N = 341; M age = 32.5 year, SD = 5.2) for service promotion personnel in Taiwan. The bootstrap sampling technique was used to select the better fitting model. The tool of hierarchical nested model analysis was applied, along with the approaches of bootstrapping mediation, PRODCLIN2, and structural equation modeling comparison. The results overall demonstrate that leadership is important and that leadership role identification (trust) and workgroup cohesiveness (commitment) form an ordered serial relationship. © The Author(s) 2016.
Safety parameter considerations of anodal transcranial Direct Current Stimulation in rats.
Jackson, Mark P; Truong, Dennis; Brownlow, Milene L; Wagner, Jessica A; McKinley, R Andy; Bikson, Marom; Jankord, Ryan
2017-08-01
A commonly referenced transcranial Direct Current Stimulation (tDCS) safety threshold derives from tDCS lesion studies in the rat and relies on electrode current density (and related electrode charge density) to support clinical guidelines. Concerns about the role of polarity (e.g. anodal tDCS), sub-lesion threshold injury (e.g. neuroinflammatory processes), and the role of electrode montage across rodent and human studies support further investigation into animal models of tDCS safety. Thirty-two anesthetized rats received anodal tDCS between 0 and 5 mA for 60 min through one of three epicranial electrode montages. Tissue damage was evaluated using hematoxylin and eosin (H&E) staining, Iba-1 immunohistochemistry, and computational brain current density modeling. Brain lesion occurred after anodal tDCS at and above 0.5 mA using a 25.0 mm² electrode (electrode current density: 20.0 A/m²). Lesion initially occurred using smaller 10.6 mm² or 5.3 mm² electrodes at 0.25 mA (23.5 A/m²) and 0.5 mA (94.2 A/m²), respectively. Histological damage was correlated with computational brain current density predictions. Changes in microglial phenotype occurred in higher stimulation groups. Lesions were observed using anodal tDCS at an electrode current density of 20.0 A/m², which is below the previously reported safety threshold of 142.9 A/m² using cathodal tDCS. The lesion area is not simply predicted by electrode current density (and so not by charge density, as duration was fixed); rather, computational modeling suggests average brain current density as a better predictor for anodal tDCS. Nonetheless, under the assumption that rodent epicranial stimulation is a hypersensitive model, an electrode current density of 20.0 A/m² represents a conservative threshold for clinical tDCS, which typically uses an electrode current density of 2 A/m² when electrodes are placed on the skin (resulting in a lower brain current density). Copyright © 2017 Elsevier Inc. All rights reserved.
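The electrode current densities quoted above follow directly from current divided by electrode area; a quick unit-conversion check (the computed values match the quoted thresholds to within rounding):

```python
def electrode_current_density(current_ma, area_mm2):
    """Electrode current density in A/m^2 from a current in mA and an
    electrode area in mm^2 (1 mA over 1 mm^2 = 1000 A/m^2)."""
    return (current_ma * 1e-3) / (area_mm2 * 1e-6)

# Reproduce the lesion thresholds quoted above.
print(round(electrode_current_density(0.5, 25.0), 1))   # 25.0 mm^2 at 0.5 mA
print(round(electrode_current_density(0.25, 10.6), 1))  # 10.6 mm^2 at 0.25 mA
print(round(electrode_current_density(0.5, 5.3), 1))    # 5.3 mm^2 at 0.5 mA
```

This also makes the paper's caveat concrete: the same 0.5 mA produces very different electrode current densities on the 25.0 mm² and 5.3 mm² electrodes.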
Breaking the current density threshold in spin-orbit-torque magnetic random access memory
NASA Astrophysics Data System (ADS)
Zhang, Yin; Yuan, H. Y.; Wang, X. S.; Wang, X. R.
2018-04-01
Spin-orbit-torque magnetic random access memory (SOT-MRAM) is a promising technology for the next generation of data storage devices. The main bottleneck of this technology is the high reversal current density threshold. This outstanding problem is now solved by a new strategy in which the magnitude of the driven current density is fixed while the current direction varies with time. The theoretical limit of minimal reversal current density is only a fraction (the Gilbert damping coefficient) of the threshold current density of the conventional strategy. The Euler-Lagrange equation for the fastest magnetization reversal path and the optimal current pulse is derived for an arbitrary magnetic cell and arbitrary spin-orbit torque. The theoretical limit of minimal reversal current density and the current density for a GHz switching rate of the new reversal strategy for CoFeB/Ta SOT-MRAMs are, respectively, of the order of 10⁵ A/cm² and 10⁶ A/cm², far below the 10⁷ A/cm² and 10⁸ A/cm² of the conventional strategy. Furthermore, no external magnetic field is needed for a deterministic reversal in the new strategy.
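The claimed theoretical limit is simply the Gilbert damping coefficient times the conventional threshold; a trivial check with illustrative numbers (the α = 0.01 value is an assumption chosen to match the quoted orders of magnitude, not a figure from the paper):

```python
def minimal_reversal_density(j_threshold, alpha):
    """Theoretical minimum reversal current density of the
    time-varying-direction strategy: a fraction alpha (the Gilbert
    damping coefficient) of the conventional fixed-direction
    threshold current density."""
    return alpha * j_threshold

# A 1e7 A/cm^2 conventional threshold with alpha ~ 0.01 (assumed)
# gives the ~1e5 A/cm^2 order of magnitude quoted above.
print(round(minimal_reversal_density(1e7, 0.01)))  # 100000
```

Since typical damping coefficients are of order 10⁻², this two-orders-of-magnitude reduction is consistent with the 10⁵ versus 10⁷ A/cm² figures in the abstract.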
Chapter 8: Plasma operation and control
NASA Astrophysics Data System (ADS)
ITER Physics Expert Group on Disruptions, Control, Plasma, and MHD; ITER Physics Expert Group on Energetic Particles, Heating, Current and Drive; ITER Physics Expert Group on Diagnostics; ITER Physics Basis Editors
1999-12-01
Wall conditioning of fusion devices involves removal of desorbable hydrogen isotopes and impurities from interior device surfaces to permit reliable plasma operation. Techniques used in present devices include baking, metal film gettering, deposition of thin films of low-Z material, pulse discharge cleaning, glow discharge cleaning, radio frequency discharge cleaning, and in situ limiter and divertor pumping. Although wall conditioning techniques have become increasingly sophisticated, a reactor scale facility will involve significant new challenges, including the development of techniques applicable in the presence of a magnetic field and of methods for efficient removal of tritium incorporated into co-deposited layers on plasma facing components and their support structures. The current status of various approaches is reviewed, and the implications for reactor scale devices are summarized. Creation and magnetic control of shaped and vertically unstable elongated plasmas have been mastered in many present tokamaks. The physics of equilibrium control for reactor scale plasmas will rely on the same principles, but will face additional challenges, exemplified by the ITER/FDR design. The absolute positioning of outermost flux surface and divertor strike points will have to be precise and reliable in view of the high heat fluxes at the separatrix. Long pulses will require minimal control actions, to reduce accumulation of AC losses in superconducting PF and TF coils. To this end, more complex feedback controllers are envisaged, and the experimental validation of the plasma equilibrium response models on which such controllers are designed is encouraging. Present simulation codes provide an adequate platform on which equilibrium response techniques can be validated. Burning plasmas require kinetic control in addition to traditional magnetic shape and position control. 
Kinetic control refers to measures controlling density, rotation and temperature in the plasma core as well as in the plasma periphery and divertor. The planned diagnostics (Chapter 7) serve as sensors for kinetic control, while gas and pellet fuelling, auxiliary power and angular momentum input, impurity injection, and non-inductive current drive constitute the control actuators. For example, in an ignited plasma, core density controls fusion power output. Kinetic control algorithms vary according to the plasma state, e.g. H- or L-mode. Generally, present facilities have demonstrated the kinetic control methods required for a reactor scale device. Plasma initiation - breakdown, burnthrough and initial current ramp - in reactor scale tokamaks will not involve physics differing from that found in present day devices. For ITER, the induced electric field in the chamber will be ~0.3 V·m⁻¹ - comparable to that required by breakdown theory but somewhat smaller than in present devices. Thus, a start-up 3 MW electron cyclotron heating system will be employed to assure burnthrough. Simulations show that plasma current ramp-up and termination in a reactor scale device can follow procedures developed to avoid disruption in present devices. In particular, simulations remain in the stable area of the li-q plane. For design purposes, the resistive V·s consumed during initiation is found, by experiments, to follow the Ejima expression, 0.45 μ₀RIp. Advanced tokamak control has two distinct goals. First, control of density, auxiliary power, and inductive current ramping to attain reverse shear q profiles and internal transport barriers, which persist until dissipated by magnetic flux diffusion. Such internal transport barriers can lead to transient ignition. Second, the combined use of poloidal field shape control with non-inductive current drive and NBI angular momentum injection to create and control steady state, high bootstrap fraction, reverse shear discharges.
Active n = 1 magnetic feedback and/or driven rotation will be required to suppress resistive wall modes for steady state plasmas that must operate in the wall stabilized regime for reactor levels of β >= 0.03.
Bannikova, A A; Bulatova, N Sh; Kramerov, D A
2006-06-01
Genetic exchange among chromosomal races of the common shrew Sorex araneus and the problem of reproductive barriers have been extensively studied by means of such molecular markers as mtDNA, microsatellites, and allozymes. In the present study, the interpopulation and interracial polymorphism in the common shrew was assessed using fingerprints generated by amplification of DNA regions flanked by short interspersed repeats (SINEs): inter-SINE PCR (IS-PCR). We used primers complementary to consensus sequences of two short retroposons: the mammalian element MIR and the SOR element from the genome of Sorex araneus. Genetic differentiation among eleven populations of the common shrew from eight chromosome races was estimated. The NJ and MP analyses, as well as multidimensional scaling, showed that all samples examined grouped into two main clusters, corresponding to European Russia and Siberia. The bootstrap support of the European Russia cluster in the NJ and MP analyses was 76 and 61%, respectively. The bootstrap index for the Siberian cluster was 100% in both analyses; the Tomsk race, included in this cluster, was separated with a bootstrap support (NJ/MP) of 92/95%.
NASA Astrophysics Data System (ADS)
Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth
2016-05-01
A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
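The conditional resampling step described above can be sketched compactly. The following is a minimal, illustrative pure-Python version of a K-nearest-neighbor bootstrap (not the authors' code): for each simulated climate-index value, it finds the k historical years with the closest index values and draws one of their flows with weight proportional to 1/rank, a common choice of neighbor kernel. Function and variable names are hypothetical.

```python
import random

def knn_bootstrap(sim_index, hist_index, hist_flow, k=5, seed=42):
    """For each simulated climate-index value, find the k historical
    years with the nearest index values and draw one of their flows,
    with weight proportional to 1/rank (nearer neighbours more likely)."""
    rng = random.Random(seed)
    weights = [1.0 / j for j in range(1, k + 1)]
    wsum = sum(weights)
    flows = []
    for s in sim_index:
        # k historical years closest to the simulated index value
        order = sorted(range(len(hist_index)),
                       key=lambda i: abs(hist_index[i] - s))[:k]
        # sample one neighbour by cumulative 1/rank weight
        r = rng.random() * wsum
        pick, acc = order[-1], 0.0
        for rank, i in enumerate(order):
            acc += weights[rank]
            if r <= acc:
                pick = i
                break
        flows.append(hist_flow[pick])
    return flows
```

In the full scheme the conditioning variable would itself be a simulated PDO/AMO signal reconstructed from the wavelet-filtered components; this sketch only shows the conditional flow-resampling step.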
A bootstrapping method for development of Treebank
NASA Astrophysics Data System (ADS)
Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.
2017-01-01
Using statistical approaches besides the traditional methods of natural language processing (NLP) could significantly improve both the quality and performance of several NLP tasks. The effective use of these approaches depends on the availability of informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and linguistically rich elementary structure called a supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) corpus in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method could automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated by the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].
Confidence intervals for correlations when data are not normal.
Bishara, Anthony J; Hittner, James B
2017-02-01
With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, the rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval; for example, a nominal 95% confidence interval could have actual coverage as low as 68%. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoiding the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in the supplementary materials.
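For a concrete feel for the comparison, here is a minimal pure-Python sketch (not the paper's supplementary R code) of the two simplest intervals discussed: the classical Fisher z' interval and a percentile bootstrap interval for Pearson's r. Function names are illustrative.

```python
import math
import random

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def fisher_z_ci(x, y):
    """95% Fisher z' interval: normal approximation on atanh(r)."""
    r, n = pearson_r(x, y), len(x)
    z, half = math.atanh(r), 1.959964 * math.sqrt(1.0 / (n - 3))
    return math.tanh(z - half), math.tanh(z + half)

def percentile_bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap: resample (x, y) pairs with replacement and
    take empirical quantiles of the resampled correlations."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        rs.append(pearson_r([x[i] for i in idx], [y[i] for i in idx]))
    rs.sort()
    return rs[int(n_boot * alpha / 2)], rs[int(n_boot * (1 - alpha / 2)) - 1]
```

With strongly skewed or heavy-tailed data the two intervals can disagree markedly, which is the distortion the study quantifies.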
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra
2017-07-25
Low-cost flight-based hyperspectral imaging systems have the potential to provide important information for ecosystem and environmental studies as well as aid in land management. To realize this potential, methods must be developed to provide large-area surface reflectance data allowing for temporal data sets at the mesoscale. This paper describes a bootstrap method of producing a large-area, radiometrically referenced hyperspectral data set using the Landsat surface reflectance (LaSRC) data product as a reference target. The bootstrap method uses standard hyperspectral processing techniques that are extended to remove uneven illumination conditions between flight passes, allowing for radiometrically self-consistent data after mosaicking. Through selective spectral and spatial resampling, LaSRC data are used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from two hyperspectral flights over the same managed agricultural and unmanaged range land covering approximately 5.8 km², acquired on June 21, 2014 and June 24, 2015, are presented. Finally, data from a flight over agricultural land collected on June 6, 2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation.
Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.
Lee, Sunbok; Lei, Man-Kit; Brody, Gene H
2015-06-01
Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of a crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes of more than 500 to provide confidence intervals narrow enough to identify the location of the crossover point. (c) 2015 APA, all rights reserved.
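The basic construction can be illustrated with a small sketch. Assuming the crossover point is taken as the intersection of two separately fitted simple regression lines, a percentile-bootstrap confidence interval for it can be obtained by resampling cases with replacement within each group. This is an illustrative pure-Python version, not the authors' code; names are hypothetical.

```python
import random

def fit_line(xs, ys):
    """Closed-form OLS fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def crossover(group1, group2):
    """x-coordinate where the two fitted regression lines intersect."""
    a1, b1 = fit_line(*group1)
    a2, b2 = fit_line(*group2)
    return (a2 - a1) / (b1 - b2)

def bootstrap_crossover_ci(group1, group2, n_boot=2000, alpha=0.05, seed=11):
    """Percentile bootstrap CI for the crossover point; cases are
    resampled with replacement within each group."""
    rng = random.Random(seed)

    def resample(xs, ys):
        idx = [rng.randrange(len(xs)) for _ in xs]
        return [xs[i] for i in idx], [ys[i] for i in idx]

    points = sorted(crossover(resample(*group1), resample(*group2))
                    for _ in range(n_boot))
    return (points[int(n_boot * alpha / 2)],
            points[int(n_boot * (1 - alpha / 2)) - 1])
```

When the two slopes are similar, the resampled denominator b1 - b2 approaches zero and the interval can become abnormally wide, which is exactly the second issue the study examines.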
Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika
2015-01-01
This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, the Temperature Humidity Index (THI) and the Equivalent Temperature Index (ETI), are computed from the NASA Modern-Era Retrospective Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique, applied to the original time series to generate statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting behavior of the relationships between milk components and the climate parameters is visible. During spring, only a weak dependency of milk yield on climate variations is evident, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, environment and food with short-term data. PMID:28231215
Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.
2010-01-01
Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
Modality specificity and integration in working memory: Insights from visuospatial bootstrapping.
Allen, Richard J; Havelka, Jelena; Falcon, Thomas; Evans, Sally; Darling, Stephen
2015-05-01
The question of how meaningful associations between verbal and spatial information might be utilized to facilitate working memory performance is potentially highly instructive for models of memory function. The present study explored how separable processing capacities within specialized domains might each contribute to this, by examining the disruptive impacts of simple verbal and spatial concurrent tasks on young adults' recall of visually presented digit sequences encountered either in a single location or within a meaningful spatial "keypad" configuration. The previously observed advantage for recall in the latter condition (the "visuospatial bootstrapping effect") consistently emerged across 3 experiments, indicating use of familiar spatial information in boosting verbal memory. The magnitude of this effect interacted with concurrent activity; articulatory suppression during encoding disrupted recall to a greater extent when digits were presented in single locations (Experiment 1), while spatial tapping during encoding had a larger impact on the keypad condition and abolished the visuospatial bootstrapping advantage (Experiment 2). When spatial tapping was performed during recall (Experiment 3), no task by display interaction was observed. Outcomes are discussed within the context of the multicomponent model of working memory, with a particular emphasis on cross-domain storage in the episodic buffer (Baddeley, 2000). (c) 2015 APA, all rights reserved.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
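As a much simpler relative of the bootstrap procedures discussed above (and not part of the npROCRegression package), the following pure-Python sketch computes the empirical area under the ROC curve via the Mann-Whitney statistic, together with a percentile bootstrap interval obtained by resampling within the diseased and non-diseased groups. Names are illustrative.

```python
import random

def empirical_auc(nondiseased, diseased):
    """Mann-Whitney estimate of ROC area: the probability that a random
    diseased score exceeds a random non-diseased score (ties count 1/2)."""
    total = 0.0
    for d in diseased:
        for h in nondiseased:
            total += 1.0 if d > h else (0.5 if d == h else 0.0)
    return total / (len(diseased) * len(nondiseased))

def bootstrap_auc_ci(nondiseased, diseased, n_boot=1000, alpha=0.05, seed=3):
    """Percentile bootstrap CI for the AUC, resampling within each group."""
    rng = random.Random(seed)
    aucs = []
    for _ in range(n_boot):
        h = [nondiseased[rng.randrange(len(nondiseased))] for _ in nondiseased]
        d = [diseased[rng.randrange(len(diseased))] for _ in diseased]
        aucs.append(empirical_auc(h, d))
    aucs.sort()
    return aucs[int(n_boot * alpha / 2)], aucs[int(n_boot * (1 - alpha / 2)) - 1]
```

The covariate-specific methods in the paper generalize this idea by letting the whole ROC curve, not just its area, depend smoothly on patient covariates.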
Optimal geometry toward uniform current density electrodes
NASA Astrophysics Data System (ADS)
Song, Yizhuang; Lee, Eunjung; Woo, Eung Je; Seo, Jin Keun
2011-07-01
Electrodes are commonly used to inject current into the human body in various biomedical applications such as functional electrical stimulation, defibrillation, electrosurgery, RF ablation, impedance imaging, and so on. When a highly conducting electrode makes direct contact with biological tissues, the induced current density has strong singularity along the periphery of the electrode, which may cause painful sensation or burn. Especially in impedance imaging methods such as the magnetic resonance electrical impedance tomography, we should avoid such singularity since more uniform current density underneath a current-injection electrode is desirable. In this paper, we study an optimal geometry of a recessed electrode to produce a well-distributed current density on the contact area under the electrode. We investigate the geometry of the electrode surface to minimize the edge singularity and produce nearly uniform current density on the contact area. We propose a mathematical framework for the uniform current density electrode and its optimal geometry. The theoretical results are supported by numerical simulations.
Nasr Esfahani, Bahram; Moghim, Sharareh; Ghasemian Safaei, Hajieh; Moghoofei, Mohsen; Sedighi, Mansour; Hadifar, Shima
2016-01-01
Background: Taxonomic and phylogenetic studies of Mycobacterium species have for many years been based on the 16S rRNA gene. However, due to the high sequence similarity between species in the Mycobacterium genus (94.3% - 100%), defining a valid phylogenetic tree is difficult; consequently, its use in estimating the boundaries between species is limited. The sequence of the rpoB gene makes it an appropriate gene for phylogenetic analysis, especially in bacteria with limited variation. Objectives: In the present study, a 360 bp sequence of rpoB was used for precise classification of Mycobacterium strains isolated in Isfahan, Iran. Materials and Methods: From February to October 2013, 57 clinical and environmental isolates were collected, subcultured, and identified by phenotypic methods. After DNA extraction, a 360 bp fragment was PCR-amplified and sequenced. The phylogenetic tree was constructed based on consensus sequence data, using MEGA5 software. Results: Slow- and fast-growing groups of the Mycobacterium strains were clearly differentiated in the tree constructed from 56 common Mycobacterium isolates. Each species was identified at a unique position in the tree; in total, 13 nodes were supported with a bootstrap value of over 50%. Among the slow-growing group was Mycobacterium kansasii, with M. tuberculosis in a cluster with a bootstrap value of 98% and M. gordonae in another cluster with a bootstrap value of 90%. In the fast-growing group, one cluster with a bootstrap value of 89% was defined, including all fast-growing members present in this study. Conclusions: The results suggest that the rpoB gene sequence alone is sufficient for taxonomic categorization and definition of new Mycobacterium species, due to its high resolution power and the proper variation in its sequence (85% - 100%); the resulting tree has high validity. PMID:27284397
NASA Astrophysics Data System (ADS)
Nagai, Yukie; Hosoda, Koh; Morita, Akio; Asada, Minoru
This study addresses how human infants acquire the ability of joint attention through interactions with their caregivers, from the viewpoint of cognitive developmental robotics. In this paper, a mechanism by which a robot acquires sensorimotor coordination for joint attention through bootstrap learning is described. Bootstrap learning is a process by which a learner acquires higher capabilities through interactions with its environment based on embedded lower capabilities, even if the learner receives no external evaluation and the environment is not controlled. The proposed mechanism for bootstrap learning of joint attention consists of the robot's embedded mechanisms: visual attention and learning with self-evaluation. The former finds and attends to a salient object in the robot's field of view; the latter evaluates the success of visual attention, not joint attention, and then learns the sensorimotor coordination. Since the object the robot looks at based on visual attention does not always correspond to the object the caregiver is looking at in an environment containing multiple objects, the robot may encounter incorrect learning situations for joint attention as well as correct ones. However, the robot is expected to statistically discard the learning data of the incorrect situations as outliers, because their correlation between sensor input and motor output is weaker than that of the correct ones, and consequently to acquire appropriate sensorimotor coordination for joint attention even if the caregiver provides no task evaluation to the robot. The experimental results show the validity of the proposed mechanism. It is suggested that the proposed mechanism could explain the developmental mechanism of infants' joint attention, because the learning process of the robot's joint attention can be regarded as equivalent to the developmental process of that of infants.
Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; ...
2014-06-03
A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data; the resulting resistivity tomograph was used as the prior information for nonlinear inversion of the time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. The mean and standard deviation of CO₂ saturation were then calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6%, with a corresponding maximum saturation of 30%, for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation, but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data, and on inversion constraints such as temporal roughness. Five hundred realizations, requiring 3.5 h on a single 12-core node, were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances, while the Markov chain Monte Carlo (MCMC) stochastic inverse approach may require days for a global search. This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.
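Stripped of the ERT-specific machinery, the parametric bootstrap loop itself is simple. The sketch below is a generic, illustrative pure-Python version (not the authors' code): `invert` stands in for any deterministic, possibly nonlinear inversion, and the Gaussian noise model stands in for the observation-error model estimated from reciprocal measurements.

```python
import random

def parametric_bootstrap(data, noise_std, invert, n_boot=500, seed=0):
    """Parametric bootstrap UQ for a deterministic inversion: perturb the
    data with the fitted observation-error model, re-invert each
    realization, and report the mean and standard deviation."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        perturbed = [d + rng.gauss(0.0, noise_std) for d in data]
        estimates.append(invert(perturbed))
    mean = sum(estimates) / n_boot
    std = (sum((e - mean) ** 2 for e in estimates) / (n_boot - 1)) ** 0.5
    return mean, std
```

In the paper the inversion is the full nonlinear ERT solve with a resampled baseline model, which this sketch makes no attempt to reproduce; the point is only the resample-reinvert-summarize structure.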
Analyzing hospitalization data: potential limitations of Poisson regression.
Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R
2015-08-01
Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
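The two violations reported above, excess zeros and overdispersion, are easy to screen for before committing to Poisson regression. The following illustrative pure-Python helper (not from the paper) compares the sample variance with the mean and the observed zero fraction with the zero probability implied by a Poisson fit.

```python
import math

def poisson_diagnostics(counts):
    """Screen a count outcome for two Poisson-model violations:
    overdispersion (variance > mean) and excess zeros relative to the
    zero probability exp(-mean) implied by a Poisson distribution."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return {
        "mean": mean,
        "variance": var,
        "dispersion_ratio": var / mean,
        "observed_zero_fraction": sum(1 for c in counts if c == 0) / n,
        "poisson_zero_fraction": math.exp(-mean),
    }
```

A dispersion ratio well above 1 points toward negative binomial regression, and an observed zero fraction far above the Poisson value points toward a zero-inflated model, matching the NB/ZINB preference the study reports.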
A Bootstrap Approach to an Affordable Exploration Program
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has had a history of flight programs with high development and operational costs. Since Apollo, human exploration has had very constrained budgets, and they are expected to be constrained in the future. Due to their high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics cost. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of resources available. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows and becomes self-sustaining and eventually begins producing the energy, products and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials.
These technologies will exploit the space environment, minimize dependencies, and minimize the need for imported resources. They will provide the widest range of utility in a resource scarce environment and pave the way to an affordable exploration program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.
The nature of the multi-modal n=2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L- versus H-mode conditions. Varying the relative phase (ΔΦUL) between upper and lower in-vessel coils demonstrates that different n=2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same ΔΦUL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the ΔΦUL dependence of both the global confinement and the n=2 magnetic response. As the edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same ΔΦUL dependence. The LFS response magnitude is strongly sensitive to the core pressure and is insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and the correlated global confinement impact are unchanged with plasma pressure, but are strongly reduced in high-collisionality conditions in both H- and L-mode.
This experimentally suggests that the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, the modeling is revealed to be very sensitive to the details of the edge current profile and the equilibrium truncation. Furthermore, holding the truncation fixed, most HFS experimental trends are not captured, demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.
Paz-Soldan, C.; Logan, N. C.; Haskey, S. R.; ...
2016-03-31
The nature of the multi-modal n=2 plasma response and its impact on global confinement is studied as a function of the axisymmetric equilibrium pressure, edge safety factor, collisionality, and L- versus H-mode conditions. Varying the relative phase (Δφ_UL) between upper and lower in-vessel coils demonstrates that different n=2 poloidal spectra preferentially excite different plasma responses. These different plasma response modes are preferentially detected on the tokamak high-field side (HFS) or low-field side (LFS) midplanes, have different radial extents, couple differently to the resonant surfaces, and have variable impacts on edge stability and global confinement. In all equilibrium conditions studied, the observed confinement degradation shares the same Δφ_UL dependence as the coupling to the resonant surfaces given by both ideal (IPEC) and resistive (MARS-F) MHD computation. Varying the edge safety factor shifts the equilibrium field-line pitch and thus the Δφ_UL dependence of both the global confinement and the n=2 magnetic response. As the edge safety factor is varied, modeling finds that the HFS response (but not the LFS response), the resonant surface coupling, and the edge displacements near the X-point all share the same Δφ_UL dependence. The LFS response magnitude is strongly sensitive to the core pressure and insensitive to the collisionality and edge safety factor. This indicates that the LFS measurements are primarily sensitive to a pressure-driven kink-ballooning mode that couples to the core plasma. MHD modeling accurately reproduces these (and indeed all) LFS experimental trends and supports this interpretation. In contrast to the LFS, the HFS magnetic response and the correlated global confinement impact are unchanged with plasma pressure, but are strongly reduced in high-collisionality conditions in both H- and L-mode.
This suggests experimentally that the bootstrap current drives the HFS response through the kink-peeling mode drive, though surprisingly weak or no dependence on the bootstrap current is seen in modeling. Instead, modeling is revealed to be very sensitive to the details of the edge current profile and equilibrium truncation. Furthermore, holding truncation fixed, most HFS experimental trends are not captured, demonstrating a stark contrast between the robustness of the HFS experimental results and the sensitivity of its computation.
Eckermann, Simon; Karnon, Jon; Willan, Andrew R
2010-01-01
Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address: (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions (i)-(iv) relative to their simplicity of use. The expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question (i), given that the threshold EVPI needs to exceed varies with the cost of the research design, which can range from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate: it can be large while further research is not worthwhile, or small while further research is worthwhile. In contrast, each of questions (i)-(iv) is shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize the expected value of sample information (EVSI) minus expected costs across designs. In comparing the complexity of VOI methods in use, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation, and optimal design across jurisdictions.
More complex VOI methods, such as bootstrapping of partial EVPI, may have potential value in refining overall research design. However, Occam's razor must be seriously considered in applying these VOI methods, given their increased complexity and current limitations in informing decision making, being restricted to EVPI rather than EVSI and not allowing for important decision-making contexts. Initial use of CLT methods to focus these more complex partial VOI methods towards where they may be useful in refining optimal overall trial design is suggested. Integrating CLT methods with such partial VOI methods to allow estimation of partial EVSI is suggested for future research to add value to the current VOI toolkit.
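The EVPI/EVSI logic described above can be sketched for the simplest case of two alternatives with a normally distributed incremental net benefit, using a CLT/normal preposterior update. This is a minimal illustration under conjugate-normal assumptions, not the authors' implementation; all function names and the preposterior-variance formula are ours:

```python
import math

def _norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def e_max0(mu, sd):
    """E[max(0, X)] for X ~ Normal(mu, sd^2): expected net benefit of
    deciding after observing the information summarized by (mu, sd)."""
    if sd <= 0.0:
        return max(0.0, mu)
    z = mu / sd
    return mu * _norm_cdf(z) + sd * _norm_pdf(z)

def evpi(mu0, sd0):
    """Per-decision EVPI when incremental net benefit ~ Normal(mu0, sd0^2)."""
    return e_max0(mu0, sd0) - max(0.0, mu0)

def evsi(mu0, sd0, sigma_e, n):
    """Per-decision EVSI for a trial of n observations with per-observation
    standard deviation sigma_e. Under the CLT/normal preposterior, the
    updated mean is distributed Normal(mu0, sd0^4 / (sd0^2 + sigma_e^2/n))."""
    prepost_var = sd0 ** 4 / (sd0 ** 2 + sigma_e ** 2 / n)
    return e_max0(mu0, math.sqrt(prepost_var)) - max(0.0, mu0)

def optimal_n(mu0, sd0, sigma_e, pop, c_fixed, c_per, n_max=10000):
    """Question (iii): pick n maximizing population EVSI minus trial cost.
    Returns (best expected net benefit of sampling, best n); n = 0 means
    no further research is worthwhile, answering questions (i)-(ii)."""
    best_enbs, best_n = 0.0, 0
    for n in range(1, n_max + 1):
        enbs = pop * evsi(mu0, sd0, sigma_e, n) - (c_fixed + c_per * n)
        if enbs > best_enbs:
            best_enbs, best_n = enbs, n
    return best_enbs, best_n
```

Note that EVSI is bounded above by EVPI and approaches it as n grows, which is precisely why a large EVPI alone cannot settle question (i): whether sampling is worthwhile depends on EVSI net of the design's cost.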
MgB2 wire diameter reduction by hot isostatic pressing—a route for enhanced critical current density
NASA Astrophysics Data System (ADS)
Morawski, A.; Cetner, T.; Gajda, D.; Zaleski, A. J.; Häßler, W.; Nenkov, K.; Rindfleisch, M. A.; Tomsic, M.; Przysłupski, P.
2018-07-01
The effect of wire diameter reduction on the critical current density of pristine MgB2 wire was studied. Wires were treated by a hot isostatic pressing method at 570 °C and at pressures of up to 1.1 GPa. It was found that the wire diameter reduction induces an increase of up to 70% in the mass density of the superconducting cores. This feature leads to increases in critical current, critical current density, and pinning force density. The magnitude and field dependence of the critical current density are related to both grain connectivity and structural defects, which act as effective pinning centers. High-field transport properties were obtained without doping of the MgB2 phase. A critical current density j_c of 3500 A mm⁻² was reached at 4 K and 6 T for the best sample, a five-fold increase compared to MgB2 samples synthesized at ambient pressure.
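As a point of arithmetic, a quoted critical current density of this kind is simply the transport critical current divided by the cross-sectional area of the superconducting core, j_c = I_c / A. The sketch below illustrates the conversion; the 0.3 mm core diameter and the corresponding current are hypothetical numbers chosen for illustration, not values from the paper:

```python
import math

def critical_current_density(i_c_amps, core_diameter_mm):
    """j_c = I_c / A for a circular superconducting core.
    With A in mm^2, the result is in A/mm^2 (i.e. A mm^-2)."""
    area_mm2 = math.pi * (core_diameter_mm / 2.0) ** 2
    return i_c_amps / area_mm2

# Hypothetical illustration: a 0.3 mm diameter core carrying I_c ≈ 247 A
# corresponds to j_c ≈ 3500 A/mm^2, the best value quoted above.
```

The inverse-square dependence on diameter also makes clear why reducing the wire diameter (and densifying the core) is such an effective route to higher j_c: halving the core diameter quadruples j_c at fixed transport current.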