Modeling the operating history of turbine-generator units
NASA Astrophysics Data System (ADS)
Szczota, Mickael
Because of their ageing fleet, utility managers increasingly need tools to help them plan maintenance operations efficiently. Hydro-Québec started a project that aims to forecast the degradation of its hydroelectric runners and to use that information to classify the generating units. That classification will help identify which generating units are most at risk of undergoing a major failure. Cracking linked to fatigue is a predominant degradation mode, and the loading sequence applied to the runner is a parameter that affects crack growth. The aim of this thesis is therefore to build a generator of synthetic loading sequences that are statistically equivalent to the observed history. The simulated sequences will be used as input to a life-assessment model. We first describe how the generating units are operated by Hydro-Québec and analyse the available data; the analysis shows that the data are non-stationary. We then review modelling and validation methods. In the following chapter, particular attention is given to a precise description of the validation and comparison procedure. We then compare three kinds of model: discrete-time Markov chains, discrete-time semi-Markov chains, and the moving block bootstrap. For the first two models, we describe how to account for the non-stationarity. Finally, we show that the Markov chain is not suited to our case, and that the semi-Markov chains perform better when they include the non-stationarity. The final choice between semi-Markov chains and the moving block bootstrap depends on the user, but with a long-term vision we recommend semi-Markov chains for their flexibility. Keywords: Stochastic models, Model validation, Reliability, Semi-Markov chains, Markov chains, Bootstrap
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km²) within the Cantareira water supply system in Brazil monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different numbers of replicates or sample sizes (N = 500; ...; 1000), then applied to the conventional bootstrap approach and to variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods over the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods.
The uncertainties arising from the assessment of EFR methods will be propagated through water-security indicators of water scarcity and vulnerability, seeking to provide meaningful support to end users and water managers incorporating uncertainty into the decision-making process.
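A minimal sketch of the moving-blocks bootstrap applied to a flow-duration-curve statistic such as Q90, as described above. The record length, block length, and flow values below are illustrative assumptions, not taken from the study:

```python
import random

def q_exceedance(flows, p):
    """Flow exceeded p% of the time (nearest-rank point on the flow duration curve)."""
    ordered = sorted(flows, reverse=True)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

def moving_block_bootstrap(series, block_len, rng):
    """Resample a series by concatenating randomly chosen overlapping blocks,
    preserving short-range serial dependence within each block."""
    n = len(series)
    starts = range(n - block_len + 1)
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

rng = random.Random(42)
# Hypothetical daily streamflow record (m^3/s); a real study would load the gauge data.
flows = [max(0.1, 5 + 3 * random.Random(i).gauss(0, 1)) for i in range(1000)]

q90_estimates = []
for _ in range(500):  # e.g. N = 500 bootstrap replicates
    resample = moving_block_bootstrap(flows, block_len=30, rng=rng)
    q90_estimates.append(q_exceedance(resample, 90))

q90_estimates.sort()
lo, hi = q90_estimates[int(0.025 * 500)], q90_estimates[int(0.975 * 500)]
print(f"Q90 point estimate: {q_exceedance(flows, 90):.2f}")
print(f"95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")
```

The same resampling loop can be wrapped around Q75, Q50, Q7,10, or the presumptive standard to compare the sampling uncertainty of each EFR method.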
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of the Pearson correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most frequently used statistical methods in climate science. Various methods are used to estimate a confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), whose main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique that performs a second bootstrap loop, resampling from the bootstrap resamples. Like the non-calibrated bootstrap confidence intervals, it is robust against the data distribution. Pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with the performance of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20.
One form of climate time series is output from numerical models that simulate the climate system. The method is applied to model data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables at a 10-year lag, which is roughly the time it takes Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
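A minimal sketch of the pairwise moving block bootstrap used above: blocks of index positions are drawn once and applied to both series, so the serial correlation of each series and the cross-correlation between them survive resampling. PearsonT3 builds Student's t intervals from the bootstrap standard error and then calibrates them with a second bootstrap loop; for brevity this sketch shows only the inner percentile interval, on hypothetical AR(1)-type series:

```python
import math
import random

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def pairwise_mbb(x, y, block_len, rng):
    """Resample (x, y) jointly in blocks: one set of block starts drives both series."""
    n = len(x)
    starts = range(n - block_len + 1)
    idx = []
    while len(idx) < n:
        s = rng.choice(starts)
        idx.extend(range(s, s + block_len))
    idx = idx[:n]
    return [x[i] for i in idx], [y[i] for i in idx]

rng = random.Random(1)
# Hypothetical serially dependent pair of series (stand-ins for real climate data).
x, y = [0.0], [0.0]
for _ in range(199):
    e = rng.gauss(0, 1)
    x.append(0.5 * x[-1] + e)
    y.append(0.5 * y[-1] + 0.8 * e + 0.6 * rng.gauss(0, 1))

reps = sorted(pearson_r(*pairwise_mbb(x, y, 10, rng)) for _ in range(1000))
print(f"r = {pearson_r(x, y):.3f}, 95% MBB percentile CI: [{reps[24]:.3f}, {reps[974]:.3f}]")
```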
Conformal Bootstrap in Mellin Space
NASA Astrophysics Data System (ADS)
Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda
2017-02-01
We propose a new approach to analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. It combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks, rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.
Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D
2015-10-01
This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks, with predictions using a single breed as training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distances were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessment of predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis to effectively calibrate whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy, over predictions using individual SNP and training the model with a single breed. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
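A minimal sketch of the bootstrap comparison of prediction reliabilities described above: test-set animals are resampled with replacement, and the accuracy gain of one prediction scenario over another is recomputed on each resample. The data and noise levels below are hypothetical stand-ins, not values from the study:

```python
import random

def corr(u, v):
    """Pearson correlation, used here as a proxy for prediction accuracy."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    suv = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    suu = sum((a - mu) ** 2 for a in u)
    svv = sum((b - mv) ** 2 for b in v)
    return suv / (suu * svv) ** 0.5

rng = random.Random(7)
# Hypothetical test set: true breeding values plus predictions from two scenarios.
true = [rng.gauss(0, 1) for _ in range(300)]
pred_snp = [t + rng.gauss(0, 0.9) for t in true]  # e.g. single breed, individual SNP
pred_hap = [t + rng.gauss(0, 0.8) for t in true]  # e.g. combined population, haplotype blocks

gains = []
for _ in range(1000):
    idx = [rng.randrange(300) for _ in range(300)]  # resample animals with replacement
    t = [true[i] for i in idx]
    gains.append(corr(t, [pred_hap[i] for i in idx]) - corr(t, [pred_snp[i] for i in idx]))
gains.sort()
print(f"mean accuracy gain: {sum(gains) / 1000:.3f}, 95% CI [{gains[24]:.3f}, {gains[974]:.3f}]")
```

The paper goes further, plotting the joint bootstrap distribution of the two reliabilities as confidence ellipses; the paired resamples above are exactly the input such ellipses need.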
Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors
NASA Astrophysics Data System (ADS)
Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.
2012-12-01
Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem because it is nearly intractable theoretically. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
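One way the block bootstrap enters trend estimation is via a residual block bootstrap around a fitted regression line: the residuals carry the autocorrelation, so they are resampled in blocks and added back to the fit. This is a sketch of that idea under assumed data (a linear trend plus AR(1) noise), not the contribution's exact recipe, which also adds parametric timescale simulation:

```python
import random

def ols_slope(t, y):
    """Ordinary least squares slope of y on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    return sum((a - mt) * (b - my) for a, b in zip(t, y)) / sum((a - mt) ** 2 for a in t)

def block_resample(series, block_len, rng):
    """Concatenate randomly chosen overlapping blocks to the original length."""
    starts = range(len(series) - block_len + 1)
    out = []
    while len(out) < len(series):
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:len(series)]

rng = random.Random(3)
n, true_slope = 150, 0.02
# Hypothetical proxy series: linear trend plus AR(1) noise.
noise = [0.0]
for _ in range(n - 1):
    noise.append(0.6 * noise[-1] + rng.gauss(0, 0.5))
t = list(range(n))
y = [true_slope * ti + e for ti, e in zip(t, noise)]

slope = ols_slope(t, y)
resid = [yi - slope * ti for ti, yi in zip(t, y)]

reps = []
for _ in range(1000):
    rb = block_resample(resid, 15, rng)  # blocks preserve the residual autocorrelation
    reps.append(ols_slope(t, [slope * ti + e for ti, e in zip(t, rb)]))
reps.sort()
print(f"slope {slope:.4f}, 95% MBB CI [{reps[24]:.4f}, {reps[974]:.4f}]")
```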
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long-multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
NASA Technical Reports Server (NTRS)
Garner, Gregory G.; Thompson, Anne M.
2013-01-01
An ensemble statistical post-processor (ESP) is developed for the National Air Quality Forecast Capability (NAQFC) to address the unique challenges of forecasting surface ozone in Baltimore, MD. Air quality and meteorological data were collected from the eight monitors that constitute the Baltimore forecast region. These data were used to build the ESP using a moving-block bootstrap, regression tree models, and extreme-value theory. The ESP was evaluated using a 10-fold cross-validation to avoid evaluation with the same data used in the development process. Results indicate that the ESP is conditionally biased, likely due to slight overfitting while training the regression tree models.
When viewed from the perspective of a decision-maker, the ESP provides a wealth of additional information previously not available through the NAQFC alone. The user is provided the freedom to tailor the forecast to the decision at hand by using decision-specific probability thresholds that define a forecast for an ozone exceedance. Taking advantage of the ESP, the user not only receives an increase in value over the NAQFC, but also receives value for…
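The 10-fold cross-validation used to evaluate the ESP can be sketched as follows; the fold construction is the standard one (the model-fitting step is omitted, and the data size is an arbitrary placeholder):

```python
import random

def k_fold_indices(n, k, rng):
    """Shuffle indices once, then deal them into k nearly equal folds."""
    idx = list(range(n))
    rng.shuffle(idx)
    return [idx[i::k] for i in range(k)]

rng = random.Random(0)
folds = k_fold_indices(800, 10, rng)
for test_fold in folds:
    train = [j for f in folds if f is not test_fold for j in f]
    # Fit the post-processor on `train`, evaluate on `test_fold` (fitting omitted here),
    # so no observation is ever used for both development and evaluation.
    assert set(train).isdisjoint(test_fold)
print(f"{len(folds)} folds, sizes {[len(f) for f in folds]}")
```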
Harmony of spinning conformal blocks
NASA Astrophysics Data System (ADS)
Schomerus, Volker; Sobko, Evgeny; Isachenkov, Mikhail
2017-03-01
Conformal blocks for correlation functions of tensor operators play an increasingly important role for the conformal bootstrap programme. We develop a universal approach to such spinning blocks through the harmonic analysis of certain bundles over a coset of the conformal group. The resulting Casimir equations are given by a matrix version of the Calogero-Sutherland Hamiltonian that describes the scattering of interacting spinning particles in a 1-dimensional external potential. The approach is illustrated in several examples including fermionic seed blocks in 3D CFT where they take a very simple form.
Closure of the operator product expansion in the non-unitary bootstrap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Crossing symmetry in alpha space
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; van Rees, Balt C.
2017-11-01
We initiate the study of the conformal bootstrap using Sturm-Liouville theory, specializing to four-point functions in one-dimensional CFTs. We do so by decomposing conformal correlators using a basis of eigenfunctions of the Casimir which are labeled by a complex number α. This leads to a systematic method for computing conformal block decompositions. Analyzing bootstrap equations in alpha space turns crossing symmetry into an eigenvalue problem for an integral operator K. The operator K is closely related to the Wilson transform, and some of its eigenfunctions can be found in closed form.
Iliesiu, Luca; Kos, Filip; Poland, David; ...
2016-03-17
We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. Finally, we also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
Causality constraints in conformal field theory
Hartman, Thomas; Jain, Sachin; Kundu, Sandipan
2016-05-17
Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well known sign constraint on the (∂Φ)⁴ coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.
Reliability of reservoir firm yield determined from the historical drought of record
Archfield, S.A.; Vogel, R.M.
2005-01-01
The firm yield of a reservoir is typically defined as the maximum yield that could have been delivered without failure during the historical drought of record. In the future, reservoirs will experience droughts that are either more or less severe than the historical drought of record. The question addressed here is what the reliability of such systems will be when operated at the firm yield. To address this question, we examine the reliability of 25 hypothetical reservoirs sited across five locations in the central and western United States. These locations provided a continuous 756-month streamflow record spanning the same time interval. The firm yield of each reservoir was estimated from the historical drought of record at each location. To determine the steady-state monthly reliability of each firm-yield estimate, 12,000-month synthetic records were generated using the moving-blocks bootstrap method. Bootstrapping was repeated 100 times for each reservoir to obtain an average steady-state monthly reliability R, the number of months the reservoir did not fail divided by the total months. Values of R were greater than 0.99 for 60 percent of the study reservoirs; the other 40 percent ranged from 0.95 to 0.98. Estimates of R were highly correlated with both the level of development (ratio of firm yield to average streamflow) and average lag-1 monthly autocorrelation. Together these two predictors explained 92 percent of the variability in R, with the level of development alone explaining 85 percent of the variability. Copyright ASCE 2005.
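A minimal sketch of the procedure above: a long synthetic inflow record is generated with the moving-blocks bootstrap, then run through a simple mass-balance simulation at the firm yield to estimate steady-state monthly reliability R. The inflow record, capacity, firm yield, and block length below are illustrative assumptions, not values from the study:

```python
import random

def simulate_reliability(inflows, capacity, yield_, start):
    """Fraction of months the reservoir meets the yield under simple mass-balance bookkeeping."""
    storage, ok = start, 0
    for q in inflows:
        storage = min(capacity, storage + q) - yield_
        if storage >= 0:
            ok += 1
        else:
            storage = 0.0  # failure month: deliver only what is available
    return ok / len(inflows)

def moving_blocks(series, block_len, n_out, rng):
    """Build an n_out-long synthetic record from randomly chosen overlapping blocks."""
    starts = range(len(series) - block_len + 1)
    out = []
    while len(out) < n_out:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n_out]

rng = random.Random(11)
# Hypothetical 756-month inflow record (arbitrary units).
hist = [max(0.0, 10 + 4 * rng.gauss(0, 1)) for _ in range(756)]
capacity, firm_yield = 120.0, 8.0  # assumed values for illustration

R = []
for _ in range(100):  # 100 bootstrap repetitions of a 12,000-month synthetic record
    synth = moving_blocks(hist, block_len=24, n_out=12000, rng=rng)
    R.append(simulate_reliability(synth, capacity, firm_yield, start=capacity))
print(f"mean steady-state monthly reliability R = {sum(R) / len(R):.4f}")
```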
$\mathcal{N} = 4$ superconformal bootstrap of the K3 CFT
Lin, Ying-Hsuan; Shao, Shu-Heng; Simmons-Duffin, David; ...
2017-05-23
We study two-dimensional (4, 4) superconformal field theories of central charge c = 6, corresponding to nonlinear sigma models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N = 4 superconformal blocks with c = 6 and bosonic Virasoro conformal blocks with c = 28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A1 N = 4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence between a class of BPS N = 2 superconformal blocks and Virasoro conformal blocks in two dimensions, and an upper bound on the four-point functions of operators of sufficiently low scaling dimension in three and four dimensional CFTs.
Unitary subsector of generalized minimal models
NASA Astrophysics Data System (ADS)
Behan, Connor
2018-05-01
We revisit the line of nonunitary theories that interpolate between the Virasoro minimal models. Numerical bootstrap applications have brought about interest in the four-point function involving the scalar primary of lowest dimension. Using recent progress in harmonic analysis on the conformal group, we prove the conjecture that global conformal blocks in this correlator appear with positive coefficients. We also compute many such coefficients in the simplest mixed correlator system. Finally, we comment on the status of using global conformal blocks to isolate the truly unitary points on this line.
This Isn't Business, It's Personal: Personal Narratives in the Field of Composition Studies
ERIC Educational Resources Information Center
Golar, Norman
2010-01-01
I focus on three critical autobiographies in the field of composition studies: Mike Rose's "Lives on the Boundary: A Moving Account of the Struggles and Achievements of America's Educationally Underprepared," Keith Gilyard's "Voices of the Self: A Study of Language Competence," and Victor Villanueva, Jr.'s "Bootstraps: From an American Academic of…
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests, where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on a within-subjects experimental design with 32 cells and 33 participants.
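The naive percentile bootstrap mentioned above can be sketched as follows: resample units with replacement, recompute the estimate, and use the empirical percentiles of the bootstrap distribution both as a confidence interval and as a Wald-type test (reject H0: effect = 0 when 0 falls outside the interval). The per-participant effect values are hypothetical placeholders:

```python
import random

rng = random.Random(5)
# Hypothetical per-participant effect estimates (e.g. a task-switch cost per participant).
effects = [rng.gauss(30, 40) for _ in range(33)]  # 33 participants, as in the application

n = len(effects)
boot_means = []
for _ in range(2000):
    sample = [effects[rng.randrange(n)] for _ in range(n)]  # resample participants
    boot_means.append(sum(sample) / n)
boot_means.sort()
lo, hi = boot_means[49], boot_means[1949]  # naive 95% percentile interval
print(f"effect {sum(effects) / n:.1f}, percentile bootstrap 95% CI [{lo:.1f}, {hi:.1f}]")
# Wald-type decision at the 5% level: reject H0 iff 0 lies outside [lo, hi].
print("reject H0" if not (lo <= 0 <= hi) else "fail to reject H0")
```

The BCa variant the paper also considers adjusts these percentile endpoints for bias and skewness; the resampling loop is identical.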
(2, 2) superconformal bootstrap in two dimensions
Lin, Ying -Hsuan; Shao, Shu -Heng; Wang, Yifan; ...
2017-05-19
We find a simple relation between two-dimensional BPS N = 2 superconformal blocks and bosonic Virasoro conformal blocks, which allows us to analyze the crossing equations for BPS 4-point functions in unitary (2, 2) superconformal theories numerically with semidefinite programming. Here, we constrain gaps in the non-BPS spectrum through the operator product expansion of BPS operators, in ways that depend on the moduli of exactly marginal deformations through chiral ring coefficients. In some cases, our bounds on the spectral gaps are observed to be saturated by free theories, by N = 2 Liouville theory, and by certain Landau-Ginzburg models.
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
Robust control charts in industrial production of olive oil
NASA Astrophysics Data System (ADS)
Grilo, Luís M.; Mateus, Dina M. R.; Alves, Ana C.; Grilo, Helena L.
2014-10-01
Acidity is one of the most important variables in the quality analysis and characterization of olive oil. During industrial production we use individuals and moving range charts to monitor this variable, which is not always normally distributed. After a brief exploratory data analysis, in which we use the bootstrap method, we construct control charts before and after a Box-Cox transformation, and compare their robustness and performance.
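A minimal sketch of the individuals and moving-range (I-MR) chart limits mentioned above, using the standard control-chart constants d2 = 1.128 and D4 = 3.267 for subgroups of size 2. The acidity values are hypothetical, and a Box-Cox transformation (as in the study) would be applied to the data before computing the limits when normality fails:

```python
import statistics

def imr_limits(x):
    """Control limits for individuals (I) and moving-range (MR) charts."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]
    mr_bar = statistics.mean(mr)
    centre = statistics.mean(x)
    sigma_hat = mr_bar / 1.128          # d2 for n = 2
    return {
        "I": (centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat),
        "MR": (0.0, mr_bar, 3.267 * mr_bar),  # D3 = 0, D4 = 3.267 for n = 2
    }

# Hypothetical acidity measurements (% oleic acid) from successive batches.
acidity = [0.32, 0.35, 0.30, 0.33, 0.36, 0.31, 0.34, 0.38, 0.33, 0.32]
limits = imr_limits(acidity)
lcl, cl, ucl = limits["I"]
print(f"I chart:  LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")
out = [v for v in acidity if not lcl <= v <= ucl]
print("out-of-control points:", out or "none")
```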
Bootstrapped Learning Analysis and Curriculum Development Environment (BLADE)
2012-02-01
...framework. Development of the automated teacher: the software development aspect of the BL program was conducted primarily in the Java programming... parameters are analogous to Java class data members or to fields in a C structure. Here is an example composite IL object from Blocks World, an... In phases 2 and 3, alternative methods of implementing generators were developed, first in Java, later in Ruby. Both of these alternatives lowered the
NASA Astrophysics Data System (ADS)
Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth
2016-05-01
A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
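A minimal sketch of the conditional K-Nearest Neighbor bootstrap in the final step above: for each simulated year, a historical year is drawn from the k neighbors whose climate-index values lie closest to the simulated index, with the classic decreasing 1/rank weights. The (index, flow) pairs are hypothetical stand-ins for the (PDO/AMO state, Lees Ferry flow) data:

```python
import random

def knn_conditional_sample(index_value, history, k, rng):
    """Draw a flow from the k historical years nearest to index_value,
    weighting closer neighbours more heavily (weights 1/1, 1/2, ..., 1/k)."""
    ranked = sorted(history, key=lambda rec: abs(rec[0] - index_value))[:k]
    weights = [1.0 / (i + 1) for i in range(k)]
    r, acc = rng.random() * sum(weights), 0.0
    for rec, w in zip(ranked, weights):
        acc += w
        if r <= acc:
            return rec[1]
    return ranked[-1][1]

rng = random.Random(9)
# Hypothetical (climate index, annual flow) pairs from an observed/paleo record.
history = [(rng.gauss(0, 1), max(5.0, rng.gauss(15, 4))) for _ in range(100)]

# Simulate 50 years of flow conditioned on a simulated climate-index trajectory
# (in the paper, that trajectory itself comes from a wavelet-based block bootstrap).
sim_index = [rng.gauss(0, 1) for _ in range(50)]
sim_flow = [knn_conditional_sample(v, history, k=10, rng=rng) for v in sim_index]
print(f"simulated mean flow: {sum(sim_flow) / len(sim_flow):.2f}")
```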
ERIC Educational Resources Information Center
Chihak, Benjamin J.; Plumert, Jodie M.; Ziemer, Christine J.; Babu, Sabarish; Grechkin, Timofey; Cremer, James F.; Kearney, Joseph K.
2010-01-01
Two experiments examined how 10- and 12-year-old children and adults intercept moving gaps while bicycling in an immersive virtual environment. Participants rode an actual bicycle along a virtual roadway. At 12 test intersections, participants attempted to pass through a gap between 2 moving, car-sized blocks without stopping. The blocks were…
Paleomagnetic constraints on deformation of superfast-spread oceanic crust exposed at Pito Deep Rift
NASA Astrophysics Data System (ADS)
Horst, A. J.; Varga, R. J.; Gee, J. S.; Karson, J. A.
2011-12-01
The uppermost oceanic crust produced at the superfast spreading (~142 km Ma-1, full spreading rate) southern East Pacific Rise (EPR) during the Gauss Chron is exposed in a tectonic window along the northeastern wall of the Pito Deep Rift. Paleomagnetic analysis of fully oriented dike (62) and gabbro (5) samples from two adjacent study areas yields bootstrapped mean remanence directions of 38.9° ± 8.1°, -16.7° ± 15.6°, n = 23 (Area A) and 30.4° ± 8.0°, -25.1° ± 12.9°, n = 44 (Area B), both of which are significantly distinct from the expected Geocentric Axial Dipole direction at 23° S. Regional tectonics and outcrop-scale structural data combined with the bootstrapped remanence directions constrain models involving a sequence of three rotations that restore the dikes to subvertical orientations, related to (1) inward tilting of crustal blocks during spreading (Area A = 11°, Area B = 22°), (2) clockwise, vertical-axis rotation of the Easter Microplate (A = 46°, B = 44°), and (3) block tilting at Pito Deep Rift (A = 21°, B = 10°). These data support a structural model for accretion at the southern EPR in which outcrop-scale faulting and block rotation accommodate spreading-related subaxial subsidence that is generally less than that observed in crust generated at a fast spreading rate exposed at Hess Deep Rift. These data also support previous estimates of the clockwise rotation of crust adjacent to the Easter Microplate. Dike sample natural remanent magnetization (NRM) has an arithmetic mean of 5.96 A/m ± 3.76, which suggests that the dikes contribute significantly to observed magnetic anomalies from fast- to superfast-spread crust.
15. Detail of 1946 chain rack slot for moving blocks ...
15. Detail of 1946 chain rack slot for moving blocks on floor of drydock in far SE corner of South Section. In 1994 these chain rack slots and accompanying blocking mechanisms were no longer in use and were extremely deteriorated. For view of system as originally installed, see 1946 historic copy photograph WA-116-E-33. - Puget Sound Naval Shipyard, Drydock No. 3, Farragut Avenue, Bremerton, Kitsap County, WA
Geometric Theory of Moving Grid Wavefront Sensor
1977-06-30
Keywords: Adaptive Optics, Wavefront Sensor, Geometric Optics Analysis, Moving Ronchi Grid ... A geometric optics analysis is made for a wavefront sensor that uses a moving Ronchi grid. It is shown that by simple data... optical systems being considered or being developed for imaging an object through a turbulent atmosphere. Some of these use a wavefront sensor to
7 CFR 58.425 - Conveyor for moving and draining block or barrel cheese.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Conveyor for moving and draining block or barrel cheese. 58.425 Section 58.425 Agriculture Regulations of the Department of Agriculture (Continued... cheese. The conveyor shall be constructed so that it will not contaminate the cheese and be easily...
7 CFR 58.425 - Conveyor for moving and draining block or barrel cheese.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Conveyor for moving and draining block or barrel cheese. 58.425 Section 58.425 Agriculture Regulations of the Department of Agriculture (Continued... cheese. The conveyor shall be constructed so that it will not contaminate the cheese and be easily...
Moving up the Block: Learning To Think Like a Peer.
ERIC Educational Resources Information Center
Bokser, Julie A.
For one educator, an assistant professor of English with a specialization in writing, the short but dramatic move "up the block" from the University of Illinois at Chicago (UIC) to DePaul University eight miles north occasioned an adjustment to a radically different institutional personality and student body, despite similar street…
Passive forensics for copy-move image forgery using a method based on DCT and SVD.
Zhao, Jie; Guo, Jichang
2013-12-10
As powerful image editing tools are widely used, the demand for identifying the authenticity of an image has much increased. Copy-move forgery is one of the most frequently used tampering techniques. Most existing techniques to expose this forgery need improved robustness to common post-processing operations, and fail to precisely locate the tampered region, especially when there are large similar or flat regions in the image. In this paper, a robust method based on DCT and SVD is proposed to detect this specific artifact. Firstly, the suspicious image is divided into fixed-size overlapping blocks and 2D-DCT is applied to each block; the DCT coefficients are then quantized by a quantization matrix to obtain a more robust representation of each block. Secondly, each quantized block is divided into non-overlapping sub-blocks and SVD is applied to each sub-block; features are then extracted to reduce the dimension of each block using its largest singular value. Finally, the feature vectors are lexicographically sorted, and duplicated image blocks are matched by a predefined shift-frequency threshold. Experimental results demonstrate that the proposed method can effectively detect multiple copy-move forgeries and precisely locate the duplicated regions, even when an image has been distorted by Gaussian blurring, AWGN, JPEG compression, or their mixed operations. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.
Liang, Faming; Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis; that is, it replaces the full-data log-likelihood by a Monte Carlo average of the log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively.
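The core idea, replacing the full-data log-likelihood inside a Metropolis-Hastings loop with a rescaled Monte Carlo average over bootstrap subsamples, can be sketched for a toy normal-mean model. The function name, subsample sizes, and sequential evaluation are illustrative assumptions; the paper evaluates the bootstrap log-likelihoods in parallel.

```python
import numpy as np

def bmh_sample(data, n_boot=20, subsample=200, n_iter=2000, step=0.05, rng=None):
    """Toy bootstrap Metropolis-Hastings sampler for the mean of a N(mu, 1) model."""
    rng = np.random.default_rng(rng)
    data = np.asarray(data, dtype=float)
    boots = [rng.choice(data, size=subsample, replace=True)
             for _ in range(n_boot)]
    scale = len(data) / subsample  # rescale subsample log-likelihood to full-data size

    def avg_loglik(mu):
        # Monte Carlo average of the (unnormalized) log-likelihoods over
        # the bootstrap subsamples, standing in for the full-data scan.
        return scale * np.mean([-0.5 * np.sum((b - mu) ** 2) for b in boots])

    mu = data.mean()  # start the chain at the sample mean
    ll = avg_loglik(mu)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = mu + step * rng.normal()
        ll_prop = avg_loglik(prop)
        # Standard Metropolis accept/reject, but on the surrogate likelihood.
        if np.log(rng.uniform()) < ll_prop - ll:
            mu, ll = prop, ll_prop
        chain[i] = mu
    return chain
```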
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
A Bootstrap Approach to an Affordable Exploration Program
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that invests in technologies able to build a space infrastructure from very modest initial capabilities. Human exploration has a history of flight programs with high development and operational costs. Since Apollo, human exploration budgets have been very constrained, and they are expected to be constrained in the future. Due to their high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics costs. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing dependency on external logistics, and maximizing the utility of available resources. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows, becomes self-sustaining, and eventually begins producing the energy, products, and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and minimize the need for imported resources. They will provide the widest range of utility in a resource-scarce environment and pave the way to an affordable exploration program.
Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters.
Chung, SungWon; Lu, Ying; Henry, Roland G
2006-11-01
Bootstrap is an empirical non-parametric statistical technique based on data resampling that has been used to quantify uncertainties of diffusion tensor MRI (DTI) parameters, useful in tractography and in assessing DTI methods. The current bootstrap method (repetition bootstrap) used for DTI analysis performs resampling within the data sharing common diffusion gradients, requiring multiple acquisitions for each diffusion gradient. Recently, wild bootstrap was proposed that can be applied without multiple acquisitions. In this paper, two new approaches are introduced called residual bootstrap and repetition bootknife. We show that repetition bootknife corrects for the large bias present in the repetition bootstrap method and, therefore, better estimates the standard errors. Like wild bootstrap, residual bootstrap is applicable to single acquisition scheme, and both are based on regression residuals (called model-based resampling). Residual bootstrap is based on the assumption that non-constant variance of measured diffusion-attenuated signals can be modeled, which is actually the assumption behind the widely used weighted least squares solution of diffusion tensor. The performances of these bootstrap approaches were compared in terms of bias, variance, and overall error of bootstrap-estimated standard error by Monte Carlo simulation. We demonstrate that residual bootstrap has smaller biases and overall errors, which enables estimation of uncertainties with higher accuracy. Understanding the properties of these bootstrap procedures will help us to choose the optimal approach for estimating uncertainties that can benefit hypothesis testing based on DTI parameters, probabilistic fiber tracking, and optimizing DTI methods.
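Model-based resampling of regression residuals, the mechanism behind residual bootstrap, can be sketched for plain OLS. This is a simplified homoscedastic version, not the weighted tensor fit used for DTI, and the function name is illustrative.

```python
import numpy as np

def residual_bootstrap_se(X, y, n_boot=500, rng=None):
    """Residual-bootstrap standard errors for OLS coefficients.

    Residuals from the initial fit are resampled with replacement and
    added back to the fitted values to form synthetic responses, which
    are then refit to build the bootstrap distribution of coefficients.
    """
    rng = np.random.default_rng(rng)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    resid = y - fitted
    boot_betas = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        y_star = fitted + rng.choice(resid, size=len(y), replace=True)
        boot_betas[b], *_ = np.linalg.lstsq(X, y_star, rcond=None)
    return boot_betas.std(axis=0, ddof=1)
```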
Infants Use Social Context to Bind Actions into a Collaborative Sequence
ERIC Educational Resources Information Center
Fawcett, Christine; Gredebäck, Gustaf
2013-01-01
Eye tracking was used to show that 18-month-old infants are sensitive to social context as a sign that others' actions are bound together as a collaborative sequence based on a joint goal. Infants observed five identical demonstrations in which Actor 1 moved a block to one location and Actor 2 moved the same block to a new location, creating…
ERIC Educational Resources Information Center
Wyeth, Peta; Purchase, Helen
2002-01-01
Electronic Blocks are a new programming environment designed specifically for children between three and eight years of age. As such, the design of the Electronic Block environment is firmly based on principles of developmentally appropriate practices in early childhood education. Electronic Blocks are the physical embodiment of computer…
Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng
2013-12-01
In this paper, bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH(4+)-N) and dissolved oxygen (DO) in Harbin region, northeast of China. The Morlet wavelet basis function (WBF) was employed as a nonlinear activation function of traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PI) were constructed according to the calculated uncertainties from the model structure and data noise. Performance of BWNN model was also compared with four different models: traditional ANN, WNN, bootstrapped ANN, and autoregressive integrated moving average model. The results showed that BWNN could handle the severely fluctuating and non-seasonal time series data of water quality, and it produced better performance than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH(4+)-N; conversely, the uncertainty from data noise was larger for DO series. Besides, total uncertainties in the low-flow period were the biggest due to complicated processes during the freeze-up period of the Songhua River. Further, a data missing-refilling scheme was designed, and better performances of BWNNs for structural data missing (SD) were observed than incidental data missing (ID). For both ID and SD, temporal method was satisfactory for filling NH(4+)-N series, whereas spatial imputation was fit for DO series. This filling BWNN forecasting method was applied to other areas suffering "real" data missing, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers to obtain informed decisions.
From conformal blocks to path integrals in the Vaidya geometry
NASA Astrophysics Data System (ADS)
Anous, Tarek; Hartman, Thomas; Rovai, Antonin; Sonner, Julian
2017-09-01
Correlators in conformal field theory are naturally organized as a sum over conformal blocks. In holographic theories, this sum must reorganize into a path integral over bulk fields and geometries. We explore how these two sums are related in the case of a point particle moving in the background of a 3d collapsing black hole. The conformal block expansion is recast as a sum over paths of the first-quantized particle moving in the bulk geometry. Off-shell worldlines of the particle correspond to subdominant contributions in the Euclidean conformal block expansion, but these same operators must be included in order to correctly reproduce complex saddles in the Lorentzian theory. During thermalization, a complex saddle dominates under certain circumstances; in this case, the CFT correlator is not given by the Virasoro identity block in any channel, but can be recovered by summing heavy operators. This effectively converts the conformal block expansion in CFT from a sum over intermediate states to a sum over channels that mimics the bulk path integral.
Bootstrapping on Undirected Binary Networks Via Statistical Mechanics
NASA Astrophysics Data System (ADS)
Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice
2014-09-01
We propose a new method inspired from statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relative low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.
Jiang, Wenyu; Simon, Richard
2007-12-20
This paper first provides a critical review on some existing methods for estimating the prediction error in classifying microarray data where the number of genes greatly exceeds the number of specimens. Special attention is given to the bootstrap-related methods. When the sample size n is small, we find that all the reviewed methods suffer from either substantial bias or variability. We introduce a repeated leave-one-out bootstrap (RLOOB) method that predicts for each specimen in the sample using bootstrap learning sets of size ln. We then propose an adjusted bootstrap (ABS) method that fits a learning curve to the RLOOB estimates calculated with different bootstrap learning set sizes. The ABS method is robust across the situations we investigate and provides a slightly conservative estimate for the prediction error. Even with small samples, it does not suffer from large upward bias as the leave-one-out bootstrap and the 0.632+ bootstrap, and it does not suffer from large variability as the leave-one-out cross-validation in microarray applications. Copyright (c) 2007 John Wiley & Sons, Ltd.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
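The subspace trick can be sketched as follows: SVD the centered data once, then resample only the n-dimensional coordinates, forming p-dimensional components only on demand. This is an illustrative reconstruction of the idea, not the authors' implementation; the function name and return convention are assumptions.

```python
import numpy as np

def fast_bootstrap_pca_components(X, n_boot=100, n_comp=2, rng=None):
    """Bootstrap PCA components computed in the n-dimensional subspace.

    X is p x n (measurements x subjects), assumed column-centered. Every
    bootstrap sample lies in the span of the original n columns, so we SVD
    once and resample only the n x n coordinate matrix D V^T.
    """
    rng = np.random.default_rng(rng)
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    coords = d[:, None] * Vt  # n x n coordinates of the data in basis U
    n = X.shape[1]
    comps = np.empty((n_boot, n_comp, n))  # low-dimensional components only
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # bootstrap the subjects
        sample = coords[:, idx]
        sample = sample - sample.mean(axis=1, keepdims=True)
        A, _, _ = np.linalg.svd(sample, full_matrices=False)
        comps[b] = A[:, :n_comp].T
    # The p-dimensional components for sample b are U @ comps[b].T,
    # materialized only if and when needed.
    return U, comps
```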
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts, instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation on real and simulated data sets, bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to increased time spent generating weight matrices via multinomial sampling. PMID:26125965
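The multinomial-weighting formulation can be sketched for Pearson's correlation, here in Python/NumPy rather than the paper's R; the function name is illustrative. All bootstrap replicates come from a few matrix multiplications on a weight matrix, with no resampled copies of the data.

```python
import numpy as np

def vectorized_bootstrap_corr(x, y, n_boot=1000, rng=None):
    """Vectorized non-parametric bootstrap of Pearson's correlation.

    Draw a B x n matrix of multinomial weights (each row sums to 1) and
    compute every bootstrap replicate of the sample moments at once.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
    mx, my = W @ x, W @ y                                 # first moments
    mxx, myy, mxy = W @ (x * x), W @ (y * y), W @ (x * y)  # second moments
    cov = mxy - mx * my
    sd = np.sqrt((mxx - mx ** 2) * (myy - my ** 2))
    return cov / sd
```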
A Bullet-Block Experiment That Explains the Chain Fountain
ERIC Educational Resources Information Center
Pantaleone, J.; Smith, R.
2018-01-01
It is common in science for two phenomena to appear to be very different, but in fact follow from the same basic principles. Here we consider such a case, the connection between the chain fountain and a bullet-block collision experiment. When an upward moving bullet strikes a wooden block resting on a horizontal table, the block will rise to a…
Gunasekara, Chathura; Zhang, Kui; Deng, Wenping; Brown, Laura
2018-01-01
Abstract Despite their important roles, the regulators for most metabolic pathways and biological processes remain elusive. Presently, the methods for identifying metabolic pathway and biological process regulators are intensively sought after. We developed a novel algorithm called triple-gene mutual interaction (TGMI) for identifying these regulators using high-throughput gene expression data. It first calculated the regulatory interactions among triple gene blocks (two pathway genes and one transcription factor (TF)), using conditional mutual information, and then identifies significantly interacted triple genes using a newly identified novel mutual interaction measure (MIM), which was substantiated to reflect strengths of regulatory interactions within each triple gene block. The TGMI calculated the MIM for each triple gene block and then examined its statistical significance using bootstrap. Finally, the frequencies of all TFs present in all significantly interacted triple gene blocks were calculated and ranked. We showed that the TFs with higher frequencies were usually genuine pathway regulators upon evaluating multiple pathways in plants, animals and yeast. Comparison of TGMI with several other algorithms demonstrated its higher accuracy. Therefore, TGMI will be a valuable tool that can help biologists to identify regulators of metabolic pathways and biological processes from the exploded high-throughput gene expression data in public repositories. PMID:29579312
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interests were nonnormal Likert-type and binary items.…
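Of the interval types compared, the percentile bootstrap is the simplest to sketch. The example below is generic (an arbitrary statistic, not coefficient omega specifically) and the function name is an illustrative assumption.

```python
import numpy as np

def percentile_bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for an arbitrary statistic.

    The CI endpoints are the alpha/2 and 1 - alpha/2 quantiles of the
    bootstrap distribution of the statistic.
    """
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    reps = np.array([stat(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

The BCa interval adjusts these quantiles with bias and acceleration corrections, which is what drives the performance differences the study reports under nonnormality.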
Tests of Independence for Ordinal Data Using Bootstrap.
ERIC Educational Resources Information Center
Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai
1998-01-01
Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)
29 CFR 1926.1501 - Cranes and derricks.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., chains, or other reciprocating, rotating, or other moving parts or equipment shall be guarded if such... more than one hoisting unit, each hoist shall have its rated load marked on it or its load block, and... contact between the load block or overhaul ball and the boom tip (anti-two-blocking device), or a system...
Thin sheet casting with electromagnetic pressurization
Walk, Steven R.; Slepian, R. Michael; Nathenson, Richard D.; Williams, Robert S.
1991-01-01
An apparatus, method and system for the casting of thin sheets or strips of metal upon a moving chill block that includes an electromagnet located so that molten metal poured from a reservoir onto the chill block passes into the magnetic field produced by the electromagnet. The electromagnet produces a force on the molten metal on said chill block in the direction toward said chill block in order to enhance thermal contact between the molten metal and the chill block.
ERIC Educational Resources Information Center
Fan, Xitao
This paper empirically and systematically assessed the performance of bootstrap resampling procedure as it was applied to a regression model. Parameter estimates from Monte Carlo experiments (repeated sampling from population) and bootstrap experiments (repeated resampling from one original bootstrap sample) were generated and compared. Sample…
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
As result replicability is essential to science and difficult to achieve through external replicability, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
ERIC Educational Resources Information Center
Nevitt, Jonathan; Hancock, Gregory R.
2001-01-01
Evaluated the bootstrap method under varying conditions of nonnormality, sample size, model specification, and number of bootstrap samples drawn from the resampling space. Results for the bootstrap suggest the resampling-based method may be conservative in its control over model rejections, thus having an impact on the statistical power associated…
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
Effects of magnetic islands on bootstrap current in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, G.; Lin, Z.
2016-12-19
The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change of the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, a small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Lastly, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
System and method for 100% moisture and basis weight measurement of moving paper
Hernandez, Jose E.; Koo, Jackson C.
2002-01-01
A system for characterizing a set of properties for a moving substance is disclosed. The system includes: a first near-infrared linear array; a second near-infrared linear array; a first filter transparent to a first absorption wavelength emitted by the moving substance and juxtaposed between the substance and the first array; a second filter blocking the first absorption wavelength emitted by the moving substance and juxtaposed between the substance and the second array; and a computational device for characterizing data from the arrays into information on a property of the substance. The method includes the steps of: filtering out a first absorption wavelength emitted by a substance; monitoring the first absorption wavelength with a first near-infrared linear array; blocking the first wavelength from reaching a second near-infrared linear array; and characterizing data from the arrays into information on a property of the substance.
NASA Astrophysics Data System (ADS)
Bhardwaj, Alok; Ziegler, Alan D.; Wasson, Robert J.; Chow, Winston; Sharma, Mukat L.
2017-04-01
Extreme monsoon rainfall is the primary cause of floods and of secondary hazards such as landslides in the Indian Himalaya. Understanding extreme monsoon rainfall is therefore required to study these natural hazards. In this work, we study the characteristics of extreme monsoon rainfall, including its intensity and frequency, in the Garhwal Himalaya in India, with a focus on the Mandakini River Catchment, the site of a devastating flood and multiple large landslides in 2013. We use two long-term gridded rainfall data sets: the Asian Precipitation Highly Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE) product, with daily rainfall data from 1951 to 2007, and the India Meteorological Department (IMD) product, with daily rainfall data from 1901 to 2013. The Mann-Kendall test and Sen's slope estimator are used to identify the statistical significance and magnitude, respectively, of trends in the intensity and frequency of extreme monsoon rainfall, at a significance level of 0.05. Autocorrelation in the extreme-rainfall time series is identified and reduced using four methods: pre-whitening, trend-free pre-whitening, variance correction, and the block bootstrap. We define the extreme-rainfall threshold as the 99th percentile of the rainfall time series; any rainfall depth greater than this threshold is considered extreme. With the IMD data set, significant increasing trends in the intensity and frequency of extreme rainfall, with slope magnitudes of 0.55 and 0.02 respectively, were obtained in the north of the Mandakini Catchment by all four methods. A significant increasing trend in intensity, with a slope magnitude of 0.3, is found in the middle of the catchment by all methods except the block bootstrap. In the south of the catchment, significant increasing trends in intensity were obtained, with slope magnitudes of 0.86 for pre-whitening and 0.28 for trend-free pre-whitening and variance correction. An increasing trend in frequency, with a slope magnitude of 0.01, was also identified in the south of the catchment by all methods except the block bootstrap. With the APHRODITE data set, we obtained a significant increasing trend in intensity, with a slope magnitude of 1.27, in the middle of the catchment by all four methods. Collectively, both data sets show signals of increasing intensity, and the IMD data set shows increasing frequency, in the Mandakini Catchment. The increasing occurrence of extreme events is becoming more disastrous because of the rising human population and infrastructure in the Mandakini Catchment; the 2013 flood due to extreme rainfall, for example, was catastrophic in terms of the loss of human and animal lives and the destruction of the local economy. We believe our results will help improve understanding of extreme rainfall events in the Mandakini Catchment and in the Indian Himalaya.
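The block-bootstrap step used in trend testing of autocorrelated series can be sketched as follows (a simplified pure-Python illustration; the OLS slope stands in for Sen's estimator, and all names and data are ours):

```python
import math
import random

def ols_slope(y):
    """Ordinary least-squares slope of y against its time index."""
    n = len(y)
    tm, ym = (n - 1) / 2, sum(y) / n
    num = sum((t - tm) * (v - ym) for t, v in enumerate(y))
    den = sum((t - tm) ** 2 for t in range(n))
    return num / den

def mbb_trend_pvalue(y, block_len=5, n_boot=1000, seed=1):
    """Two-sided moving-block-bootstrap p-value for a linear trend.
    Resampling overlapping blocks preserves short-range autocorrelation
    but destroys any long-range trend, giving a null slope distribution."""
    rng = random.Random(seed)
    n, obs = len(y), ols_slope(y)
    starts = n - block_len + 1
    hits = 0
    for _ in range(n_boot):
        resample = []
        while len(resample) < n:
            s = rng.randrange(starts)
            resample.extend(y[s:s + block_len])
        if abs(ols_slope(resample[:n])) >= abs(obs):
            hits += 1
    return hits / n_boot

# A series with a correlated component plus a genuine trend:
series = [0.1 * i + math.sin(i) for i in range(80)]
print(mbb_trend_pvalue(series))
```

Because blocks, not individual points, are resampled, the null distribution reflects the serial dependence of the data, which is why the block bootstrap often gives more conservative trend detections than pre-whitening approaches.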
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
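The paper's key claim, that high-quantile bootstrap replicates depend only on the largest sample entries, can be checked numerically with a small sketch (nearest-rank quantile; all names are ours):

```python
import math
import random

def boot_quantile_reps(sample, q=0.99, n_boot=300, seed=2):
    """Bootstrap replicates of the q-quantile (nearest-rank definition)."""
    rng = random.Random(seed)
    n = len(sample)
    k = math.ceil(q * n) - 1          # 0-based index of the q-quantile
    return [sorted(rng.choice(sample) for _ in range(n))[k]
            for _ in range(n_boot)]

rng = random.Random(0)
sample = [rng.expovariate(1.0) for _ in range(1000)]
reps = boot_quantile_reps(sample)

# Each replicate's 99th-percentile estimate comes from the top 5% of the
# original values, so bootstrapping from that small subset alone would
# reproduce the same distribution at a fraction of the cost.
top_5pct = set(sorted(sample)[-50:])
print(all(r in top_5pct for r in reps))
```

This is the observation the authors exploit: for tail statistics, the resampling can be restricted to the highest entries, which matters when the "sample" is a decades-long gridded model integration.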
Wanetick, S.
1962-03-01
A transducer is described that measures the change in velocity of a moving object. The transducer includes a radioactive source having a collimated beam of radioactive particles, a shield which can block the passage of the radioactive beam, and a scintillation detector to measure the number of radioactive particles in the beam which are not blocked by the shield. The shield is operatively placed across the radioactive beam so that any motion normal to the beam will cause the shield to move in the opposite direction, thereby allowing more radioactive particles to reach the detector. The number of particles detected indicates the acceleration. (AEC)
2009-03-01
the MV–kV correlation method by sinusoidally moving a block of solid water (measuring 5 × 5 × 10 cm3) containing three embedded BB metallic markers (3 mm ... in diameter). A 4D motion platform (Washington University, St Louis, MO) holding the block of solid water (figure 1) was programmed to produce the ... (Varian Medical Systems, Palo Alto, CA). The SS-IMRT plans were delivered to either the moving pelvic phantom or the cube of solid water attached to the
Norris, David C; Wilson, Andrew
2016-01-01
In a 2014 report on adolescent mental health outcomes in the Moving to Opportunity for Fair Housing Demonstration (MTO), Kessler et al. reported that, at 10- to 15-year follow-up, boys from households randomized to an experimental housing voucher intervention experienced 12-month prevalence of post-traumatic stress disorder (PTSD) at several times the rate of boys from control households. We reanalyze this finding here, bringing to light a PTSD outcome imputation procedure used in the original analysis, but not described in the study report. By bootstrapping with repeated draws from the frequentist sampling distribution of the imputation model used by Kessler et al., and by varying two pseudorandom number generator seeds that fed their analysis, we account for several purely statistical components of the uncertainty inherent in their imputation procedure. We also discuss other sources of uncertainty in this procedure that were not accessible to a formal reanalysis.
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
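Hesterberg's small-sample caution can be reproduced with a quick coverage simulation (our own sketch: samples of n = 10 from a normal population, a 95% t-interval versus a 95% percentile-bootstrap interval):

```python
import math
import random
import statistics

def t_interval(x, tcrit=2.262):
    """95% t-interval for the mean; 2.262 is t_0.975 with 9 df (n = 10)."""
    m, se = statistics.mean(x), statistics.stdev(x) / math.sqrt(len(x))
    return m - tcrit * se, m + tcrit * se

def pct_interval(x, rng, n_boot=500):
    """95% percentile-bootstrap interval for the mean."""
    n = len(x)
    boots = sorted(statistics.mean([x[rng.randrange(n)] for _ in range(n)])
                   for _ in range(n_boot))
    return boots[12], boots[486]      # ~2.5% and 97.5% empirical quantiles

rng = random.Random(3)
trials, cov_t, cov_p = 400, 0, 0
for _ in range(trials):
    x = [rng.gauss(0.0, 1.0) for _ in range(10)]   # true mean is 0
    lo, hi = t_interval(x)
    cov_t += lo < 0.0 < hi
    lo, hi = pct_interval(x, rng)
    cov_p += lo < 0.0 < hi
print(f"t coverage: {cov_t / trials:.3f}, percentile coverage: {cov_p / trials:.3f}")
```

With samples this small, the t-interval typically covers the true mean at close to the nominal 95% rate while the percentile interval tends to undercover, which is the article's point; for larger samples the comparison reverses for skewed populations.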
Strategies for Teaching in a Block-of-Time Schedule.
ERIC Educational Resources Information Center
Hackmann, Donald G.; Schmitt, Donna M.
1997-01-01
Offers suggestions for developing creative instructional approaches in time-blocked classes. Teachers should continuously engage students in active learning, include group activities to encourage student participation, incorporate activities addressing multiple intelligences, use creative thinking activities, move outside the classroom, employ…
NASA Astrophysics Data System (ADS)
Brenning, A.; Schwinn, M.; Ruiz-Páez, A. P.; Muenchow, J.
2014-03-01
Mountain roads in developing countries are known to increase landslide occurrence due to often inadequate drainage systems and mechanical destabilization of hillslopes by undercutting and overloading. This study empirically investigates landslide initiation frequency along two paved interurban highways in the tropical Andes of southern Ecuador across different climatic regimes. Generalized additive models (GAM) and generalized linear models (GLM) were used to analyze the relationship between mapped landslide initiation points and distance to highway while accounting for topographic, climatic and geological predictors as possible confounders. A spatial block bootstrap was used to obtain non-parametric confidence intervals for the odds ratio of landslide occurrence near the highways (25 m distance) compared to a 200 m distance. The estimated odds ratio was 18-21 with lower 95% confidence bounds > 13 in all analyses. Spatial bootstrap estimation using the GAM supports the higher odds ratio estimate of 21.2 (95% confidence interval: 15.5-25.3). The highway-related effects were observed to fade at about 150 m distance. Road effects appear to be enhanced in geological units characterized by Holocene gravels and Laramide andesite/basalt. Overall, landslide susceptibility was found to be more than one order of magnitude higher in close proximity to paved interurban highways in the Andes of southern Ecuador.
NASA Astrophysics Data System (ADS)
Brenning, A.; Schwinn, M.; Ruiz-Páez, A. P.; Muenchow, J.
2015-01-01
Mountain roads in developing countries are known to increase landslide occurrence due to often inadequate drainage systems and mechanical destabilization of hillslopes by undercutting and overloading. This study empirically investigates landslide initiation frequency along two paved interurban highways in the tropical Andes of southern Ecuador across different climatic regimes. Generalized additive models (GAM) and generalized linear models (GLM) were used to analyze the relationship between mapped landslide initiation points and distance to highway while accounting for topographic, climatic, and geological predictors as possible confounders. A spatial block bootstrap was used to obtain nonparametric confidence intervals for the odds ratio of landslide occurrence near the highways (25 m distance) compared to a 200 m distance. The estimated odds ratio was 18-21, with lower 95% confidence bounds >13 in all analyses. Spatial bootstrap estimation using the GAM supports the higher odds ratio estimate of 21.2 (95% confidence interval: 15.5-25.3). The highway-related effects were observed to fade at about 150 m distance. Road effects appear to be enhanced in geological units characterized by Holocene gravels and Laramide andesite/basalt. Overall, landslide susceptibility was found to be more than 1 order of magnitude higher in close proximity to paved interurban highways in the Andes of southern Ecuador.
"What Do I Teach for 90 Minutes?" Creating a Successful Block-Scheduled English Classroom.
ERIC Educational Resources Information Center
Porter, Carol
The story of how Mundelein High School (located in a northwest suburb of Chicago, Illinois) moved from a traditional schedule to a block schedule is told throughout this book as a way to blend theory with practice. The book addresses types of block schedules; key issues for effective preparation; professional development…
Reduced ion bootstrap current drive on NTM instability
NASA Astrophysics Data System (ADS)
Qu, Hongpeng; Wang, Feng; Wang, Aike; Peng, Xiaodong; Li, Jiquan
2018-05-01
The loss of bootstrap current inside a magnetic island plays a dominant role in driving the neoclassical tearing mode (NTM) instability in tokamak plasmas. In this work, we investigate the finite-banana-width (FBW) effect on the profile of the ion bootstrap current in the island vicinity via an analytical approach. The results show that even if the pressure gradient vanishes inside the island, the ion bootstrap current can partly survive due to the FBW effect. The efficiency of the FBW effect is higher when the island width becomes smaller. Nevertheless, even when the island width is comparable to the ion banana width, the unperturbed ion bootstrap current inside the island cannot be largely recovered by the FBW effect, and thus the current loss still exists. This suggests that the FBW effect alone cannot dramatically reduce the ion bootstrap current drive on NTMs.
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
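The probability integral transform idea behind the PIT-trap can be illustrated for Poisson responses (a stdlib-only sketch of the general idea, not the authors' implementation; randomized PIT residuals are used because the data are discrete, and all names are ours):

```python
import math
import random

def pois_cdf(k, lam):
    """P(Y <= k) for Y ~ Poisson(lam), by direct summation."""
    if k < 0:
        return 0.0
    term = total = math.exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def pit_residual(y, lam, rng):
    """Randomized PIT residual, uniform on [F(y-1), F(y)).  Under the
    fitted model these are iid Uniform(0,1), i.e. pivotal, so they can be
    resampled across observations that have unequal means."""
    lo, hi = pois_cdf(y - 1, lam), pois_cdf(y, lam)
    return lo + rng.random() * (hi - lo)

def pois_inv(u, lam):
    """Smallest k with F(k) >= u: maps a (resampled) residual back to a
    count on each observation's own scale."""
    k = 0
    term = total = math.exp(-lam)
    while total < u and k < 10000:
        k += 1
        term *= lam / k
        total += term
    return k

rng = random.Random(4)
lams = [0.5 + 4.5 * i / 1999 for i in range(2000)]    # heterogeneous means
ys = [pois_inv(rng.random(), lam) for lam in lams]    # simulated counts
resids = [pit_residual(y, lam, rng) for y, lam in zip(ys, lams)]
mean_resid = sum(resids) / len(resids)
# The resampling step: draw residuals with replacement, then invert each
# one through that observation's own distribution to get a bootstrap set.
boot_ys = [pois_inv(resids[rng.randrange(len(resids))], lam) for lam in lams]
print(f"mean PIT residual: {mean_resid:.3f}")
```

The point of the transform is visible here: although the counts have very different means, their PIT residuals share one uniform distribution, so resampling them (and inverting back) respects each observation's marginal model.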
Bootstrap Percolation on Homogeneous Trees Has 2 Phase Transitions
NASA Astrophysics Data System (ADS)
Fontes, L. R. G.; Schonmann, R. H.
2008-09-01
We study the threshold θ bootstrap percolation model on the homogeneous tree with degree b+1, 2 ≤ θ ≤ b, and initial density p. It is known that there exists a nontrivial critical value for p, which we call p_f, such that a) for p > p_f, the final bootstrapped configuration is fully occupied for almost every initial configuration, and b) if p < p_f, then for almost every initial configuration, the final bootstrapped configuration has density of occupied vertices less than 1. In this paper, we establish the existence of a distinct critical value for p, p_c, such that 0 < p_c < p_f, with the following properties: 1) if p ≤ p_c, then for almost every initial configuration there is no infinite cluster of occupied vertices in the final bootstrapped configuration; 2) if p > p_c, then for almost every initial configuration there are infinite clusters of occupied vertices in the final bootstrapped configuration. Moreover, we show that 3) for p < p_c, the distribution of the occupied cluster size in the final bootstrapped configuration has an exponential tail; 4) at p = p_c, the expected occupied cluster size in the final bootstrapped configuration is infinite; 5) the probability of percolation of occupied vertices in the final bootstrapped configuration is continuous on [0, p_f] and analytic on (p_c, p_f), admitting an analytic continuation from the right at p_c and, only in the case θ = b, also from the left at p_f.
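The dynamics themselves are easy to simulate on a finite tree (a finite-depth sketch only; boundary effects mean this approximates the infinite homogeneous tree of the paper loosely, and the parameters are ours):

```python
import random

def bootstrap_percolation(b, theta, p, depth, seed=0):
    """Threshold-theta bootstrap percolation on a finite rooted b-ary tree
    (a finite-depth stand-in for the infinite homogeneous tree).  Each
    vertex starts occupied with probability p; thereafter any vacant
    vertex with at least theta occupied neighbours becomes occupied,
    repeatedly, until nothing changes.  Returns the final density."""
    rng = random.Random(seed)
    n = sum(b ** k for k in range(depth + 1))
    occ = [rng.random() < p for _ in range(n)]

    def neighbours(v):
        nb = [(v - 1) // b] if v > 0 else []           # parent
        first = v * b + 1                              # first child
        nb.extend(c for c in range(first, first + b) if c < n)
        return nb

    changed = True
    while changed:
        changed = False
        for v in range(n):
            if not occ[v] and sum(occ[u] for u in neighbours(v)) >= theta:
                occ[v] = changed = True
    return sum(occ) / n

d_low = bootstrap_percolation(b=2, theta=2, p=0.10, depth=10)
d_hi = bootstrap_percolation(b=2, theta=2, p=0.60, depth=10)
print(f"final density: p=0.10 -> {d_low:.3f}, p=0.60 -> {d_hi:.3f}")
```

Sweeping p in such a simulation shows the monotone growth of the final density; the paper's result is that on the infinite tree this growth passes through two distinct critical points, p_c and p_f.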
A new dry hypothesis for the formation of Martian linear gullies
Diniega, Serina; Hansen, Candice J.; McElwaine, Jim N.; Hugenholtz, C.H.; Dundas, Colin M.; McEwen, Alfred S.; Bourke, Mary C.
2013-01-01
Long, narrow grooves found on the slopes of martian sand dunes have been cited as evidence of liquid water via the hypothesis that melt-water initiated debris flows eroded channels and deposited lateral levées. However, this theory has several shortcomings for explaining the observed morphology and activity of these linear gullies. We present an alternative hypothesis that is consistent with the observed morphology, location, and current activity: that blocks of CO2 ice break from over-steepened cornices as sublimation processes destabilize the surface in the spring, and these blocks move downslope, carving out levéed grooves of relatively uniform width and forming terminal pits. To test this hypothesis, we describe experiments involving water and CO2 blocks on terrestrial dunes and then compare results with the martian features. Furthermore, we present a theoretical model of the initiation of block motion due to sublimation and use this to quantitatively compare the expected behavior of blocks on the Earth and Mars. The model demonstrates that CO2 blocks can be expected to move via our proposed mechanism on the Earth and Mars, and the experiments show that the motion of these blocks will naturally create the main morphological features of linear gullies seen on Mars.
NASA Astrophysics Data System (ADS)
Park, Sang-Gon; Jeong, Dong-Seok
2000-12-01
In this paper, we propose a fast adaptive diamond search algorithm (FADS) for block-matching motion estimation. Many fast motion estimation algorithms reduce computational complexity via the UESA (unimodal error surface assumption), under which the matching error monotonically increases as the search moves away from the global minimum point. Recently, many fast BMAs (block matching algorithms) make use of the fact that global minimum points in real-world video sequences are centered at the position of zero motion. But these BMAs, especially for large motion, are easily trapped in local minima, resulting in poor matching accuracy. So, we propose a new motion estimation algorithm using the spatial correlation among neighboring blocks. We move the search origin according to the motion vectors of the spatially neighboring blocks and their MAEs (mean absolute errors). Computer simulation shows that the proposed algorithm has almost the same computational complexity as DS (diamond search) but a higher PSNR. Moreover, the proposed algorithm gives almost the same PSNR as FS (full search), even for large motion, with half the computational load.
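The core idea, starting the search at a position predicted from neighbouring blocks rather than at zero motion, can be sketched as follows (a toy example with an exhaustive small-window search; block size, frame contents, and all names are ours, not the FADS algorithm itself):

```python
def mae(cur, ref, bx, by, dx, dy, bs=8):
    """Mean absolute error between the bs x bs block of `cur` at (bx, by)
    and the block of `ref` displaced by candidate motion vector (dx, dy)."""
    total = 0
    for y in range(bs):
        for x in range(bs):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total / (bs * bs)

def search(cur, ref, bx, by, origin, radius=2, bs=8):
    """Small-window search centred on a predicted origin, e.g. a vector
    derived from the motion vectors (and MAEs) of neighbouring blocks."""
    ox, oy = origin
    best_err, best_mv = None, None
    for dy in range(oy - radius, oy + radius + 1):
        for dx in range(ox - radius, ox + radius + 1):
            err = mae(cur, ref, bx, by, dx, dy, bs)
            if best_err is None or err < best_err:
                best_err, best_mv = err, (dx, dy)
    return best_mv

# A synthetic reference frame and a copy shifted by motion vector (3, 2):
ref = [[(7 * x + 13 * y) % 256 for x in range(32)] for y in range(32)]
cur = [[(7 * (x + 3) + 13 * (y + 2)) % 256 for x in range(32)] for y in range(32)]
print(search(cur, ref, 8, 8, origin=(2, 2)))
```

A search window of radius 2 around a good prediction covers far fewer candidates than a full search over a large-motion range, which is the source of the speed-up such algorithms report.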
How to push a block along a wall
NASA Technical Reports Server (NTRS)
Mason, Matthew T.
1989-01-01
Some robot tasks require manipulation of objects that may be touching other fixed objects. The effects of friction and kinematic constraint must be anticipated, and may even be exploited to accomplish the task. An example task, a dynamic analysis, and appropriate effector motions are presented. The goal is to move a rectangular block along a wall, so that one side of the block maintains contact with the wall. Two solutions that push the block along the wall are discussed.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance underestimation of conventional Cox-based standard errors. However, the two-step bootstrap method overestimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
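The cluster-bootstrap principle, resampling whole clusters so within-cluster correlation survives in each replicate, can be sketched generically (our own illustration for the SE of a simple mean, not the authors' Cox-model code; the data are simulated):

```python
import math
import random
import statistics

def cluster_bootstrap_se(clusters, n_boot=1000, seed=4):
    """Cluster-bootstrap SE of the overall mean: resample whole clusters
    with replacement, keeping each selected cluster intact, so that
    within-cluster correlation is preserved in every replicate."""
    rng = random.Random(seed)
    k = len(clusters)
    means = []
    for _ in range(n_boot):
        pooled = [x for _ in range(k) for x in clusters[rng.randrange(k)]]
        means.append(statistics.mean(pooled))
    return statistics.stdev(means)

# Clustered data: a shared cluster-level effect induces correlation that
# the iid formula ignores.
rng = random.Random(0)
clusters = []
for _ in range(30):
    u = rng.gauss(0.0, 1.0)                       # cluster random effect
    clusters.append([u + rng.gauss(0.0, 0.5) for _ in range(10)])
pooled = [x for c in clusters for x in c]
naive_se = statistics.stdev(pooled) / math.sqrt(len(pooled))
cb_se = cluster_bootstrap_se(clusters)
print(f"naive iid SE: {naive_se:.3f}, cluster-bootstrap SE: {cb_se:.3f}")
```

The cluster-bootstrap SE comes out substantially larger than the naive iid SE here, illustrating the variance underestimation that motivates the paper's corrections.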
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of the integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping differs from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative-area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
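The surveillance idea, expected values from past occurrences plus a resampling-based significance test, can be sketched minimally (an illustrative toy for a single local cell, not the authors' model; the weekly counts are invented):

```python
import random

def surveillance_pvalue(history, current, n_boot=2000, seed=5):
    """One-sided bootstrap p-value that the current period's count in a
    local cell is unusually high, using only that cell's past occurrence
    counts (no population-at-risk data): resample past periods with
    replacement and see how often a resampled count reaches `current`."""
    rng = random.Random(seed)
    hits = sum(history[rng.randrange(len(history))] >= current
               for _ in range(n_boot))
    return (hits + 1) / (n_boot + 1)

weekly_counts = [3, 5, 4, 2, 6, 3, 4, 5, 2, 4, 3, 5]   # hypothetical history
print(surveillance_pvalue(weekly_counts, 12))   # candidate emerging hotspot
print(surveillance_pvalue(weekly_counts, 4))    # unremarkable week
```

Because the reference distribution is built from the cell's own past, the method needs no population denominator and can be rerun each time the registry database updates, the two practical advantages the abstract emphasizes.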
NASA Astrophysics Data System (ADS)
Elliott, J.; Freymueller, J. T.; Larsen, C. F.; Motyka, R. J.
2010-12-01
GPS data from southern Alaska and the northern Canadian Cordillera have helped redefine the region’s tectonic landscape. Instead of a comparatively simple interaction between the Pacific and North American plates, with relative motion accommodated on a single boundary fault, we find a margin made up of a number of small blocks and deformation zones with relative motion distributed across a variety of structures. Much of this complexity can be attributed to the Yakutat block, an allochthonous terrane that has been colliding with southern Alaska since the Miocene. We present a GPS-derived tectonic model for the Yakutat block collision and its effects on southern Alaska and eastern Canada. The Yakutat block moves NNW at a rate of 50 mm/a, resulting in ~ 45 mm/a of NW-directed convergence with southern Alaska. Along its eastern edge, the Yakutat block is deforming, represented in our model by two small northwesterly moving blocks outboard of the Fairweather fault. Part of the strain from the collision is transferred east of the Fairweather - Queen Charlotte fault system, causing the region inboard of the Fairweather fault to undergo a distinct clockwise rotation into the northern Canadian Cordillera. Further south, the region directly east of the Queen Charlotte fault displays a much slower clockwise rotation, suggesting that it is at least partially pulled along by the northern block motion. About 5% of the relative motion is transferred even further east, causing small northeasterly motions well into the northern Cordillera. The northwestern edge of the Yakutat block marks the main deformation front between that block and southern Alaska. Multiple narrow, northwesterly moving blocks bounded by N- to NW-dipping thrust faults are required to explain the GPS data between the Malaspina Glacier and the Bagley Ice Valley. 
These “blocks” may be more aptly termed crustal slivers or deformation zones due to their size and because their bounding faults may sole out into a main thrust instead of cutting through the lithosphere. In contrast with the region to the east, relative convergence is accommodated over a fairly short distance across the St. Elias Mountains. West of the deformation front, the en echelon blocks and faults continue until the vicinity of the Bering Glacier, where the GPS data reveal a rotation towards the north as the tectonic regime transitions from the collision and accretion of the Yakutat block to subduction along the Aleutian Megathrust. North of the Chugach and St. Elias Ranges, the Southern Alaska block rotates counterclockwise.
ERIC Educational Resources Information Center
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
NASA Astrophysics Data System (ADS)
Monticello, D. A.; Reiman, A. H.; Watanabe, K. Y.; Nakajima, N.; Okamoto, M.
1997-11-01
The existence of bootstrap currents in both tokamaks and stellarators was confirmed experimentally more than ten years ago. Such currents can have significant effects on the equilibrium and stability of these MHD devices. In addition, stellarators, with the notable exception of W7-X, are predicted to have such large bootstrap currents that reliable equilibrium calculations require the self-consistent evaluation of bootstrap currents. Modeling of discharges which contain islands requires an algorithm that does not assume good surfaces. Only one of the two 3-D equilibrium codes that exist, PIES (Reiman, A. H., Greenside, H. S., Comput. Phys. Commun. 43, 1986), can easily be modified to handle bootstrap current. Here we report on the coupling of the PIES 3-D equilibrium code and the NIFS bootstrap code (Watanabe, K., et al., Nuclear Fusion 35, 1995, 335).
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial from a relatively small sample. In this paper, sample size estimation to compare two parallel-design arms for continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculation by mathematical formulas (under the normal distribution assumption) for the same data is also carried out. The power difference between the two calculation methods is acceptably small for all the test types, which shows that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods based on data that violate the normal distribution assumption. To accommodate the feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that by the bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation at the outset, and that each bootstrap sample be analyzed with the same statistical method as will be used in the subsequent statistical analysis, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
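The bootstrap sample-size idea described above can be sketched in a few lines of standard-library Python. This is a hypothetical illustration, not the paper's code: it resamples pilot data at a candidate per-arm size and counts rejections of a two-sided normal-approximation z test (the paper's full set of test types and its Wilcoxon variant are omitted).

```python
import random
import statistics

def bootstrap_power(pilot_a, pilot_b, n_per_arm, alpha=0.05,
                    n_boot=2000, seed=1):
    """Estimate power at a candidate per-arm sample size by resampling
    pilot data and applying a two-sided normal-approximation z test."""
    rng = random.Random(seed)
    z_crit = statistics.NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(n_boot):
        a = [rng.choice(pilot_a) for _ in range(n_per_arm)]
        b = [rng.choice(pilot_b) for _ in range(n_per_arm)]
        se = (statistics.pvariance(a) / n_per_arm
              + statistics.pvariance(b) / n_per_arm) ** 0.5
        if se > 0 and abs(statistics.fmean(a) - statistics.fmean(b)) / se > z_crit:
            hits += 1
    return hits / n_boot
```

Scanning n_per_arm upward until the returned power first exceeds the target (e.g., 0.8) gives the bootstrap sample-size estimate.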
Application of the Bootstrap Methods in Factor Analysis.
ERIC Educational Resources Information Center
Ichikawa, Masanori; Konishi, Sadanori
1995-01-01
A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or violated. Problems arising with the use of bootstrap methods are highlighted. (SLD)
SOSlope: a new slope stability model for vegetated hillslopes
NASA Astrophysics Data System (ADS)
Cohen, D.; Schwarz, M.
2016-12-01
Roots increase soil strength, but the forces they mobilize depend on the relative displacement of the soil. This effect is not included in models of slope stability. Here we present a new numerical model of shallow landslides for vegetated hillslopes that uses a strain-step loading approach for force redistributions within a soil mass, including the effects of root strength in both tension and compression. The hillslope is discretized into a two-dimensional array of blocks connected by bonds. During a rainfall event the blocks' mass increases and the soil shear strength decreases. At each time step, we compute a factor of safety for each block. If the factor of safety of one or more blocks is less than one, those blocks are moved in the direction of the local active force by a predefined amount and the factor of safety is recalculated for all blocks. Because of the relative motion between blocks that have moved and those that remain stationary, the mechanical bond forces between blocks, which depend on relative displacement, change, modifying the force balance. This relative motion triggers instantaneous force redistributions across the entire hillslope, similar to a self-organized critical system. Looping over blocks and moving those that are unstable is repeated until all blocks are stable and the system reaches a new equilibrium, or some blocks have failed, causing a landslide. Spatial heterogeneity of vegetation is included by computing the root density and distribution as a function of distance from trees. A simple subsurface hydrological model based on dual-permeability concepts is used to compute the temporal evolution of water content, pore-water pressure, suction stress, and soil shear strength. Simulations for a conceptual slope indicate that forces mobilized in tension and compression both contribute to the stability of the slope.
However, the maximum tensional and compressional forces imparted by roots do not contribute simultaneously to the stability of the soil mass, in contrast to what is commonly assumed in models. Simulations with different tree sizes (different magnitude of root reinforcement) indicate that there is a threshold in tree spacing (or tree diameter) above (or below) which root density and root sizes no longer provide sufficient reinforcement to keep the slope stable during a rainfall event.
Multiple-block grid adaption for an airplane geometry
NASA Technical Reports Server (NTRS)
Abolhassani, Jamshid Samareh; Smith, Robert E.
1988-01-01
Grid-adaption methods are developed with the capability of moving grid points in accordance with several variables for a three-dimensional multiple-block grid system. These methods are algebraic, and they are implemented for the computation of high-speed flow over an airplane configuration.
NASA Technical Reports Server (NTRS)
White, III, Dorsey E. (Inventor); Updike, deceased, Benjamin T. (Inventor); Allred, Johnny W. (Inventor)
1989-01-01
A quick-actuating closure for a pressure vessel 80 in which a wedge ring 30 with a conical outer surface 31 is moved forward to force shear blocks 40, with conical inner surfaces 41, radially outward to lock an end closure plug 70 within an opening 81 in the pressure vessel 80. A seal ring 60 and a preload ramp 50 sit between the shear blocks 40 and the end closure plug 70 to provide a backup sealing capability. Conical surfaces 44 and 55 of the preload ramp 50 and the shear blocks 40 interact to force the seal ring 60 into shoulders 73 and 85 in the end closure plug 70 and opening 81 to form a tight seal. The end closure plug 70 is unlocked by moving the wedge ring 30 rearward, which causes T-bars 32 of the wedge ring 30 riding within T-slots 42 of the shear blocks 40 to force them radially inward. The end closure plug 70 is then removed, allowing access to the interior of the pressure vessel 80.
Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.
Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng
2015-01-01
Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
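For concreteness, the basic case-resampling percentile bootstrap of an indirect effect, the procedure whose small-sample behavior the article examines, can be sketched with only the standard library. All names are illustrative, and this is only the simple percentile variant, not the authors' simulation code.

```python
import random

def indirect_effect(x, m, y):
    """a*b: a from OLS of M on X; b is M's slope in OLS of Y on M and X."""
    n = len(x)
    cx = [v - sum(x) / n for v in x]
    cm = [v - sum(m) / n for v in m]
    cy = [v - sum(y) / n for v in y]
    sxx = sum(v * v for v in cx)
    smm = sum(v * v for v in cm)
    sxm = sum(p * q for p, q in zip(cx, cm))
    sxy = sum(p * q for p, q in zip(cx, cy))
    smy = sum(p * q for p, q in zip(cm, cy))
    a = sxm / sxx
    b = (sxx * smy - sxm * sxy) / (sxx * smm - sxm ** 2)
    return a * b

def percentile_ci(x, m, y, n_boot=2000, alpha=0.05, seed=7):
    """Case-resampling percentile bootstrap CI for the indirect effect."""
    rng = random.Random(seed)
    n = len(x)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(indirect_effect([x[i] for i in idx],
                                     [m[i] for i in idx],
                                     [y[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]
```

The indirect effect is deemed significant when the interval excludes zero; the article's point is that with 20-80 cases this decision rule is less trustworthy than commonly assumed.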
Bootstrap confidence levels for phylogenetic trees.
Efron, B; Halloran, E; Holmes, S
1996-07-09
Evolutionary trees are often estimated from DNA or RNA sequence data. How much confidence should we have in the estimated trees? In 1985, Felsenstein [Felsenstein, J. (1985) Evolution 39, 783-791] suggested the use of the bootstrap to answer this question. Felsenstein's method, which in concept is a straightforward application of the bootstrap, is widely used, but has been criticized as biased in the genetics literature. This paper concerns the use of the bootstrap in the tree problem. We show that Felsenstein's method is not biased, but that it can be corrected to better agree with standard ideas of confidence levels and hypothesis testing. These corrections can be made by using the more elaborate bootstrap method presented here, at the expense of considerably more computation.
NASA Technical Reports Server (NTRS)
Voellmer, George
1992-01-01
Compliant element for robot wrist accepts small displacements in one direction only (to first approximation). Three such elements combined to obtain translational compliance along three orthogonal directions, without rotational compliance along any of them. Element is double-blade flexure joint in which two sheets of spring steel attached between opposing blocks, forming rectangle. Blocks moved parallel to each other in one direction only. Sheets act as double cantilever beams deforming in S-shape, keeping blocks parallel.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
ERIC Educational Resources Information Center
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
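A percentile bootstrap CI for coefficient alpha, the simplest of the kinds of method the abstract alludes to, can be sketched as follows. This is our own standard-library illustration (function names are hypothetical), resampling respondents and recomputing alpha on each resample.

```python
import random
import statistics

def cronbach_alpha(rows):
    """Coefficient alpha; rows = respondents, each a list of k item scores."""
    k = len(rows[0])
    item_vars = [statistics.variance([r[j] for r in rows]) for j in range(k)]
    total_var = statistics.variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def alpha_percentile_ci(rows, n_boot=2000, level=0.95, seed=3):
    """Percentile bootstrap CI: resample respondents, recompute alpha."""
    rng = random.Random(seed)
    n = len(rows)
    boots = sorted(cronbach_alpha([rows[rng.randrange(n)] for _ in range(n)])
                   for _ in range(n_boot))
    cut = int((1 - level) / 2 * n_boot)
    return boots[cut], boots[n_boot - 1 - cut]
```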
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adds exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructing the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
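The construction described, MLE from a bootstrap sample combined with bin counts from the original data, can be illustrated for an exponential null hypothesis. This is a simplified sketch of our own (not the authors' code); under the null the returned statistic should approximately follow a chi-squared distribution with (number of bins - 1) degrees of freedom.

```python
import math
import random

def bootstrap_pearson_stat(data, n_bins=5, seed=11):
    """Pearson X^2 for an exponential null, with expected counts computed
    from the MLE of a bootstrap sample rather than the original-data MLE."""
    rng = random.Random(seed)
    n = len(data)
    boot = [data[rng.randrange(n)] for _ in range(n)]
    rate = n / sum(boot)  # exponential-rate MLE from the bootstrap sample
    # bin edges making the bins equiprobable under the fitted model
    edges = [-math.log(1 - j / n_bins) / rate for j in range(1, n_bins)]
    observed = [0] * n_bins
    for v in data:
        observed[sum(v > e for e in edges)] += 1
    expected = n / n_bins
    return sum((o - expected) ** 2 / expected for o in observed)
```

Averaged over many null data sets, the statistic's mean should sit near the chi-squared degrees of freedom (here 4), consistent with the recovered reference distribution.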
Collell, Guillem; Prelec, Drazen; Patil, Kaustubh R
2018-01-31
Class imbalance presents a major hurdle in the application of classification methods. A commonly taken approach is to learn ensembles of classifiers using rebalanced data. Examples include bootstrap averaging (bagging) combined with either undersampling or oversampling of the minority class examples. However, rebalancing methods entail asymmetric changes to the examples of different classes, which in turn can introduce their own biases. Furthermore, these methods often require specifying the performance measure of interest a priori, i.e., before learning. An alternative is to employ the threshold moving technique, which applies a threshold to the continuous output of a model, offering the possibility to adapt to a performance measure a posteriori, i.e., a plug-in method. Surprisingly, little attention has been paid to this combination of a bagging ensemble and threshold moving. In this paper, we study this combination and demonstrate its competitiveness. Contrary to the other resampling methods, we preserve the natural class distribution of the data, resulting in well-calibrated posterior probabilities. Additionally, we extend the proposed method to handle multiclass data. We validated our method on binary and multiclass benchmark data sets using both decision trees and neural networks as base classifiers. We perform analyses that provide insights into the proposed method.
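A toy version of the bagging-plus-threshold-moving combination can be sketched with a one-dimensional Gaussian base classifier. This is a hypothetical standard-library illustration: all names are ours, and the paper's experiments use decision trees and neural networks rather than this toy learner.

```python
import math
import random
import statistics

def fit_gauss(xs, ys):
    """Fit per-class 1-D Gaussians, keeping the natural class prior."""
    pos = [x for x, y in zip(xs, ys) if y == 1]
    neg = [x for x, y in zip(xs, ys) if y == 0]
    return (statistics.fmean(pos), statistics.stdev(pos),
            statistics.fmean(neg), statistics.stdev(neg),
            len(pos) / len(xs))

def posterior(model, x):
    """P(y = 1 | x) from the fitted class-conditional densities."""
    mp, sp, mn, sn, prior = model
    lp = math.exp(-((x - mp) / sp) ** 2 / 2) / sp
    ln = math.exp(-((x - mn) / sn) ** 2 / 2) / sn
    return prior * lp / (prior * lp + (1 - prior) * ln)

def bag_predict(xs, ys, n_models=25, seed=0):
    """Bagging: average posteriors of models fit on bootstrap resamples."""
    rng = random.Random(seed)
    n = len(xs)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(fit_gauss([xs[i] for i in idx], [ys[i] for i in idx]))
    return [statistics.fmean(posterior(m, x) for m in models) for x in xs]

def balanced_accuracy(probs, ys, thresh):
    tp = sum(p > thresh and y == 1 for p, y in zip(probs, ys))
    tn = sum(p <= thresh and y == 0 for p, y in zip(probs, ys))
    n_pos = sum(ys)
    return 0.5 * (tp / n_pos + tn / (len(ys) - n_pos))

def best_threshold(probs, ys):
    """Threshold moving: pick the cutoff maximizing balanced accuracy."""
    candidates = sorted(set(probs)) + [0.5]
    return max(candidates, key=lambda t: balanced_accuracy(probs, ys, t))
```

Because the bootstrap resamples keep the natural class distribution, the averaged posteriors stay roughly calibrated, and the chosen performance measure enters only through the a posteriori threshold choice.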
Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Besides, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with non-stationarity, on design rainfall depth in the Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and estimated at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity existing in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with a 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales.
The non-stationary analysis also shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
ERIC Educational Resources Information Center
Trundle, Kathy Cabe; Smith, Mandy McCormick
2011-01-01
Some of children's earliest explorations focus on movement of their own bodies. Quickly, children learn to further explore movement by using objects like a ball or car. They recognize that a ball moves differently than a pushed block. As they grow, children enjoy their experiences with motion and movement, including making objects move, changing…
Recycling Buildings for Libraries: A Moving Account.
ERIC Educational Resources Information Center
Shields, Gerald R.
1994-01-01
Described a project that moved a retired Carnegie library four blocks and back into service as an annex to the Mexico-Audrain Library System (Missouri). Insights are provided into the practicality of recycling buildings as public library facilities and the effects that such efforts can have on community pride and involvement. (SLW)
Bootstrap Estimates of Standard Errors in Generalizability Theory
ERIC Educational Resources Information Center
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?
ERIC Educational Resources Information Center
Thompson, Bruce
Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…
Unbiased Estimates of Variance Components with Bootstrap Procedures
ERIC Educational Resources Information Center
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
Explorations in Statistics: the Bootstrap
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
Bootstrapping Confidence Intervals for Robust Measures of Association.
ERIC Educational Resources Information Center
King, Jason E.
A Monte Carlo simulation study was conducted to determine the bootstrap correction formula yielding the most accurate confidence intervals for robust measures of association. Confidence intervals were generated via the percentile, adjusted, BC, and BC(a) bootstrap procedures and applied to the Winsorized, percentage bend, and Pearson correlation…
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
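A simplified residual-based bootstrap prediction interval around a box-kernel smoother illustrates the idea. This is our own sketch under stated simplifications (the smoother is not refit on each resample, so the smoother's own variance is ignored and the interval is slightly understated); all names are hypothetical.

```python
import random
import statistics

def smooth(xs, ys, x0, bandwidth):
    """Box-kernel (moving-average) nonparametric regression estimate at x0."""
    pts = [y for x, y in zip(xs, ys) if abs(x - x0) <= bandwidth]
    return statistics.fmean(pts)

def prediction_interval(xs, ys, x0, bandwidth=0.5,
                        n_boot=1000, alpha=0.05, seed=5):
    """Percentile bootstrap prediction interval at x0: smoother estimate
    plus resampled residuals."""
    rng = random.Random(seed)
    resid = [y - smooth(xs, ys, x, bandwidth) for x, y in zip(xs, ys)]
    center = smooth(xs, ys, x0, bandwidth)
    sims = sorted(center + rng.choice(resid) for _ in range(n_boot))
    return sims[int(alpha / 2 * n_boot)], sims[int((1 - alpha / 2) * n_boot) - 1]
```

An observation falling outside the returned interval at its input would be flagged as a candidate anomaly, mirroring the conditioning-on-input detection scheme described above.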
Schneider, Kevin; Koblmüller, Stephan; Sefc, Kristina M
2015-11-11
The homoplasy excess test (HET) is a tree-based screen for hybrid taxa in multilocus nuclear phylogenies. Homoplasy between a hybrid taxon and the clades containing the parental taxa reduces bootstrap support in the tree. The HET is based on the expectation that excluding the hybrid taxon from the data set increases the bootstrap support for the parental clades, whereas excluding non-hybrid taxa has little effect on statistical node support. To carry out a HET, bootstrap trees are calculated with taxon-jackknife data sets, that is, excluding one taxon (species, population) at a time. Excess increase in bootstrap support for certain nodes upon exclusion of a particular taxon indicates the hybrid (the excluded taxon) and its parents (the clades with increased support). We introduce a new software program, hext, which generates the taxon-jackknife data sets, runs the bootstrap tree calculations, and identifies excess bootstrap increases as outlier values in boxplot graphs. hext is written in the R language and accepts binary data (0/1; e.g. AFLP) as well as co-dominant SNP and genotype data. We demonstrate the usefulness of hext in large SNP data sets containing putative hybrids and their parents. For instance, using published data of the genus Vitis (~6,000 SNP loci), hext output supports V. × champinii as a hybrid between V. rupestris and V. mustangensis. With simulated SNP and AFLP data sets, excess increases in bootstrap support were not always connected with the hybrid taxon (false positives), whereas the expected bootstrap signal failed to appear on several occasions (false negatives). Potential causes for both types of spurious results are discussed. With both empirical and simulated data sets, the taxon-jackknife output generated by hext provided additional signatures of hybrid taxa, including changes in tree topology across trees, consistent effects of exclusions of the hybrid and the parent taxa, and moderate (rather than excessive) increases in bootstrap support.
hext significantly facilitates the taxon-jackknife approach to hybrid taxon detection, even though the simple test for excess bootstrap increase may not reliably identify hybrid taxa in all applications.
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision making process.
Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.
ERIC Educational Resources Information Center
Thompson, Bruce; Fan, Xitao
This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
NASA Astrophysics Data System (ADS)
Divine, D. V.; Granskog, M. A.; Hudson, S. R.; Pedersen, C. A.; Karlsen, T. I.; Divina, S. A.; Gerland, S.
2014-07-01
The paper presents a case study of the regional (≈ 150 km) broadband albedo of first-year Arctic sea ice in advanced stages of melt, estimated from a combination of in situ albedo measurements and aerial imagery. The data were collected during the eight-day ICE12 drift experiment carried out by the Norwegian Polar Institute in the Arctic north of Svalbard at 82.3° N from 26 July to 3 August 2012. The study uses in situ albedo measurements representative of the four main surface types: bare ice, dark melt ponds, bright melt ponds and open water. Images acquired by a helicopter-borne camera system during ice survey flights covered about 28 km2. A subset of > 8000 images from the area of homogeneous melt, with an open water fraction of ≈ 0.11 and melt pond coverage of ≈ 0.25, used in the upscaling yielded a regional albedo estimate of 0.40 (0.38; 0.42). The 95% confidence interval on the estimate was derived using the moving block bootstrap approach applied to sequences of classified sea ice images, with the albedo of the four surface types treated as random variables. Uncertainty in the mean estimates of surface type albedo from in situ measurements contributed some 95% of the variance of the estimated regional albedo, with the remaining variance resulting from the spatial inhomogeneity of the sea ice cover. The results of the study are of relevance for the modeling of sea ice processes in climate simulations. This particularly concerns the period of summer melt, when the optical properties of sea ice undergo substantial changes that existing sea ice models have significant difficulty reproducing accurately.
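The moving block bootstrap used here to propagate uncertainty through an autocorrelated sequence can be illustrated on a generic dependent series. This is a sketch of our own (the series stands in for the classified image sequence, and the block length is arbitrary): overlapping blocks are resampled and concatenated so that short-range dependence is preserved inside each block.

```python
import random
import statistics

def moving_block_bootstrap_ci(series, block_len, n_boot=2000,
                              level=0.95, seed=9):
    """Percentile CI for the mean of a dependent series: resample
    overlapping blocks, concatenate to the original length, recompute."""
    rng = random.Random(seed)
    n = len(series)
    starts = n - block_len + 1          # number of admissible block starts
    n_blocks = -(-n // block_len)       # ceil(n / block_len)
    means = []
    for _ in range(n_boot):
        sample = []
        for _ in range(n_blocks):
            s = rng.randrange(starts)
            sample.extend(series[s:s + block_len])
        means.append(statistics.fmean(sample[:n]))
    means.sort()
    cut = int((1 - level) / 2 * n_boot)
    return means[cut], means[n_boot - 1 - cut]
```

Compared with the ordinary i.i.d. bootstrap, the block version yields wider (and more honest) intervals when the series is positively autocorrelated.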
Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data
Bakun, W.H.; Gomez, Capera A.; Stucchi, M.
2011-01-01
Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. 
The instrumental magnitudes for large and small earthquakes are generally consistent with the confidence intervals inferred from the distribution of bootstrap resampled magnitudes.
Forks in the road: choices in procedures for designing wildland linkages.
Beier, Paul; Majka, Daniel R; Spencer, Wayne D
2008-08-01
Models are commonly used to identify lands that will best maintain the ability of wildlife to move between wildland blocks through matrix lands after the remaining matrix has become incompatible with wildlife movement. We offer a roadmap of 16 choices and assumptions that arise in designing linkages to facilitate movement or gene flow of focal species between 2 or more predefined wildland blocks. We recommend designing linkages to serve multiple (rather than one) focal species likely to serve as a collective umbrella for all native species and ecological processes, explicitly acknowledging untested assumptions, and using uncertainty analysis to illustrate potential effects of model uncertainty. Such uncertainty is best displayed to stakeholders as maps of modeled linkages under different assumptions. We also recommend modeling corridor dwellers (species that require more than one generation to move their genes between wildland blocks) differently from passage species (for which an individual can move between wildland blocks within a few weeks). We identify a problem, which we call the subjective translation problem, that arises because the analyst must subjectively decide how to translate measurements of resource selection into resistance. This problem can be overcome by estimating resistance from observations of animal movement, genetic distances, or interpatch movements. There is room for substantial improvement in the procedures used to design linkages robust to climate change and in tools that allow stakeholders to compare an optimal linkage design to alternative designs that minimize costs or achieve other conservation goals.
Control of bootstrap current in the pedestal region of tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaing, K. C.; Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796; Lai, A. L.
2013-12-15
The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field, and a sonic poloidal flow U_p,m that consists of the poloidal components of the E×B flow and the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current that is important for the equilibrium and stability of the pedestal of H-mode plasmas is shown to have an expression different from that in the conventional theory. In the limit where ‖U_p,m‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_p,m and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_p,m‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for viscous coefficients that join results in the banana and plateau-Pfirsch-Schluter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
ERIC Educational Resources Information Center
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
The Beginner's Guide to the Bootstrap Method of Resampling.
ERIC Educational Resources Information Center
Lane, Ginny G.
The bootstrap method of resampling can be useful in estimating the replicability of study results. The bootstrap procedure creates a mock population from a given sample of data from which multiple samples are then drawn. The method extends the usefulness of the jackknife procedure as it allows for computation of a given statistic across a maximal…
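The resampling loop described above is short enough to sketch. This is a generic illustration, not code from the guide itself; the sample values and the 2,000-replicate count are arbitrary choices:

```python
import random
import statistics

def bootstrap_statistic(sample, stat=statistics.mean, n_boot=2000, seed=0):
    """Draw n_boot resamples (with replacement, same size as the original
    sample) and return the statistic computed on each resample."""
    rng = random.Random(seed)
    n = len(sample)
    return [stat([sample[rng.randrange(n)] for _ in range(n)])
            for _ in range(n_boot)]

data = [4.1, 5.2, 3.8, 6.0, 4.9, 5.5, 4.4, 5.1]
boots = bootstrap_statistic(data)
se = statistics.stdev(boots)  # bootstrap standard error of the mean
```

The spread of `boots` approximates the sampling distribution of the mean, which is exactly the "mock population" idea the abstract describes.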
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
ERIC Educational Resources Information Center
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
ERIC Educational Resources Information Center
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
ERIC Educational Resources Information Center
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice between parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled method in the nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability for all conditions except the Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also performed better than the alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
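A minimal sketch of the pooled-resampling idea, not the authors' actual implementation: under the null hypothesis of equal means, both groups are resampled from the pooled sample, and the observed t statistic is compared with its bootstrap distribution.

```python
import random, statistics, math

def t_stat(a, b):
    """Welch-type t statistic for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(va / na + vb / nb)

def pooled_bootstrap_t_test(a, b, n_boot=2000, seed=1):
    """Two-sided bootstrap t-test with pooled resampling: both groups are
    resampled from the pooled data (imposing the null of equal means)."""
    rng = random.Random(seed)
    pooled = list(a) + list(b)
    t_obs = abs(t_stat(a, b))
    hits = 0
    for _ in range(n_boot):
        ra = [rng.choice(pooled) for _ in range(len(a))]
        rb = [rng.choice(pooled) for _ in range(len(b))]
        try:
            if abs(t_stat(ra, rb)) >= t_obs:
                hits += 1
        except ZeroDivisionError:  # degenerate resample; count conservatively
            hits += 1
    return (hits + 1) / (n_boot + 1)
```

Because resampling is from the pooled data, even very small groups contribute to a common null distribution, which is the motivation the abstract gives for pooling.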
Roy, Swapnoneel; Thakur, Ashok Kumar
2008-01-01
Genome rearrangements have been modelled by a variety of primitives such as reversals, transpositions, block moves and block interchanges. We consider one such genome rearrangement primitive, strip exchanges. A strip-exchange move interchanges the positions of two chosen strips so that they merge with other strips; the strip exchange problem is to sort a permutation using the minimum number of strip exchanges. We present the first non-trivial 2-approximation algorithm for this problem. We also observe that sorting by strip exchanges is fixed-parameter tractable. Lastly, we discuss an application of strip exchanges in a different area, Optical Character Recognition (OCR), with an example.
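A toy illustration of the primitive (our own sketch; the paper's approximation algorithm is more involved): a strip exchange swaps two substrings of the permutation, and [3, 4, 1, 2] is sorted in a single move.

```python
def strip_exchange(perm, i, j, k, l):
    """Return a copy of perm with the strip perm[i:j] swapped with the
    strip perm[k:l], where 0 <= i < j <= k < l <= len(perm)."""
    assert 0 <= i < j <= k < l <= len(perm)
    return perm[:i] + perm[k:l] + perm[j:k] + perm[i:j] + perm[l:]

# One strip exchange sorts [3, 4, 1, 2]: swap the strips [3, 4] and [1, 2].
sorted_perm = strip_exchange([3, 4, 1, 2], 0, 2, 2, 4)
```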
Determination of GTA Welding Efficiencies
1993-03-01
A method is developed for estimating welding efficiencies for moving arc GTAW processes. (Figures: Miller welding equipment; GTAW torch setup for automatic welding.)
Thermodynamics of a Block Sliding across a Frictional Surface
ERIC Educational Resources Information Center
Mungan, Carl E.
2007-01-01
The following idealized problem is intended to illustrate some basic thermodynamic concepts involved in kinetic friction. A block of mass m is sliding on top of a frictional, flat-topped table of mass M. The table is magnetically levitated, so that it can move without thermal contact and friction across a horizontal floor. The table is initially…
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
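The two bootstrap flavors being compared can be sketched generically. This toy computes a standard error of a sample mean rather than an equipercentile-equating SE, and the normal model in the parametric version is an assumption of the sketch:

```python
import random, statistics

def nonparam_boot_se(data, stat, n_boot=1000, seed=0):
    """Nonparametric bootstrap SE: resample the observed data directly."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    return statistics.stdev(reps)

def param_boot_se(data, stat, n_boot=1000, seed=0):
    """Parametric bootstrap SE: resample from a normal distribution
    fitted to the data (the assumed model in this sketch)."""
    rng = random.Random(seed)
    mu, sd = statistics.mean(data), statistics.stdev(data)
    reps = [stat([rng.gauss(mu, sd) for _ in data]) for _ in range(n_boot)]
    return statistics.stdev(reps)
```

For well-behaved data the two estimates agree closely; they diverge when the parametric model is misspecified, which is what such comparison studies probe.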
Test of bootstrap current models using high- β p EAST-demonstration plasmas on DIII-D
Ren, Qilong; Lao, Lang L.; Garofalo, Andrea M.; ...
2015-01-12
Magnetic measurements together with kinetic profile and motional Stark effect measurements are used in full kinetic equilibrium reconstructions to test the Sauter and NEO bootstrap current models in a DIII-D high-β_p EAST-demonstration experiment. This aims at developing on DIII-D a high bootstrap current scenario to be extended to EAST for a demonstration of true steady state at high performance, using EAST-similar operational conditions: plasma shape, plasma current, toroidal magnetic field, total heating power and current ramp-up rate. It is found that the large edge bootstrap current in these high-β_p plasmas allows the use of magnetic measurements to clearly distinguish the two bootstrap current models. In these high-collisionality, high-β_p plasmas, the Sauter model overpredicts the peak of the edge current density by about 30%, while the first-principles kinetic NEO model is in close agreement with the edge current density of the reconstructed equilibrium. Furthermore, these results are consistent with recent work showing that the Sauter model largely overestimates the edge bootstrap current at high collisionality.
Supervisory Control of Remote Manipulation with Compensation for Moving Target.
1980-07-21
The aim of this project is to evaluate automatic compensation for moving targets … slave control. Operating manipulators in this way is a tiring job and the operator gets exhausted after a short time of work. The use of the computer … The manipulation of moving objects: undersea tasks done by human divers are getting more and more costly and hazardous as they have to be done at
Safety Assessment of TACOM’s Ride Motion Simulator
1990-01-24
… level (1300 to 1800 psi). Step 16. Pressurize the system by moving the main pressure switch to "ON." Wait for the roll, pitch, and yaw error signals … the appropriate seat/shoulder/safety belts and harnesses. Carefully help the test subject dismount. Step 41. Flip the main pressure switch on the … Dismount the test subject. Step 6. Move the main pressure switch to the "OFF" position. This will block any hydraulic flow to the system. Step 7. Move the
Multi-model stereo restitution
Dueholm, K.S.
1990-01-01
Methods are described that permit simultaneous orientation of many small-frame photogrammetric models in an analytical plotter. The multi-model software program enables the operator to move freely between the oriented models during interpretation and mapping. Models change automatically when the measuring mark is moved from one frame to another, moving to the same ground coordinates in the neighboring model. Thus, data collection and plotting can be performed continuously across model boundaries. The orientation of the models is accomplished by a bundle block adjustment. -from Author
Electron transport fluxes in potato plateau regime
NASA Astrophysics Data System (ADS)
Shaing, K. C.; Hazeltine, R. D.
1997-12-01
Electron transport fluxes in the potato plateau regime are calculated from the solutions of the drift kinetic equation and fluid equations. It is found that the bootstrap current density remains finite in the region close to the magnetic axis, although it decreases with increasing collision frequency. This finite amount of the bootstrap current in the relatively collisional regime is important in modeling tokamak startup with 100% bootstrap current.
Bootstrap current in a tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessel, C.E.
1994-03-01
The bootstrap current in a tokamak is examined by implementing the Hirshman-Sigmar model and comparing the predicted current profiles with those from two popular approximations. The dependences of the bootstrap current profile on the plasma properties are illustrated. The implications for steady-state tokamaks are presented through two constraints: the pressure profile must be peaked, and β_p must be kept below a critical value.
The Building Blocks of Life Move from Ground to Tree to Animal and Back to Ground
NASA Astrophysics Data System (ADS)
Davidson, E. A.
2015-12-01
I generally use combinations of big words to describe my science, such as biogeochemistry, ecosystem ecology, nutrient cycling, stoichiometry, tropical deforestation, land-use change, agricultural intensification, eutrophication, greenhouse gas emissions, and sustainable development. I didn't expect to use any of these words, but I was surprised that I couldn't use some others that seem simple enough to me, such as farm, plant, soil, and forest. I landed on "building blocks" as my metaphor for the forms of carbon, nitrogen, phosphorus, and other elements that I study as they cycle through and among ecosystems. I study what makes trees and other kinds of life grow. We all know that they need the sun and that they take up water from the ground, but what else do trees need from the ground? What do animals that eat leaves and wood get from the trees? Just as we need building blocks to grow our bodies, trees and animals also need building blocks for growing their bodies. Trees get part of their building blocks from the ground and animals get theirs from what they eat. When animals poop and when leaves fall, some of their building blocks return to the ground. When they die, their building blocks also go back to the ground. I also study what happens to the ground, the water, and the air when we cut down trees, kill or shoo away the animals, and make fields to grow our food. Can we grow enough food and still keep the ground, water, and air clean? I think the answer is yes, but it will take better understanding of how all of those building blocks fit together and move around, from ground to tree to animal and back to ground.
NASA Technical Reports Server (NTRS)
1995-01-01
A computational fluid dynamics (CFD) analysis has been performed on the aft slot region of the Titan 4 Solid Rocket Motor Upgrade (SRMU). This analysis was performed in conjunction with MSFC structural modeling of the propellant grain to determine if the flow field induced stresses would adversely alter the propellant geometry to the extent of causing motor failure. The results of the coupled CFD/stress analysis have shown that there is a continual increase of flow field resistance at the aft slot due to the aft segment propellant grain being progressively moved radially toward the centerline of the motor port. This 'bootstrapping' effect between grain radial movement and internal flow resistance is conducive to causing a rapid motor failure.
X ray studies of the Hyades cluster
NASA Technical Reports Server (NTRS)
Stern, Robert A.
1993-01-01
The Hyades cluster occupies a unique position in both the history of astronomy and at the frontiers of contemporary astronomical research. At a distance of only 45 pc, the Hyades is the nearest star cluster in the Galaxy which is localized in the sky: the UMa cluster, which is closer, but much sparser, essentially surrounds the Solar neighborhood. The Hyades is the prototype cluster for distance determination using the 'moving-cluster' method, and thus serves to define the zero-age main sequence from which the cosmic distance scale is essentially bootstrapped. The Hyades age (0.6-0.7 Gyr), nearly 8 times younger than the Sun, guarantees the Hyades critical importance to studies of stellar evolution. The results of a complete survey of the Hyades cluster using the ROSAT All Sky Survey (RASS) are reported.
Multi-baseline bootstrapping at the Navy precision optical interferometer
NASA Astrophysics Data System (ADS)
Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.
2014-07-01
The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
Bootstrap and fast wave current drive for tokamak reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehst, D.A.
1991-09-01
Using the multi-species neoclassical treatment of Hirshman and Sigmar, we study steady-state bootstrap equilibria with seed currents provided by low-frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (< 20 MW). However, for larger total currents considerable driving power is required (for ITER: I_0 = 18 MA needs P_FW = 15 MW, P_LH = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs.
NASA Astrophysics Data System (ADS)
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
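The regularized Laplacian kernel itself is compact; here is a generic sketch (the toy graph and the β value are our choices, not the paper's setup), showing that within-cluster affinities dominate cross-cluster ones:

```python
import numpy as np

def regularized_laplacian(A, beta=0.1):
    """Kernel R = (I + beta * L)^(-1), with L = D - A the graph Laplacian.
    The regularization damps contributions from long paths, the mechanism
    used to limit drift in graph-based bootstrapping."""
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.inv(np.eye(len(A)) + beta * L)

# Two triangles {0, 1, 2} and {3, 4, 5} joined by the bridge edge 2-3.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
R = regularized_laplacian(A)
```

Node 0's affinity to its cluster-mate 1 exceeds its affinity to node 4 across the bridge, which is the property that keeps seed-related instances ranked above drifting ones.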
In Search of the Ultimate Building Blocks
NASA Astrophysics Data System (ADS)
't Hooft, Gerard
1996-12-01
An apology; 1. The beginning of the journey to the small: cutting paper; 2. To molecules and atoms; 3. The magic mystery of the quanta; 4. Dazzling velocities; 5. The elementary particle zoo before 1970; 6. Life and death; 7. The crazy kaons; 8. The invisible quarks; 9. Fields or bootstraps?; 10. The Yang-Mills bonanza; 11. Superconducting empty space: the Higgs-Kibble machine; 12. Models; 13. Colouring in the strong forces; 14. The magnetic monopole; 15. Gypsy; 16. The brilliance of the standard model; 17. Anomalies; 18. Deceptive perfection; 19. Weighing neutrinos; 20. The great desert; 21. Technicolor; 22. Grand unification; 23. Supergravity; 24. Eleven dimensional space-time; 25. Attaching the super string; 26. Into the black hole; 27. Theories that do not yet exist … ; 28. Dominance of the rule of the smallest.
ERIC Educational Resources Information Center
Pemberton Roben, Caroline K.; Bass, Anneliese J.; Moore, Ginger A.; Murray-Kolb, Laura; Tan, Patricia Z.; Gilmore, Rick O.; Buss, Kristin A.; Cole, Pamela M.; Teti, Laureen O.
2012-01-01
Infants' emerging ability to move independently by crawling is associated with changes in multiple domains, including an increase in expressions of anger in situations that block infants' goals, but it is unknown whether increased anger is specifically because of experience with being able to move autonomously or simply related to age. To examine…
Confidence limit calculation for antidotal potency ratio derived from lethal dose 50
Manage, Ananda; Petrikovics, Ilona
2013-01-01
AIM: To describe confidence interval calculation for antidotal potency ratios using the bootstrap method. METHODS: The nonparametric bootstrap method invented by Efron can easily be adapted to construct confidence intervals in situations like this. The bootstrap is a resampling method in which the bootstrap samples are obtained by resampling from the original sample. RESULTS: The described confidence interval calculation using the bootstrap method does not require knowing the sampling distribution of the antidotal potency ratio. This can be a substantial help for toxicologists, who are directed to employ the Dixon up-and-down method, with its lower number of animals, to determine lethal dose 50 values for characterizing the investigated toxic molecules and, eventually, the antidotal protection afforded by the test antidotal systems. CONCLUSION: The described method can serve as a useful tool in various other applications. The simplicity of the method makes it easy to do the calculation in most programming software packages. PMID:25237618
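A percentile-bootstrap sketch for a ratio statistic (illustrative only: a ratio of group means stands in for the ratio of LD50 estimates, and the sample values are invented):

```python
import random, statistics

def bootstrap_ratio_ci(treated, control, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for mean(treated) / mean(control), an
    illustrative stand-in for an antidotal potency ratio of two LD50s."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        t = [rng.choice(treated) for _ in treated]
        c = [rng.choice(control) for _ in control]
        ratios.append(statistics.mean(t) / statistics.mean(c))
    ratios.sort()
    lo = ratios[int(n_boot * alpha / 2)]
    hi = ratios[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

As the abstract notes, no sampling distribution for the ratio is assumed; the percentiles of the resampled ratios supply the interval directly.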
Topics in Statistical Calibration
2014-03-27
… on a parametric bootstrap where, instead of sampling directly from the residuals, samples are drawn from a normal distribution. This procedure will … in addition to centering them (Davison and Hinkley, 1997). When there are outliers in the residuals, the bootstrap distribution of x̂0 can become skewed or … based and inversion methods using the linear mixed-effects model. Then, a simple parametric bootstrap algorithm is proposed that can be used to either
Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, Martijn W; van Buuren, Stef; Knol, Dirk L; van Mechelen, Willem; de Vet, Henrica CW
2007-01-01
Background: Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and thus allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection. Methods: In our prospective cohort study we merged data from three different randomized controlled trials (RCTs) to assess prognostic variables for chronicity of low back pain. Among the outcome and prognostic variables, data were missing in the range of 0% to 48.1%. We used four methods to investigate the influence of sampling and imputation variation, respectively: MI only, bootstrap only, and two methods that combine MI and bootstrapping. Variables were selected based on the inclusion frequency of each prognostic variable, i.e. the proportion of times that the variable appeared in the model. The discriminative and calibrative abilities of the prognostic models developed by the four methods were assessed at different inclusion levels. Results: We found that the effect of imputation variation on the inclusion frequency was larger than the effect of sampling variation. When MI and bootstrapping were combined over the range of 0% (full model) to 90% variable selection, bootstrap-corrected c-index values of 0.70 to 0.71 and slope values of 0.64 to 0.86 were found. Conclusion: We recommend accounting for both imputation and sampling variation in data sets with missing values. The new procedure of combining MI with bootstrapping for variable selection results in multivariable prognostic models with good performance and is therefore attractive to apply to data sets with missing values. PMID:17629912
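The inclusion-frequency idea can be sketched as follows. This is a deliberately crude stand-in (single mean imputation instead of multiple imputation, a correlation threshold instead of model-based selection), just to show how selection stability is counted over bootstrap samples:

```python
import numpy as np

def inclusion_frequencies(X, y, n_boot=200, thresh=0.3, seed=0):
    """For each bootstrap sample: mean-impute missing values (a crude
    stand-in for MI), then 'select' predictors whose absolute correlation
    with y exceeds thresh. Returns, per variable, the proportion of
    bootstrap samples in which it was selected."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        Xb, yb = X[idx].copy(), y[idx]
        col_means = np.nanmean(Xb, axis=0)
        nan_r, nan_c = np.where(np.isnan(Xb))
        Xb[nan_r, nan_c] = col_means[nan_c]
        for j in range(p):
            r = np.corrcoef(Xb[:, j], yb)[0, 1]
            if abs(r) > thresh:
                counts[j] += 1
    return counts / n_boot
```

A genuinely predictive variable is selected in nearly every resample, while a noise variable is not, mirroring how the study thresholds inclusion frequency.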
Assessing uncertainties in surface water security: An empirical multimodel approach
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.
2015-11-01
Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources, including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining the Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates for the water security indicators across the multimodel framework and across the uncertainty estimation approaches. The range of values obtained for the water security indicators suggests that the models and methods are robust and perform well in a range of plausible situations. The method is general and can easily be extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
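The block-bootstrap component can be sketched generically (our illustration, not the authors' code): overlapping blocks of the residual series are drawn with replacement and concatenated, preserving autocorrelation at lags shorter than the block length.

```python
import random

def moving_block_bootstrap(series, block_len, seed=0):
    """One moving-block-bootstrap replicate: concatenate randomly chosen
    overlapping blocks of length block_len until the original length is
    reached, preserving within-block serial dependence."""
    rng = random.Random(seed)
    n = len(series)
    starts = range(n - block_len + 1)
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]
```

Repeating this over many replicates and re-running the hydrological simulation on each yields the 95% confidence intervals the abstract describes.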
Characterizing the inverses of block tridiagonal, block Toeplitz matrices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boffi, Nicholas M.; Hill, Judith C.; Reuter, Matthew G.
2014-12-04
We consider the inversion of block tridiagonal, block Toeplitz matrices and comment on the behaviour of these inverses as one moves away from the diagonal. Using matrix Möbius transformations, we first present an O(1) representation (with respect to the number of block rows and block columns) for the inverse matrix and subsequently use this representation to characterize the inverse matrix. There are four symmetry-distinct cases where the blocks of the inverse matrix (i) decay to zero on both sides of the diagonal, (ii) oscillate on both sides, (iii) decay on one side and oscillate on the other and (iv) decay on one side and grow on the other. This characterization exposes the necessary conditions for the inverse matrix to be numerically banded and may also aid in the design of preconditioners and fast algorithms. Finally, we present numerical examples of these matrix types.
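The decaying case (i) is easy to demonstrate numerically. A small sketch with an arbitrarily chosen diagonally dominant example; dense inversion here stands in for the paper's O(1) Möbius-transformation representation:

```python
import numpy as np

def block_tridiag_toeplitz(D, O, nblocks):
    """Assemble a block tridiagonal, block Toeplitz matrix with diagonal
    block D and the same off-diagonal block O above and below."""
    b = D.shape[0]
    A = np.zeros((b * nblocks, b * nblocks))
    for i in range(nblocks):
        A[i*b:(i+1)*b, i*b:(i+1)*b] = D
        if i + 1 < nblocks:
            A[i*b:(i+1)*b, (i+1)*b:(i+2)*b] = O
            A[(i+1)*b:(i+2)*b, i*b:(i+1)*b] = O
    return A

D = 4 * np.eye(2)   # diagonally dominant diagonal block
O = -np.eye(2)      # off-diagonal block
A = block_tridiag_toeplitz(D, O, 8)
Ainv = np.linalg.inv(A)
# Norms of the first block row of the inverse: they decay geometrically
# as one moves away from the diagonal.
decay = [np.linalg.norm(Ainv[0:2, 2*k:2*k+2]) for k in range(8)]
```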
Severe Weather Guide - Mediterranean Ports. 7. Marseille
1988-03-01
… the afternoon. Upper-level westerlies and the associated storm track move northward during summer, so extratropical cyclones and associated … autumn as the extratropical storm track moves southward. Precipitation amount is the highest of the year, with an average of 3 inches (76 mm) for the … Keywords: storm haven, Mediterranean meteorology, Marseille port
Symmetric Missile Dynamic Instabilities - A Review
1980-03-01
… and a Magnus side moment must be added to the total aerodynamic moment. Since statically stable missiles are usually spun to reduce the effect of … Keywords: symmetric missile, roll moment, resonance, spin, dynamic stability, side moment, damping moment, trim moment, Magnus moment … damping moments for nonspinning re-entry vehicles, nonlinear Magnus moments for spinning missiles, and internal resonances with moving payload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, L; Lee, H; Wang, J
2014-06-01
Purpose: To evaluate a moving-blocker-based approach for estimating and correcting megavoltage (MV) and kilovoltage (kV) scatter contamination in kV cone-beam computed tomography (CBCT) acquired during volumetric modulated arc therapy (VMAT). Methods: XML code was generated to enable concurrent CBCT acquisition and VMAT delivery in Varian TrueBeam developer mode. A physical attenuator (i.e., "blocker") consisting of equally spaced lead strips (3.2 mm strip width and 3.2 mm gap in between) was mounted between the x-ray source and patient at a source-to-blocker distance of 232 mm. The blocker moved back and forth along the gantry rotation axis during the CBCT acquisition. Both the MV and kV scatter signals were estimated simultaneously from the blocked regions of the imaging panel and interpolated into the unblocked regions. Scatter-corrected CBCT was then reconstructed from unblocked projections after scatter subtraction, using an iterative image reconstruction algorithm based on constrained optimization. Experimental studies were performed on a Catphan 600 phantom and an anthropomorphic pelvis phantom to demonstrate the feasibility of using a moving blocker for MV-kV scatter correction. Results: MV scatter greatly degrades CBCT image quality by increasing CT number inaccuracy and decreasing image contrast, in addition to the shading artifacts caused by kV scatter. The artifacts were substantially reduced in the moving-blocker-corrected CBCT images of both the Catphan and pelvis phantoms. Quantitatively, the CT number error in selected regions of interest was reduced from 377 in the kV-MV contaminated CBCT image to 38 for the Catphan phantom. Conclusions: The moving-blocker-based strategy can successfully correct MV and kV scatter simultaneously in CBCT projection data acquired with concurrent VMAT delivery.
This work was supported in part by a grant from the Cancer Prevention and Research Institute of Texas (RP130109) and a grant from the American Cancer Society (RSG-13-326-01-CCE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, X; Ouyang, L; Jia, X
Purpose: A moving-blocker-based strategy has shown promising results for scatter correction in cone-beam computed tomography (CBCT). Different geometry designs and moving speeds of the blocker affect its performance in image reconstruction accuracy. The goal of this work is to optimize the geometric design and moving speed of the moving-blocker system through experimental evaluations. Methods: An Elekta Synergy XVI system and an anthropomorphic pelvis phantom (CIRS 801-P) were used for our experiment. A blocker consisting of lead strips was inserted between the x-ray source and the phantom, moving back and forth along the rotation axis to measure the scatter signal. According to our Monte Carlo simulation results, three blockers were used, all with the same lead strip width of 3.2 mm and with different gaps between neighboring lead strips: 3.2, 6.4 and 9.6 mm. For each blocker, three moving speeds were evaluated: 10, 20 and 30 pixels per projection (on the detector plane). The scatter signal in the unblocked region was estimated by cubic B-spline interpolation from the blocked region. The CBCT image was reconstructed by a total variation (TV) based algebraic iterative reconstruction (ART) algorithm from the partially blocked projection data. Reconstruction accuracy in each condition is quantified as the CT number error of regions of interest (ROIs), compared to a CBCT image reconstructed from analytically simulated unblocked and scatter-free projection data. Results: The highest reconstruction accuracy is achieved when the blocker strip width is 3.2 mm, the gap between neighboring lead strips is 9.6 mm and the moving speed is 20 pixels per projection. The RMSE of the CT number of the ROIs can be reduced from 436 to 27. Conclusions: Image reconstruction accuracy is greatly affected by the geometry design of the blocker. The moving speed does not have a strong effect on the reconstruction result if it is over 20 pixels per projection.
Digital image modification detection using color information and its histograms.
Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na
2016-09-01
The rapid development of many open source and commercial image editing programs makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve robustness against JPEG compression, blurring, noise, or other types of post-processing operations, which are frequently applied with the intention of concealing tampering and reducing tampering clues. A robust method based on color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distribution; blocks from the tampered regions will reside within the same cluster, since both the copied and the moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image was distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring. Copyright © 2016. Published by Elsevier Ireland Ltd.
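A stripped-down sketch of the block-matching stage (grayscale mean and standard deviation replace the paper's five descriptors and color moments, and a brute-force pair search replaces the clustering step; all names are ours):

```python
import numpy as np

def block_moment_features(img, bs=8):
    """Slide a bs x bs window over a grayscale image and return, per block,
    (row, col, mean, std): simplified stand-ins for block descriptors."""
    feats = []
    H, W = img.shape
    for r in range(H - bs + 1):
        for c in range(W - bs + 1):
            blk = img[r:r + bs, c:c + bs]
            feats.append((r, c, blk.mean(), blk.std()))
    return feats

def find_duplicate_blocks(feats, tol=1e-6, min_dist=8):
    """Flag pairs of spatially separated blocks with matching moments,
    the signature of a copy-move operation."""
    pairs = []
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            r1, c1, m1, s1 = feats[i]
            r2, c2, m2, s2 = feats[j]
            if (abs(m1 - m2) < tol and abs(s1 - s2) < tol
                    and abs(r1 - r2) + abs(c1 - c2) >= min_dist):
                pairs.append(((r1, c1), (r2, c2)))
    return pairs

rng = np.random.default_rng(0)
img = rng.random((24, 24))
img[0:8, 16:24] = img[0:8, 0:8]  # simulate a copy-move forgery
pairs = find_duplicate_blocks(block_moment_features(img))
```

Richer descriptors and a looser tolerance are what buy robustness to compression and noise in the actual method.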
Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen
2010-08-01
An important problem in the Intensive Care is how to predict on a given day of stay the eventual hospital mortality for a specific patient. A recent approach to solve this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior over a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods that build prognostic models by temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care. Our results from applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
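The .632 bootstrap mentioned above combines the apparent (resubstitution) error with the out-of-bag error of models refit on bootstrap resamples. A minimal sketch follows, using a deliberately trivial majority-class classifier and invented data rather than the FTS-based method of the paper:

```python
import numpy as np

def bootstrap_632(X, y, fit, predict, n_boot=200, rng=None):
    """Estimate prediction error with the .632 bootstrap:
    err_632 = 0.368 * apparent error + 0.632 * out-of-bag error."""
    rng = np.random.default_rng(rng)
    n = len(y)
    model = fit(X, y)
    err_app = np.mean(predict(model, X) != y)   # apparent (resubstitution) error
    oob_errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)             # sample n cases with replacement
        oob = np.setdiff1d(np.arange(n), idx)   # cases left out of this resample
        if oob.size == 0:
            continue
        m = fit(X[idx], y[idx])
        oob_errs.append(np.mean(predict(m, X[oob]) != y[oob]))
    return 0.368 * err_app + 0.632 * np.mean(oob_errs)

# Toy inductive method: predict the majority class of the training labels.
fit = lambda X, y: int(round(y.mean()))
predict = lambda model, X: np.full(len(X), model)

X = np.arange(20).reshape(-1, 1)
y = np.array([0] * 12 + [1] * 8)
err = bootstrap_632(X, y, fit, predict, rng=0)
```

The same skeleton accommodates any inductive method, including one that rediscovers temporal patterns inside each resample, which is the point the authors make about evaluating the method rather than a single induced model.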
Elkomy, Mohammed H; Elmenshawe, Shahira F; Eid, Hussein M; Ali, Ahmed M A
2016-11-01
This work aimed at investigating the potential of solid lipid nanoparticles (SLN) as carriers for topical delivery of Ketoprofen (KP); evaluating a novel technique incorporating Artificial Neural Network (ANN) and clustered bootstrap for optimization of KP-loaded SLN (KP-SLN); and demonstrating a longitudinal dose response (LDR) modeling-based approach to compare the activity of topical non-steroidal anti-inflammatory drug formulations. KP-SLN was fabricated by a modified emulsion/solvent evaporation method. Box-Behnken design was implemented to study the influence of glycerylpalmitostearate-to-KP ratio, Tween 80, and lecithin concentrations on particle size, entrapment efficiency, and amount of drug permeated through rat skin in 24 hours. Following clustered bootstrap ANN optimization, the optimized KP-SLN was incorporated into an aqueous gel and evaluated for rheology, in vitro release, permeability, skin irritation and in vivo activity using carrageenan-induced rat paw edema model and LDR mathematical model to analyze the time course of anti-inflammatory effect at various application durations. Lipid-to-drug ratio of 7.85 [bootstrap 95%CI: 7.63-8.51], Tween 80 of 1.27% [bootstrap 95%CI: 0.601-2.40%], and Lecithin of 0.263% [bootstrap 95%CI: 0.263-0.328%] were predicted to produce optimal characteristics. Compared with profenid® gel, the optimized KP-SLN gel exhibited slower release, faster permeability, better texture properties, greater efficacy, and similar potency. SLNs are safe and effective permeation enhancers. ANN coupled with clustered bootstrap is a useful method for finding optimal solutions and estimating uncertainty associated with them. LDR models allow mechanistic understanding of comparative in vivo performances of different topical formulations, and help design efficient dermatological bioequivalence assessment methods.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking into account these aspects by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on reusing security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping and specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship between the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity to the underlying diffusion profile. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
A neural network based reputation bootstrapping approach for service selection
NASA Astrophysics Data System (ADS)
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One of the limitations of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviours exists. Most of the current bootstrapping approaches merely assign default reputation values to newcomers. However, with this kind of method, either newcomers or existing services will be favoured. In this paper, we present a novel reputation bootstrapping approach, where correlations between features and performance of existing services are learned through an artificial neural network (ANN) and are then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping if available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Finally, empirical studies of the proposed approach are presented.
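A rough sketch of the bootstrapping idea (learn a feature-to-performance mapping from existing services, then apply it to a newcomer) follows, using a tiny hand-rolled one-hidden-layer network on synthetic data. The features, the reputation function, and the network shape are all invented for illustration, not taken from the paper:

```python
import numpy as np

def train_mlp(X, y, hidden=8, epochs=3000, lr=0.1, rng=None):
    """Tiny one-hidden-layer regression network (sigmoid hidden layer,
    linear output, full-batch gradient descent on squared error)."""
    rng = np.random.default_rng(rng)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden); b2 = 0.0
    for _ in range(epochs):
        h = 1 / (1 + np.exp(-(X @ W1 + b1)))
        pred = h @ W2 + b2
        g = (pred - y) / len(y)                  # d(mean squared error)/d(pred)
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
        gh = np.outer(g, W2) * h * (1 - h)       # backprop through the sigmoid
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)
    return lambda Xn: (1 / (1 + np.exp(-(Xn @ W1 + b1)))) @ W2 + b2

# Hypothetical service features (e.g. price, response time, ...) mapped to
# an observed reputation in [0, 1]; both are simulated.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = 0.2 + 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.02, 200)
predict = train_mlp(X, y, rng=1)
rmse = float(np.sqrt(np.mean((predict(X) - y) ** 2)))
newcomer = np.array([[0.9, 0.8, 0.1]])
rep0 = predict(newcomer)[0]   # tentative reputation for the new service
```

The newcomer never receives a flat default value; its tentative reputation is generalised from how similarly featured existing services performed.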
Using Cluster Bootstrapping to Analyze Nested Data With a Few Clusters.
Huang, Francis L
2018-04-01
Cluster randomized trials involving participants nested within intact treatment and control groups are commonly performed in various educational, psychological, and biomedical studies. However, recruiting and retaining intact groups present various practical, financial, and logistical challenges to evaluators, and often cluster randomized trials are performed with a low number of clusters (~20 groups). Although multilevel models are often used to analyze nested data, researchers may be concerned about potentially biased results due to having only a few groups under study. Cluster bootstrapping has been suggested as an alternative procedure for analyzing clustered data, though it has seen very little use in educational and psychological studies. Using a Monte Carlo simulation that varied the number of clusters, average cluster size, and intraclass correlations, we compared standard errors using cluster bootstrapping with those derived using ordinary least squares regression and multilevel models. Results indicate that cluster bootstrapping, though more computationally demanding, can be used as an alternative procedure for the analysis of clustered data when treatment effects at the group level are of primary interest. Supplementary material showing how to perform cluster bootstrapped regressions using R is also provided.
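A minimal cluster bootstrap resamples intact clusters with replacement and refits the regression on each resample; the standard error is the spread of the refit estimates. The sketch below is in Python rather than the R supplement the abstract mentions, and the nested data (10 clusters of 20, random intercepts) are assumptions for illustration:

```python
import numpy as np

def cluster_bootstrap_se(x, y, clusters, n_boot=300, rng=None):
    """Cluster bootstrap SE for an OLS slope: resample whole clusters
    with replacement, refit, and take the SD of the slope estimates."""
    rng = np.random.default_rng(rng)
    ids = np.unique(clusters)
    slopes = []
    for _ in range(n_boot):
        chosen = rng.choice(ids, size=len(ids), replace=True)
        xb = np.concatenate([x[clusters == c] for c in chosen])
        yb = np.concatenate([y[clusters == c] for c in chosen])
        slopes.append(np.polyfit(xb, yb, 1)[0])   # refit OLS on the resample
    return np.std(slopes, ddof=1)

# Simulated nested data: 10 clusters of 20 with random intercepts.
rng = np.random.default_rng(1)
clusters = np.repeat(np.arange(10), 20)
u = rng.normal(0, 1, 10)[clusters]          # cluster random effect
x = rng.normal(size=200)
y = 2.0 + 0.5 * x + u + rng.normal(0, 1, 200)
se = cluster_bootstrap_se(x, y, clusters, rng=2)
```

Resampling clusters rather than individual observations is what preserves the within-cluster dependence that ordinary bootstrapping (and naive OLS standard errors) would ignore.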
Interferometric step gauge for CMM verification
NASA Astrophysics Data System (ADS)
Hemming, B.; Esala, V.-P.; Laukkanen, P.; Rantanen, A.; Viitala, R.; Widmaier, T.; Kuosmanen, P.; Lassila, A.
2018-07-01
The verification of the measurement capability of coordinate measuring machines (CMM) is usually performed using gauge blocks or step gauges as reference standards. Gauge blocks and step gauges are robust and easy to use, but have some limitations such as finite lengths and uncertainty of thermal expansion. This paper describes the development, testing and uncertainty evaluation of an interferometric step gauge (ISG) for CMM verification. The idea of the ISG is to move a carriage bearing a gauge block along a rail and to measure the position with an interferometer. For a displacement of 1 m the standard uncertainty of the position of the gauge block is 0.2 µm. A short range periodic error of CMM can also be detected.
... Pieces of the tumor can move to the brain, eye, or limbs. If the tumor grows inside the heart, it can block blood flow. This may require emergency ... myxoma References Lenihan DJ, Yusuf SW. Tumors ...
Fish tracking by combining motion based segmentation and particle filtering
NASA Astrophysics Data System (ADS)
Bichot, E.; Mascarilla, L.; Courtellemont, P.
2006-01-01
In this paper, we suggest a new importance sampling scheme to improve a particle filtering based tracking process. This scheme relies on the exploitation of motion segmentation. More precisely, we propagate hypotheses from particle filtering to blobs whose motion is similar to the target's. Hence, search is driven toward regions of interest in the state space and prediction is more accurate. We also propose to exploit segmentation to update the target model. Once the moving target has been identified, a representative model is learnt from its spatial support. We refer to this model in the correction step of the tracking process. The importance sampling scheme and the strategy to update the target model improve the performance of particle filtering in complex situations of occlusions compared to a simple bootstrap approach, as shown by our experiments on real fish tank sequences.
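The "simple bootstrap approach" the authors compare against is the bootstrap particle filter: predict particles from the dynamics, weight them by the observation likelihood, then resample. A self-contained sketch on a synthetic 1-D tracking problem (not the fish-tracking setup itself) looks like this:

```python
import numpy as np

def bootstrap_pf(obs, n_particles=500, q=0.5, r=1.0, rng=None):
    """Minimal bootstrap particle filter for a 1-D random-walk state
    observed in Gaussian noise: predict from the dynamics, weight by
    the likelihood, resample."""
    rng = np.random.default_rng(rng)
    parts = rng.normal(0, 1, n_particles)
    means = []
    for z in obs:
        parts = parts + rng.normal(0, q, n_particles)    # predict step
        w = np.exp(-0.5 * ((z - parts) / r) ** 2)        # likelihood weights
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)  # resample step
        parts = parts[idx]
        means.append(parts.mean())
    return np.array(means)

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0, 0.5, 50))   # hidden 1-D trajectory
obs = truth + rng.normal(0, 1.0, 50)        # noisy observations
est = bootstrap_pf(obs, rng=1)
rmse = float(np.sqrt(np.mean((est - truth) ** 2)))
```

The paper's contribution replaces the blind predict step with an importance proposal that pushes particles toward motion-segmented blobs; the weight/resample machinery stays the same.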
Severe Weather Guide - Mediterranean Ports. 4. Augusta Bay
1988-03-01
the year. The track of strong extratropical storms has moved northward and poses little threat to Augusta Bay. Sea breezes are daily occurrences...as temperatures begin to moderate. Extratropical systems begin to transit Europe as the storm track moves southward in advance of the winter...Subject terms: storm haven, Mediterranean meteorology, Augusta Bay
Universal RCFT correlators from the holomorphic bootstrap
NASA Astrophysics Data System (ADS)
Mukhi, Sunil; Muralidhara, Girish
2018-02-01
We elaborate and extend the method of Wronskian differential equations for conformal blocks to compute four-point correlation functions on the plane for classes of primary fields in rational (and possibly more general) conformal field theories. This approach leads to universal differential equations for families of CFTs and provides a very simple re-derivation of the BPZ results for the degenerate fields ϕ 1,2 and ϕ 2,1 in the c < 1 minimal models. We apply this technique to compute correlators for the WZW models corresponding to the Deligne-Cvitanović exceptional series of Lie algebras. The application turns out to be subtle in certain cases where there are multiple decoupled primaries. The power of this approach is demonstrated by applying it to compute four-point functions for the Baby Monster CFT, which does not belong to any minimal series.
Design of a steganographic virtual operating system
NASA Astrophysics Data System (ADS)
Ashendorf, Elan; Craver, Scott
2015-03-01
A steganographic file system is a secure file system whose very existence on a disk is concealed. Customarily, these systems hide an encrypted volume within unused disk blocks, slack space, or atop conventional encrypted volumes. These file systems are far from undetectable, however: aside from their ciphertext footprint, they require a software or driver installation whose presence can attract attention and then targeted surveillance. We describe a new steganographic operating environment that requires no visible software installation, launching instead from a concealed bootstrap program that can be extracted and invoked with a chain of common Unix commands. Our system conceals its payload within innocuous files that typically contain high-entropy data, producing a footprint that is far less conspicuous than existing methods. The system uses a local web server to provide a file system, user interface and applications through a web architecture.
Effective Light Directed Assembly of Building Blocks with Microscale Control.
Dinh, Ngoc-Duy; Luo, Rongcong; Christine, Maria Tankeh Asuncion; Lin, Weikang Nicholas; Shih, Wei-Chuan; Goh, James Cho-Hong; Chen, Chia-Hung
2017-06-01
Light-directed forces have been widely used to pattern micro/nanoscale objects with precise control, forming functional assemblies. However, a substantial laser intensity is required to generate sufficient optical gradient forces to move a small object in a certain direction, causing limited throughput for applications. A high-throughput light-directed assembly is demonstrated as a printing technology by introducing gold nanorods to induce thermal convection flows that move microparticles (diameter = 40 µm to several hundreds of micrometers) to specific light-guided locations, forming desired patterns. With the advantage of effective light-directed assembly, the microfluidic-fabricated monodispersed biocompatible microparticles are used as building blocks to construct a structured assembly (≈10 cm scale) in ≈2 min. Control with microscale precision is achieved by changing the size of the laser light spot. After crosslinking the assembled building blocks, a novel soft material with the desired pattern is obtained. To demonstrate its application, mesenchymal stem-cell-seeded hydrogel microparticles are prepared as functional building blocks to construct scaffold-free tissues with desired structures. This light-directed fabrication method can be applied to integrate different building units, enabling the bottom-up formation of materials with precise control over their internal structure for bioprinting, tissue engineering, and advanced manufacturing. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Miehls, Scott M.; Johnson, Nicholas S.; Hrodey, Pete J.
2017-01-01
Control of the invasive Sea Lamprey Petromyzon marinus is critical for management of commercial and recreational fisheries in the Laurentian Great Lakes. Use of physical barriers to block Sea Lampreys from spawning habitat is a major component of the control program. However, the resulting interruption of natural streamflow and blockage of nontarget species present substantial challenges. Development of an effective nonphysical barrier would aid the control of Sea Lampreys by eliminating their access to spawning locations while maintaining natural streamflow. We tested the effect of a nonphysical barrier consisting of strobe lights, low-frequency sound, and a bubble screen on the movement of Sea Lampreys in an experimental raceway designed as a two-choice maze with a single main channel fed by two identical inflow channels (one control and one blocked). Sea Lampreys were more likely to move upstream during trials when the strobe light and low-frequency sound were active compared with control trials and trials using the bubble screen alone. For those Sea Lampreys that did move upstream to the confluence of inflow channels, no combination of stimuli or any individual stimulus significantly influenced the likelihood that Sea Lampreys would enter the blocked inflow channel, enter the control channel, or return downstream.
30 CFR 250.1604 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... moving equipment such as a well-drilling, well-completion, or well-workover rig or associated equipment..., platform age, and previous stresses. (f) Traveling-block safety device. All drilling units being used for...
30 CFR 250.1604 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... moving equipment such as a well-drilling, well-completion, or well-workover rig or associated equipment..., platform age, and previous stresses. (f) Traveling-block safety device. All drilling units being used for...
30 CFR 250.1604 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... moving equipment such as a well-drilling, well-completion, or well-workover rig or associated equipment..., platform age, and previous stresses. (f) Traveling-block safety device. All drilling units being used for...
Genetics Home Reference: cystinosis
... the amino acid cystine (a building block of proteins) within cells. Excess cystine damages cells and often ... gene lead to a deficiency of a transporter protein called cystinosin. Within cells, this protein normally moves ...
Mechanistic solutions to the opening of the Gulf of Mexico
Schouten, Hans; Klitgord, Kim D.
1994-01-01
Two mechanistic models, unlike the traditional plate-tectonic landfill models used for most proposed Pangea reconstructions of the Yucatán block, relate the Mesozoic opening of the Gulf of Mexico directly to the movement of the North and South American plates: (1) a previous piggyback model in which Yucatán moves with South America out of the western gulf and (2) a new edge-driven model in which the motion of the Yucatán block is caused by forces applied to its margins by the movement of the North and South American plates. In the second model, Yucatán moves out of the northern Gulf of Mexico as a gear or roller bearing. On the basis of magnetic edge anomalies around the gulf, this edge-driven model predicts that from the Bathonian to Tithonian (~170 to ~150 Ma), Yucatán was rotated ~60° counterclockwise as a rigid block between North and South America, with rift propagation and extension occurring simultaneously in the Gulf of Mexico and Yucatán Basin.
Bootstrap investigation of the stability of a Cox regression model.
Altman, D G; Andersen, P K
1989-07-01
We describe a bootstrap investigation of the stability of a Cox proportional hazards regression model resulting from the analysis of a clinical trial of azathioprine versus placebo in patients with primary biliary cirrhosis. We have considered stability to refer both to the choice of variables included in the model and, more importantly, to the predictive ability of the model. In stepwise Cox regression analyses of 100 bootstrap samples using 17 candidate variables, the most frequently selected variables were those selected in the original analysis, and no other important variable was identified. Thus there was no reason to doubt the model obtained in the original analysis. For each patient in the trial, bootstrap confidence intervals were constructed for the estimated probability of surviving two years. It is shown graphically that these intervals are markedly wider than those obtained from the original model.
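The bootstrap confidence intervals for two-year survival can be illustrated with a generic percentile bootstrap. The data below are invented, and the statistic is a simple survival proportion rather than a prediction from a fitted Cox model:

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic,
    and take the alpha/2 and 1 - alpha/2 quantiles."""
    rng = np.random.default_rng(rng)
    n = len(data)
    stats = np.array([stat(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Hypothetical cohort: 1 = survived two years, 0 = did not.
surv = np.array([1] * 70 + [0] * 30)
lo, hi = bootstrap_ci(surv, np.mean, rng=0)
```

In the paper the resampled statistic is the model-predicted survival probability per patient, which is why the intervals also reflect model-selection instability, not just sampling noise.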
NASA Astrophysics Data System (ADS)
Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.
2018-04-01
This article addresses the problem of productively forming a set of informative measured features of an object of observation and/or control, using the authors' algorithms based on bootstrap and counter-bootstrap technologies to process measurements of the object's various states, given training samples of different sizes. The work presented in this paper considers aggregation by specific indicators of informative capacity using linear, majority, logical, and "greedy" methods, applied both individually and in combination. The results of a computational experiment are discussed, and it is concluded that the proposed methods increase the efficiency of classifying the object's states from measurement results.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and is still in expansion. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing a good performance. The algorithm Boot.EXPOS, combining exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. In the case of more than one seasonal pattern, the double seasonal Holt-Winters methods and the exponential smoothing methods were developed. A new challenge was now to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership will be illustrated for some well-known data sets available in the software.
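The Boot.EXPOS idea (fit an exponential smoothing model, then bootstrap its residuals to build forecast paths) can be sketched roughly as follows. This uses simple exponential smoothing and a simplified path reconstruction on simulated seasonal data, not the actual Boot.EXPOS algorithm:

```python
import numpy as np

def ses(series, alpha=0.3):
    """Simple exponential smoothing; returns fitted levels and the
    one-step-ahead residuals."""
    level, fitted = series[0], [series[0]]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        fitted.append(level)
    fitted = np.array(fitted)
    resid = series[1:] - fitted[:-1]   # y_{t+1} minus level at t
    return fitted, resid

def boot_forecast(series, h=6, n_boot=500, alpha=0.3, rng=None):
    """Boot.EXPOS-style sketch: fit SES, resample the residuals with
    replacement, and rebuild h-step-ahead forecast paths."""
    rng = np.random.default_rng(rng)
    fitted, resid = ses(series, alpha)
    paths = fitted[-1] + np.cumsum(
        rng.choice(resid, size=(n_boot, h), replace=True), axis=1)
    return paths.mean(axis=0), np.quantile(paths, [0.05, 0.95], axis=0)

rng = np.random.default_rng(3)
t = np.arange(120)
series = 10 + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 120)
mean_fc, (lo, hi) = boot_forecast(series, h=6, rng=4)
```

The resampled paths give a full forecast distribution, so prediction intervals come for free instead of relying on a Gaussian error assumption.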
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set-size (x) levels, in order to provide guidance on the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different values of x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different to the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
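The set-size question can be explored with a small resampling experiment: draw subsets of size x from a normative cohort and watch how stable the estimated 5th-percentile limit is. The cohort below is simulated, not the HFA data:

```python
import numpy as np

def percentile_stability(cohort, set_sizes, n_resamples=1000, q=5, rng=None):
    """For each set size x, bootstrap x subjects (with replacement) and
    record the spread (SD) of the estimated q-th percentile limit."""
    rng = np.random.default_rng(rng)
    out = {}
    for x in set_sizes:
        est = [np.percentile(rng.choice(cohort, x, replace=True), q)
               for _ in range(n_resamples)]
        out[x] = float(np.std(est))
    return out

# Hypothetical normative cohort of 500 sensitivity values (dB).
rng = np.random.default_rng(0)
cohort = rng.normal(30, 2, 500)
spread = percentile_stability(cohort, [30, 60, 150, 300], rng=1)
```

As the spread stops shrinking meaningfully with larger x, adding more subjects buys little, which is the logic behind the x = 150 and x = 60 figures quoted above.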
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and they require specialised statistical methods to properly account for their particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originated from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
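The key device is an imputation step inside each bootstrap resample, so that nondetect uncertainty is propagated rather than fixed once. A much-simplified sketch follows (uniform imputation below the detection limit on untransformed data, rather than the model-based, log-ratio imputation the paper develops):

```python
import numpy as np

def nondetect_bootstrap(values, dl, stat, n_boot=1000, rng=None):
    """Bootstrap with an imputation step for nondetects: values recorded
    as NaN were below the detection limit dl; each resample imputes them
    stochastically (here, uniform on (0, dl), a simplifying assumption)
    before computing the statistic."""
    rng = np.random.default_rng(rng)
    n = len(values)
    stats = []
    for _ in range(n_boot):
        sample = values[rng.integers(0, n, n)].copy()  # resample with replacement
        nd = np.isnan(sample)
        sample[nd] = rng.uniform(0, dl, nd.sum())      # imputation step
        stats.append(stat(sample))
    return float(np.mean(stats)), float(np.std(stats, ddof=1))

# Hypothetical concentrations with a detection limit of 0.5; NaN = nondetect.
conc = np.array([1.2, 0.8, np.nan, 2.1, np.nan, 1.5, 0.9, 3.0, np.nan, 1.1])
mean_est, se = nondetect_bootstrap(conc, dl=0.5, stat=np.mean, rng=0)
```

Because a fresh imputation is drawn in every resample, the reported standard error reflects both sampling variability and the uncertainty about what the censored values were.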
NASA Astrophysics Data System (ADS)
Gigli, G.; Casagli, N.; Lombardi, L.; Nocentini, M.; Balducci, M.; Venanti, L.
2009-04-01
In the past few years the Maiolica (micritic limestone) quarry of Torgiovannetto (Perugia, Italy) has suffered an increasing number of rockfalls. The rock mass has loosened progressively and a perimetral crack longer than 100 meters has appeared. The huge block bounded by this crack, two lateral discontinuities and a stratigraphic layer threatens two roads at the base of the slope. Since these are very important and busy traffic routes, the Department of Earth Sciences of the University of Firenze performed magnitude estimations and runout analyses regarding two different aspects: 1) investigating the trajectories of single falling blocks; and 2) forecasting the runout distance and the debris intensity distribution in case a large rockslide occurs. The magnitude of a landslide is actually the most important input parameter for correctly estimating the trajectory, the runout distance and the kinetic energy of a landslide. A detailed and updated knowledge of the actual morphological conditions is a good starting point for defining as accurately as possible the extent of a moving block. Due to the very high urgency and precision required, a detailed survey of the quarry area has been performed by means of a High Accuracy & Long Range 3D laser scanner (RIEGL LMS-Z420i). In order to avoid shadow zones and to obtain a comprehensive digital elevation model of the quarry area, a total of more than 30 million points were taken from three different scan positions. The resulting point cloud was dense enough to reveal the main structural features of the rock mass, including the discontinuities bounding the moving block, which has a calculated volume of 180 000 m3. With the aim of confirming the block volume and assessing the deformational field of the moving mass, a multitemporal ground-based interferometric SAR survey was performed.
The results of the survey precisely confirm the geometry of the unstable block and also indicate that the displacements decrease from E to W, due to the greater lateral friction in the western portion of the wedge. This deformational behaviour has been confirmed by a wireless real-time monitoring system installed for failure-time forecasting. Laboratory tests and stability analyses of the unstable wedge allowed us to hypothesize a sudden and brittle failure behaviour, which can be associated with a long runout distance. Both empirical (energy line approach) and numerical methods (the DAN-W and DAN3D software) were employed for estimating the runout distance and debris intensity distribution associated with the failure of the main block. The results of this analysis indicate that the potential rockslide will likely reach the nearest road. The estimated velocity, debris depth, and kinetic energy of the moving mass can be used to design defensive structures at the base of the artificial slope.
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
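The nonparametric bootstrap described above can be sketched in a few lines. The example below is a minimal, generic illustration of a percentile bootstrap confidence interval for a sample mean, using synthetic skewed data; it is not the authors' DBP analysis, only the core resampling idea:

```python
import numpy as np

def percentile_bootstrap_ci(data, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric percentile bootstrap CI: resample the data with
    replacement, recompute the statistic, and take empirical quantiles."""
    rng = np.random.default_rng(seed)
    n = len(data)
    boot_stats = np.array([
        stat(rng.choice(data, size=n, replace=True)) for _ in range(n_boot)
    ])
    lo, hi = np.quantile(boot_stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Example: CI for the mean of a skewed (lognormal) sample.
sample = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.5, size=50)
lo, hi = percentile_bootstrap_ci(sample)
```

Because the interval is read directly off the empirical quantiles of the resampled statistics, no normality or other parametric distributional assumption is needed, which is the point the abstract makes about substituting empirical sampling for parametric modeling.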
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths, in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement model, and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of the 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable, because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Furthermore, the statistics do not share the same critical range for item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons, as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
1993-09-10
Fragment of a report reference list: Baek, J., H. L. Gray, W. A. Woodward, and M. D. Fisk (1993). A Bootstrap Generalized Likelihood Ratio Test in Discriminant Analysis, Proc. 15th Annual Seismic Research Symposium, in press. The surrounding text notes that large values of the likelihood ratio indicate that the event does not belong to the first class, and that the bootstrap technique is used here as well to set the critical value of the test.
ERIC Educational Resources Information Center
Bremner, J. Gavin; Andreasen, Gillian
1997-01-01
Had children draw two blocks arranged in depth, and then moved either child or array and had children draw what was then a left-right arrangement; the transformation was then reversed for a final drawing. Found that when children moved to a new standpoint, there was a significant increase in vertical portrayal (as depth portrayal) between first…
Locking apparatus for gate valves
Fabyan, J.; Williams, C.W.
A locking apparatus is described for fluid-operated valves in which a piston, connected to the valve actuator, moves in response to applied pressure within a cylinder housing having a cylinder head; a catch block is secured to the piston, and the cylinder head incorporates a catch pin. Pressure applied to the cylinder to open the valve moves the piston adjacent to the cylinder head, where the catch pin automatically engages the catch block, preventing further movement of the piston or premature closure of the valve. Application of pressure to the cylinder to close the valve retracts the catch pin, allowing the valve to close. Also included are one or more selector valves for directing pressure to other apparatus depending on the gate valve position, open or closed, protecting such apparatus from damage due to premature closing caused by pressure loss or operational error.
Uddameri, Venkatesh; Singaraju, Sreeram; Hernandez, E Annette
2018-02-21
Seasonal and cyclic trends in nutrient concentrations at four agricultural drainage ditches were assessed using a dataset generated from a multivariate, multiscale, multiyear water quality monitoring effort in the agriculturally dominant Lower Rio Grande Valley (LRGV) River Watershed in South Texas. An innovative bootstrap sampling-based power analysis procedure was developed to evaluate the ability of Mann-Whitney and Noether tests to discern trends and to guide future monitoring efforts. The Mann-Whitney U test was able to detect significant changes between summer and winter nutrient concentrations at sites with lower depths and unimpeded flows. Pollutant dilution, non-agricultural loadings, and in-channel flow structures (weirs) masked the effects of seasonality. The detection of cyclical trends using the Noether test was highest in the presence of vegetation mainly for total phosphorus and oxidized nitrogen (nitrite + nitrate) compared to dissolved phosphorus and reduced nitrogen (total Kjeldahl nitrogen-TKN). Prospective power analysis indicated that while increased monitoring can lead to higher statistical power, the effect size (i.e., the total number of trend sequences within a time-series) had a greater influence on the Noether test. Both Mann-Whitney and Noether tests provide complementary information on seasonal and cyclic behavior of pollutant concentrations and are affected by different processes. The results from these statistical tests when evaluated in the context of flow, vegetation, and in-channel hydraulic alterations can help guide future data collection and monitoring efforts. The study highlights the need for long-term monitoring of agricultural drainage ditches to properly discern seasonal and cyclical trends.
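A bootstrap sampling-based power analysis of the general kind described can be sketched by resampling each season's observations with replacement and counting how often the test rejects. The data and parameters below are invented for illustration, not the LRGV measurements:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def bootstrap_power(summer, winter, n_boot=500, alpha=0.05, seed=0):
    """Estimate the power of the Mann-Whitney U test by resampling each
    group's observations with replacement and recording rejections."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_boot):
        s = rng.choice(summer, size=len(summer), replace=True)
        w = rng.choice(winter, size=len(winter), replace=True)
        _, p = mannwhitneyu(s, w, alternative="two-sided")
        rejections += p < alpha
    return rejections / n_boot

# Hypothetical seasonal nutrient concentrations (e.g., mg/L).
rng = np.random.default_rng(42)
summer = rng.normal(2.0, 0.5, size=24)
winter = rng.normal(1.5, 0.5, size=24)
power = bootstrap_power(summer, winter)
```

Running this over candidate sample sizes is one way such an analysis can guide future monitoring effort, since it shows how the rejection rate changes as more observations are collected.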
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on mathematical statistics using an existing small sample of data, with the aim of transforming the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, versus 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (expanded uncertainty U, k = 2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
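A rough sketch of how the Nordtest combination rule and bootstrap resampling fit together is given below. The IQC values are invented, and the EQA-derived bias component is treated as a fixed percentage for brevity (in the guide it is itself estimated from EQA results); this is not the paper's MATLAB implementation:

```python
import numpy as np

def nordtest_expanded_mu(iqc_values, u_bias_pct, n_boot=2000, k=2, seed=0):
    """Nordtest-style expanded uncertainty U = k * sqrt(u_Rw^2 + u_bias^2),
    with the within-laboratory component u_Rw (CV%) estimated by bootstrap
    resampling of the IQC data."""
    rng = np.random.default_rng(seed)
    iqc = np.asarray(iqc_values, dtype=float)
    cvs = []
    for _ in range(n_boot):
        resample = rng.choice(iqc, size=len(iqc), replace=True)
        cvs.append(100.0 * resample.std(ddof=1) / resample.mean())
    u_rw = float(np.mean(cvs))  # bootstrap estimate of the within-lab CV%
    return k * float(np.sqrt(u_rw**2 + u_bias_pct**2))

# Hypothetical WBC IQC results (10^9/L) and an assumed bias component of 1.5%.
wbc_iqc = [7.1, 7.3, 6.9, 7.2, 7.0, 7.4, 7.1, 6.8, 7.2, 7.0]
U = nordtest_expanded_mu(wbc_iqc, u_bias_pct=1.5)
```

The resampling step is what lets a small IQC sample stand in for the unknown distribution of within-laboratory variation, which is the role the abstract assigns to the bootstrap.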
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
1995-01-01
Use of the bootstrap method in a canonical correlation analysis to evaluate the replicability of a study's results is illustrated. More confidence may be vested in research results that replicate. (SLD)
The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements
NASA Technical Reports Server (NTRS)
Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry;
2013-01-01
The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides, and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.
The economics of bootstrapping space industries - Development of an analytic computer model
NASA Technical Reports Server (NTRS)
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Towards a bootstrap approach to higher orders of epsilon expansion
NASA Astrophysics Data System (ADS)
Dey, Parijat; Kaviraj, Apratim
2018-02-01
We employ a hybrid approach in determining the anomalous dimension and OPE coefficient of higher spin operators in the Wilson-Fisher theory. First we do a large spin analysis for CFT data, where we use results obtained from the usual and the Mellin bootstrap and also from the Feynman diagram literature. This gives new predictions at O(ɛ^4) and O(ɛ^5) for anomalous dimensions and OPE coefficients, and also provides a cross-check for the results from the Mellin bootstrap. These higher orders get contributions from all higher spin operators in the crossed channel. We also use the bootstrap in Mellin space method for the ϕ^3 CFT in d = 6 − ɛ, where we calculate general higher spin OPE data. We demonstrate a higher loop order calculation in this approach by summing over contributions from higher spin operators of the crossed channel, in the same spirit as before.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.
Falk, Carl F; Biesanz, Jeremy C
2011-11-30
Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), the partial posterior predictive method (Biesanz, Falk, & Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; and (d) 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1,680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
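The percentile (PC) bootstrap for an indirect effect can be sketched with observed rather than latent variables. The study itself fits latent-variable models in OpenMx; the sketch below (with invented synthetic data) only shows the core resampling idea, estimating a*b by regression and taking empirical quantiles over case resamples:

```python
import numpy as np

def indirect_effect(x, m, y):
    """a*b for a simple mediation model with observed variables:
    a from regressing m on x; b from regressing y on m, controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

def pc_bootstrap_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI: resample cases with replacement and take
    empirical quantiles of the resampled indirect effects."""
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        boots.append(indirect_effect(x[idx], m[idx], y[idx]))
    q = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return float(q[0]), float(q[1])

# Synthetic data with a true indirect effect of 0.4 * 0.5 = 0.2.
rng = np.random.default_rng(7)
x = rng.normal(size=200)
m = 0.4 * x + rng.normal(scale=0.5, size=200)
y = 0.5 * m + rng.normal(scale=0.5, size=200)
lo, hi = pc_bootstrap_ci(x, m, y)
```

The indirect effect is judged nonzero when the interval excludes zero; unlike the BC and BCa variants compared in the abstract, the PC interval uses the raw quantiles without bias correction.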
Combining test statistics and models in bootstrapped model rejection: it is a balancing act
2014-01-01
Background: Model rejections lie at the heart of systems biology, since they provide conclusive statements: that the corresponding mechanistic assumptions do not serve as valid explanations for the experimental data. Rejections are usually done using e.g. the chi-square test (χ²) or the Durbin-Watson test (DW). Analytical formulas for the corresponding distributions rely on assumptions that typically are not fulfilled. This problem is partly alleviated by the usage of bootstrapping, a computationally heavy approach to calculate an empirical distribution. Bootstrapping also allows for a natural extension to estimation of joint distributions, but this feature has so far been little exploited. Results: We herein show that simplistic combinations of bootstrapped tests, like the max or min of the individual p-values, give inconsistent, i.e. overly conservative or liberal, results. A new two-dimensional (2D) approach based on parametric bootstrapping, on the other hand, is found both consistent and with a higher power than the individual tests, when tested on static and dynamic examples where the truth is known. In the same examples, the most superior test is a 2D χ² vs. χ², where the second χ²-value comes from an additional help model and its ability to describe bootstraps from the tested model. This superiority is lost if the help model is too simple, or too flexible. If a useful help model is found, the most powerful approach is the bootstrapped log-likelihood ratio (LHR). We show that this is because the LHR is one-dimensional, because the second dimension comes at a cost, and because the LHR has retained most of the crucial information in the 2D distribution. These approaches statistically resolve a previously published rejection example for the first time. Conclusions: We have shown how to, and how not to, combine tests in a bootstrap setting, when the combination is advantageous, and when it is advantageous to include a second model.
These results also provide a deeper insight into the original motivation for formulating the LHR, for the more general setting of nonlinear and non-nested models. These insights are valuable in cases when accuracy and power, rather than computational speed, are prioritized. PMID:24742065
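The parametric-bootstrap machinery underlying such tests can be sketched as follows: simulate data sets from the fitted model, recompute the test statistic on each, and compare the observed value against the resulting empirical distribution. A toy straight-line model stands in here for the paper's nonlinear dynamic models:

```python
import numpy as np

def chi2_stat(residuals, sigma):
    """Chi-square statistic for residuals with known measurement noise."""
    return float(np.sum((residuals / sigma) ** 2))

def parametric_bootstrap_pvalue(t, y, sigma, n_boot=1000, seed=0):
    """Empirical chi-square distribution via parametric bootstrap:
    fit a line, simulate data from the fit, refit each simulation,
    and compare the observed statistic to the bootstrap statistics."""
    rng = np.random.default_rng(seed)
    fit = np.polyval(np.polyfit(t, y, 1), t)
    observed = chi2_stat(y - fit, sigma)
    boot = []
    for _ in range(n_boot):
        y_sim = fit + rng.normal(scale=sigma, size=len(t))
        fit_sim = np.polyval(np.polyfit(t, y_sim, 1), t)
        boot.append(chi2_stat(y_sim - fit_sim, sigma))
    p = float(np.mean(np.array(boot) >= observed))
    return observed, p

t = np.linspace(0, 1, 20)
y_obs = 2.0 * t + 1.0 + np.random.default_rng(3).normal(scale=0.1, size=20)
observed, p = parametric_bootstrap_pvalue(t, y_obs, sigma=0.1)
```

Storing a second statistic per bootstrap sample (e.g. a DW value, or a χ² from a help model fit) in the same loop is what yields the joint 2D distributions the paper combines.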
Changing Your Habits: Steps to Better Health
Sheridan, Heather; Reingold, Eyal M
2013-01-01
In a wide range of problem-solving settings, the presence of a familiar solution can block the discovery of better solutions (i.e., the Einstellung effect). To investigate this effect, we monitored the eye movements of expert and novice chess players while they solved chess problems that contained a familiar move (i.e., the Einstellung move), as well as an optimal move that was located in a different region of the board. When the Einstellung move was an advantageous (but suboptimal) move, both the expert and novice chess players who chose the Einstellung move continued to look at this move throughout the trial, whereas the subset of expert players who chose the optimal move were able to gradually disengage their attention from the Einstellung move. However, when the Einstellung move was a blunder, all of the experts and the majority of the novices were able to avoid selecting the Einstellung move, and both the experts and novices gradually disengaged their attention from the Einstellung move. These findings shed light on the boundary conditions of the Einstellung effect, and provide convergent evidence for Bilalić, McLeod, & Gobet (2008)'s conclusion that the Einstellung effect operates by biasing attention towards problem features that are associated with the familiar solution rather than the optimal solution.
Perceptual Integration and Differentiation of Directions in Moving Patterns
1981-08-01
Report documentation page fragment (DD Form 1473 residue); the recoverable portion of the abstract notes that a summing process is discussed. Recoverable reference: Mather, G. and Moulden, B. A simultaneous shift in apparent direction: Further evidence for a "distribution-shift" model.
2014-09-01
As the rod moves about the illumination scene, the pixels in the detector start to flicker. The 'flickering' effect is due to the metal rod blocking THz radiation; this effect is more apparent in the video. It is still possible to mitigate convective heat exchange between the sensor and the ambient surroundings.
Contextual effects on preattentive processing of sound motion as revealed by spatial MMN.
Shestopalova, L B; Petropavlovskaia, E A; Vaitulevich, S Ph; Nikitin, N I
2015-04-01
The magnitude of spatial distance between sound stimuli is critically important for their preattentive discrimination, yet the effect of stimulus context on auditory motion processing is not clear. This study investigated the effects of acoustical change and stimulus context on preattentive spatial change detection. Auditory event-related potentials (ERPs) were recorded for stationary midline noises and two patterns of sound motion produced by linear or abrupt changes of interaural time differences. Each of the three types of stimuli was used as standard or deviant in different blocks. Context effects on mismatch negativity (MMN) elicited by stationary and moving sound stimuli were investigated by reversing the role of standard and deviant stimuli, while the acoustical stimulus parameters were kept the same. That is, MMN amplitudes were calculated by subtracting ERPs to identical stimuli presented as standard in one block and deviant in another block. In contrast, effects of acoustical change on MMN amplitudes were calculated by subtracting ERPs of standards and deviants presented within the same block. Preattentive discrimination of moving and stationary sounds indexed by MMN was strongly dependent on the stimulus context. Higher MMNs were produced in oddball configurations where deviance represented increments of the sound velocity, as compared to configurations with velocity decrements. The effect of standard-deviant reversal was more pronounced with the abrupt sound displacement than with gradual sound motion. Copyright © 2015 Elsevier B.V. All rights reserved.
Improved memory loading techniques for the TSRV display system
NASA Technical Reports Server (NTRS)
Easley, W. C.; Lynn, W. A.; Mcluer, D. G.
1986-01-01
A recent upgrade of the TSRV research flight system at NASA Langley Research Center retained the original monochrome display system. However, the display memory loading equipment was replaced requiring design and development of new methods of performing this task. This paper describes the new techniques developed to load memory in the display system. An outdated paper tape method for loading the BOOTSTRAP control program was replaced by EPROM storage of the characters contained on the tape. Rather than move a tape past an optical reader, a counter was implemented which steps sequentially through EPROM addresses and presents the same data to the loader circuitry. A cumbersome cassette tape method for loading the applications software was replaced with a floppy disk method using a microprocessor terminal installed as part of the upgrade. The cassette memory image was transferred to disk and a specific software loader was written for the terminal which duplicates the function of the cassette loader.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater-extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
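The block bootstrap time-series sampling mentioned among the methods can be sketched as a moving-block bootstrap, which resamples overlapping blocks of consecutive observations so that short-range dependence within blocks is preserved. The series and block length below are invented for illustration:

```python
import numpy as np

def moving_block_bootstrap(series, block_len, seed=0):
    """Resample a time series by concatenating randomly chosen overlapping
    blocks of consecutive observations, then trimming to the original length."""
    rng = np.random.default_rng(seed)
    series = np.asarray(series, dtype=float)
    n = len(series)
    # All overlapping blocks of the requested length.
    blocks = np.array([series[i:i + block_len]
                       for i in range(n - block_len + 1)])
    n_blocks = int(np.ceil(n / block_len))
    picks = rng.integers(0, len(blocks), size=n_blocks)
    return np.concatenate(blocks[picks])[:n]

# Example: one bootstrap replicate of a hypothetical recharge-like series.
recharge = np.sin(np.linspace(0, 6 * np.pi, 60)) + \
    np.random.default_rng(5).normal(scale=0.2, size=60)
replicate = moving_block_bootstrap(recharge, block_len=6)
```

Repeating the resampling many times and re-running the coupled model on each replicate is one way to propagate time-series uncertainty of this kind; the choice of block length governs how much serial dependence each replicate retains.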
Bootstrapping conformal field theories with the extremal functional method.
El-Showk, Sheer; Paulos, Miguel F
2013-12-13
The existence of a positive linear functional acting on the space of (differences between) conformal blocks has been shown to rule out regions in the parameter space of conformal field theories (CFTs). We argue that at the boundary of the allowed region the extremal functional contains, in principle, enough information to determine the dimensions and operator product expansion (OPE) coefficients of an infinite number of operators appearing in the correlator under analysis. Based on this idea we develop the extremal functional method (EFM), a numerical procedure for deriving the spectrum and OPE coefficients of CFTs lying on the boundary (of solution space). We test the EFM by using it to rederive the low-lying spectrum and OPE coefficients of the two-dimensional Ising model based solely on the dimension of a single scalar quasiprimary, with no Virasoro algebra required. Our work serves as a benchmark for applications to more interesting, less known CFTs in the near future.
Significance tests for functional data with complex dependence structure.
Staicu, Ana-Maria; Lahiri, Soumen N; Carroll, Raymond J
2015-01-01
We propose an L2-norm based global testing procedure for the null hypothesis that multiple group mean functions are equal, for functional data with complex dependence structure. Specifically, we consider the setting of functional data with a multilevel structure of the form groups-clusters or subjects-units, where the unit-level profiles are spatially correlated within the cluster, and the cluster-level data are independent. Orthogonal series expansions are used to approximate the group mean functions and the test statistic is estimated using the basis coefficients. The asymptotic null distribution of the test statistic is developed, under mild regularity conditions. To our knowledge this is the first work that studies hypothesis testing when data have such complex multilevel functional and spatial structure. Two small-sample alternatives, including a novel block bootstrap for functional data, are proposed, and their performance is examined in simulation studies. The paper concludes with an illustration of a motivating experiment.
Minimum area rig concept update: H and P 101 modifications and first infield move
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigurdson, S.R.
1987-03-01
The minimum area rig concept (MARC) is a cost-effective alternative to the typical self-contained platform rig (SCPR). Helmerich and Payne (HandP) built the first MARC rig, HandP 101, to drill and to work over wells up to 16,000 ft (4877 m) measured depth. This rig began operation in May 1983 in the Gulf of Mexico at Arco Oil and Gas Co.'s South Pass Block 61 field and has undergone one infield move. Since the rig's initial mobilization, several rig modifications have been added to increase storage area, to promote safety, to provide a more efficient drilling/workover rig, and to reduce overall move time. This paper describes the modifications and recaps the rig's first move, providing further insight into the MARC rig and showing the benefits of the MARC design in relation to a move.
Parameters affecting the frequency of a fluid oscillator
NASA Astrophysics Data System (ADS)
Cheng, R. M. H.; Kwok, C. K.; Lee, R. S.
1983-06-01
A new type of liquid-operated low-frequency oscillator is introduced. The oscillator consists of a cone-shaped housing with a fluid inlet and two outlet discharging tubes. The fluid discharge is controlled by a ball which blocks one of the outlet tubes. A strong vacuum develops due to the inertial effect of the column of liquid moving downward in the blocked tube. When the initial energy and velocity of the liquid slug are reduced to zero, it starts to return toward the ball. Eventually the combined force of the pressure inside the housing and the momentum of the upcoming slug is large enough to displace the ball to the other outlet tube, and the same procedure is then repeated. The main part of the paper consists of an analysis of the time required for the forward and reverse motion of the slug and for the ball to move from one discharge hole to the other.
Determining biological tissue optical properties via integrating sphere spatial measurements
Baba, Justin S [Knoxville, TN; Letzen, Brian S [Coral Springs, FL
2011-01-11
An optical sample is mounted on a spatial-acquisition apparatus that is placed in or on an enclosure. An incident beam is irradiated on a surface of the sample and the specular reflection is allowed to escape from the enclosure through an opening. The spatial-acquisition apparatus is provided with a light-occluding slider that moves in front of the sample to block portions of diffuse scattering from the sample. As the light-occluding slider moves across the front of the sample, diffuse light scattered into the area behind the slider is absorbed by its back surface. By measuring a baseline diffuse reflectance without the light-occluding slider and subtracting from it the diffuse reflectance measured with the slider in place, the diffuse reflectance for the area blocked by the slider can be calculated.
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
Bootstrapping Student Understanding of What Is Going on in Econometrics.
ERIC Educational Resources Information Center
Kennedy, Peter E.
2001-01-01
Explains that econometrics is an intellectual game played by rules based on the sampling distribution concept. Contains explanations for why many students are uncomfortable with econometrics. Encourages instructors to use explain-how-to-bootstrap exercises to promote student understanding. (RLH)
Sándor, Bulcsú; Járai-Szabó, Ferenc; Tél, Tamás; Néda, Zoltán
2013-04-01
The dynamics of a spring-block train placed on a moving conveyor belt is investigated both by simple experiments and computer simulations. The first block is connected by a spring to an external static point and, due to the dragging effect of the belt, the blocks undergo complex stick-slip dynamics. A qualitative agreement with the experimental results can be achieved only by taking into account the spatial inhomogeneity of the friction force on the belt's surface, modeled as noise. As a function of the velocity of the conveyor belt and the noise strength, the system exhibits complex, self-organized critical, sometimes chaotic, dynamics and phase transition-like behavior. Noise-induced chaos and intermittency is also observed. Simulations suggest that the maximum complexity of the dynamical states is achieved for a relatively small number of blocks (around five).
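The stick-slip mechanism can be illustrated with a minimal simulation of a single block (not the full train) dragged by a belt, with the friction inhomogeneity modeled as per-step noise on the friction coefficients. All parameter values here are assumptions for illustration, not from the study.

```python
import random

def simulate_stick_slip(v_belt=0.1, k=1.0, m=1.0, mu_s=0.4, mu_k=0.3,
                        noise=0.05, dt=1e-3, steps=50000, seed=0):
    """One spring-block on a moving belt; friction inhomogeneity as noise."""
    rng = random.Random(seed)
    g = 9.81
    x, v = 0.0, v_belt          # start stuck, moving with the belt
    stuck = True
    positions = []
    for _ in range(steps):
        spring = -k * x
        if stuck:
            # Static friction holds the block until the spring force exceeds it.
            f_max = (mu_s + noise * rng.uniform(-1, 1)) * m * g
            if abs(spring) > f_max:
                stuck = False
            else:
                v = v_belt
        if not stuck:
            # Kinetic friction acts toward the belt velocity.
            f_kin = (mu_k + noise * rng.uniform(-1, 1)) * m * g
            a = (spring + f_kin * (1 if v_belt > v else -1)) / m
            v += a * dt
            if abs(v - v_belt) < 1e-4 and abs(spring) < mu_s * m * g:
                stuck = True   # re-stick when moving with the belt again
                v = v_belt
        x += v * dt
        positions.append(x)
    return positions

traj = simulate_stick_slip()
```

During the stick phase the block is carried forward at the belt speed until the spring force beats the (noisy) static friction threshold, producing the characteristic sawtooth displacement.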
Distribution majorization of corner points by reinforcement learning for moving object detection
NASA Astrophysics Data System (ADS)
Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang
2018-04-01
Corner points play an important role in moving object detection, especially in the case of a freely moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, the information provided by preceding and subsequent frames can also be used. We utilize this information to focus on the more informative areas and ignore uninformative ones. The proposed algorithm is based on reinforcement learning and regards the detection of corner points as a Markov process. In the Markov model, the video to be analyzed is the environment, the selections of blocks for a corner point are the actions, and the detection performance is the state. Corner points are assigned to blocks separated from the original whole image. Experimentally, we select a conventional method that uses matching and the Random Sample Consensus algorithm to obtain objects as the main framework, and apply our algorithm to improve the result. The comparison between the conventional method and the same method augmented with our algorithm shows that our algorithm reduces false detections by 70%.
A complete passive blind image copy-move forensics scheme based on compound statistics features.
Peng, Fei; Nie, Yun-ying; Long, Min
2011-10-10
Since most sensor-pattern-noise-based image copy-move forensics methods require a known reference sensor pattern noise, they generally result in non-blind passive forensics, which significantly confines the application circumstances. In view of this, a novel passive-blind image copy-move forensics scheme is proposed in this paper. First, a color image is converted to grayscale, and a wavelet-transform-based de-noising filter is used to extract the sensor pattern noise. The variance of the pattern noise, the signal-to-noise ratio between the de-noised image and the pattern noise, the information entropy, and the average energy gradient of the original grayscale image are then chosen as features, and non-overlapping sliding-window operations divide the images into sub-blocks. Finally, the tampered areas are detected by analyzing the correlation of the features between the sub-blocks and the whole image. Experimental results and analysis show that the proposed scheme is completely passive-blind, has a good detection rate, and is robust against JPEG compression, noise, rotation, scaling and blurring. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
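A rough sketch of a per-block feature-extraction stage of this kind (variance, entropy, and an average-energy-gradient measure over non-overlapping windows) might look like the following. The block size, bin counts, and function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def block_features(gray, block=64):
    """Per-block variance, entropy, and average energy gradient
    of a 2-D grayscale image, over non-overlapping windows."""
    h, w = gray.shape
    feats = {}
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            sub = gray[i:i + block, j:j + block].astype(float)
            var = sub.var()
            # Shannon entropy of the 256-bin intensity histogram.
            hist, _ = np.histogram(sub, bins=256, range=(0, 256))
            p = hist / hist.sum()
            ent = -(p[p > 0] * np.log2(p[p > 0])).sum()
            # Mean gradient magnitude as an "average energy gradient".
            gy, gx = np.gradient(sub)
            aeg = np.sqrt(gx ** 2 + gy ** 2).mean()
            feats[(i, j)] = (var, ent, aeg)
    return feats

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128))   # stand-in for a real image
features = block_features(img)
```

Tamper detection would then compare each block's feature vector against the whole-image statistics, flagging blocks whose correlation is anomalous.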
Anticipatory adjustments to abrupt changes of opposing forces.
Rapp, Katrin; Heuer, Herbert
2015-01-01
Anticipatory adjustments to abrupt load changes are based on task-specific predictive information. The authors asked whether anticipatory adjustments to abrupt offsets of horizontal forces are related to expectancy. In two experiments participants held a position against an opposing force or moved against it. At force offset they had to stop rapidly. Duration of the opposing force or distance moved against it varied between blocks of trials and was constant within each block, or it varied from trial to trial. These two variations resulted in opposite changes of the expectancy of force offset with the passage of time or distance. With constant force durations or distances in each block of trials, anticipatory adjustments tended to be poorest with the longest duration or distance, but with variable force durations or distances they tended to be best with the longest duration or distance. Thus anticipatory adjustments were related to expectancy rather than time or distance per se. Anticipatory adjustments resulted in shorter peak amplitudes of the involuntary movements, accompanied by longer movement times in Experiment 1 and faster movement times in Experiment 2. Thus, for different states of the limb at abrupt dynamic changes anticipatory adjustments involve different mechanisms that modulate different mechanical characteristics.
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
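The RV bootstrap described here can be sketched as resampling patients with replacement and recomputing the ratio of one-way ANOVA F-statistics each time. The toy data below and the 95% percentile interval are illustrative assumptions, not the study's data.

```python
import numpy as np

def anova_f(values, groups):
    """One-way ANOVA F-statistic across clinically defined groups."""
    values, groups = np.asarray(values, float), np.asarray(groups)
    grand = values.mean()
    labels = np.unique(groups)
    ss_between = sum(len(values[groups == g]) * (values[groups == g].mean() - grand) ** 2
                     for g in labels)
    ss_within = sum(((values[groups == g] - values[groups == g].mean()) ** 2).sum()
                    for g in labels)
    df_b, df_w = len(labels) - 1, len(values) - len(labels)
    return (ss_between / df_b) / (ss_within / df_w)

def rv_bootstrap_ci(comparator, reference, groups, n_boot=1000, seed=0):
    """95% percentile bootstrap CI for RV = F(comparator) / F(reference)."""
    rng = np.random.default_rng(seed)
    n = len(groups)
    rvs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)            # resample patients with replacement
        rvs.append(anova_f(comparator[idx], groups[idx]) /
                   anova_f(reference[idx], groups[idx]))
    return np.percentile(rvs, [2.5, 97.5])

# Toy data: three severity groups, a reference measure, and a correlated comparator.
rng = np.random.default_rng(1)
groups = np.repeat([0, 1, 2], 50)
reference = groups * 1.0 + rng.normal(0, 1, 150)
comparator = 0.7 * reference + rng.normal(0, 0.7, 150)
lo, hi = rv_bootstrap_ci(comparator, reference, groups)
```

An RV is declared significantly below 1 when the upper percentile limit falls below 1, mirroring the paper's use of the 95% bootstrap interval.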
Lee, Ho; Fahimian, Benjamin P; Xing, Lei
2017-03-21
This paper proposes a binary moving-blocker (BMB)-based technique for scatter correction in cone-beam computed tomography (CBCT). In concept, a beam blocker consisting of lead strips, mounted in front of the x-ray tube, moves rapidly in and out of the beam during a single gantry rotation. The projections are acquired in alternating phases of blocked and unblocked cone beams, where the blocked phase results in a stripe pattern in the width direction. To derive the scatter map from the blocked projections, 1D B-Spline interpolation/extrapolation is applied by using the detected information in the shaded regions. The scatter map of the unblocked projections is corrected by averaging two scatter maps that correspond to their adjacent blocked projections. The scatter-corrected projections are obtained by subtracting the corresponding scatter maps from the projection data and are utilized to generate the CBCT image by a compressed-sensing (CS)-based iterative reconstruction algorithm. Catphan504 and pelvis phantoms were used to evaluate the method's performance. The proposed BMB-based technique provided an effective method to enhance the image quality by suppressing scatter-induced artifacts, such as ring artifacts around the bowtie area. Compared to CBCT without a blocker, the spatial nonuniformity was reduced from 9.1% to 3.1%. The root-mean-square error of the CT numbers in the regions of interest (ROIs) was reduced from 30.2 HU to 3.8 HU. In addition to high resolution, comparable to that of the benchmark image, the CS-based reconstruction also led to a better contrast-to-noise ratio in seven ROIs. The proposed technique enables complete scatter-corrected CBCT imaging with width-truncated projections and allows reducing the acquisition time to approximately half. This work may have significant implications for image-guided or adaptive radiation therapy, where CBCT is often used.
COS NUV Target Acquisition Monitor
NASA Astrophysics Data System (ADS)
Penton, Steven V.
2017-08-01
Visits PA, BA, & BB of this program verify all ACQ/IMAGE mode co-alignments by bootstrapping from PSA+MIRRORA. The assumption, which should be tested at some point, is that the PSA+MIRRORA WCA-to-PSA FSW offsets are still as accurate in defining the center of the PSA relative to the WCA as they were in SMOV. The details of the observations are given in the observing section. Visit PB was an on-hold contingency visit in case, for whatever reason, visit 2A of 14452 did not execute as planned in the fall of 2017. That program was replaced with a better program for aligning the FGSs, so we needed to activate this visit to obtain the PSA/MIRRORA to PSA/MIRRORB ACQ/IMAGE alignment. Visit BA of this program takes back-to-back PSA/MIRRORB & BOA/MIRRORA ACQ/IMAGEs and images (with flashes) and also takes G230L and G285M as well as FUV LP3 G130M and G140L spectra to test the WCA-to-PSA offsets. Visit BB of this program takes back-to-back BOA/MIRRORA & BOA/MIRRORB ACQ/IMAGEs and images (with flashes) and also takes G225M, G185M, and FUV LP3 G160M spectra to test the WCA-to-PSA offsets. Visit BA bootstraps off Visit PB to co-align the PSA+MIRRORB ACQ/IMAGE mode to the BOA+MIRRORA. Visit BB follows the style of Visit BA and bootstraps from the BOA+MIRRORA mode to the BOA+MIRRORB TA imaging mode. In all visits, lamp+target images are taken before and after the TA imaging mode that is being co-aligned (the second ACQ/IMAGE of the program). All visits in this program are single-orbit visits. This program is very similar to the NUV portion of the C24 version (14857). It differs from the Cycle 23 version in that Visit PB (the old Visit 03) has been permanently upgraded from contingency to operational status. NOTE: Beginning with Cycle 25, ALL FUV exposures in this program have been moved to a separate monitoring program. This program will sequentially test the XD accuracy of FUV LP4 spectra.
As needed, NUV ACQ/IMAGEs will reset the centering between grating tests.
Moving Beam-Blocker-Based Low-Dose Cone-Beam CT
NASA Astrophysics Data System (ADS)
Lee, Taewon; Lee, Changwoo; Baek, Jongduk; Cho, Seungryong
2016-10-01
This paper experimentally demonstrates the feasibility of moving beam-blocker-based low-dose cone-beam CT (CBCT) and explores beam-blocking configurations to find the one that yields the highest contrast-to-noise ratio (CNR). Sparse-view CT takes projections at sparse view angles and provides a viable option for reducing dose. We have earlier proposed a many-view under-sampling (MVUS) technique as an alternative to sparse-view CT. Instead of switching the x-ray tube power, one can place a reciprocating multi-slit beam-blocker between the x-ray tube and the patient to partially block the x-ray beam. We used a bench-top circular cone-beam CT system with a lab-made moving beam-blocker. For image reconstruction, we used a modified total-variation (TV) minimization algorithm that masks the blocked data in the back-projection step, leaving only the data measured through the slits to be used in the computation. The number of slits and the reciprocation frequency were varied, and their effects on image quality were investigated. For image quality assessment, we used CNR and detectability. We also analyzed the sampling efficiency in the context of compressive sensing: the sampling density and data incoherence in each case. We tested three sets of slits, with 6, 12 and 18 slits, each at reciprocation frequencies of 10, 30, 50 and 70 Hz/rot. The optimum condition among the tested sets was found to be 12 slits at 30 Hz/rot.
Four Bootstrap Confidence Intervals for the Binomial-Error Model.
ERIC Educational Resources Information Center
Lin, Miao-Hsiang; Hsiung, Chao A.
1992-01-01
Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained and the theoretical foundation of each method and its relevance and ranges of modeling the true score uncertainty are discussed. (SLD)
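One of the simplest variants, a percentile interval from a parametric bootstrap under the binomial-error model, can be sketched as follows. The item count and observed score are invented for illustration, and this is only one of the four methods the abstract alludes to.

```python
import random

def binomial_bootstrap_ci(x, n, n_boot=5000, alpha=0.05, seed=0):
    """Percentile parametric-bootstrap CI for the true proportion in the
    binomial-error model: replicate scores are drawn from Binomial(n, x/n)."""
    rng = random.Random(seed)
    p_hat = x / n
    reps = sorted(sum(rng.random() < p_hat for _ in range(n)) / n
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# A test taker answers 32 of 40 items correctly.
lo, hi = binomial_bootstrap_ci(32, 40)
```

The interval quantifies true-score uncertainty for a single examinee; the competing methods differ mainly in how the replicate scores are generated and how the interval endpoints are chosen.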
Nonparametric Regression and the Parametric Bootstrap for Local Dependence Assessment.
ERIC Educational Resources Information Center
Habing, Brian
2001-01-01
Discusses ideas underlying nonparametric regression and the parametric bootstrap with an overview of their application to item response theory and the assessment of local dependence. Illustrates the use of the method in assessing local dependence that varies with examinee trait levels. (SLD)
Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications
NASA Technical Reports Server (NTRS)
Hughes, William O.; Paez, Thomas L.
2006-01-01
This paper discusses the Bootstrap Method for specification of vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available, and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
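A hedged sketch of the idea: derive a P95/50-style level (an estimate of the 95th percentile at 50% confidence) from bootstrap replicates of the measured levels. The dB values and the exact P95/50 convention applied here are assumptions for illustration, not the paper's data or procedure.

```python
import numpy as np

def bootstrap_level(levels_db, prob=0.95, conf=0.50, n_boot=2000, seed=0):
    """Pxx/yy test level: the yy%-confidence point of the bootstrap
    distribution of the xx-th percentile of the measured levels."""
    rng = np.random.default_rng(seed)
    levels_db = np.asarray(levels_db, float)
    n = len(levels_db)
    reps = [np.percentile(rng.choice(levels_db, n, replace=True), 100 * prob)
            for _ in range(n_boot)]
    return float(np.percentile(reps, 100 * conf))

# Hypothetical flight-measured acoustic levels (dB) in one 1/3-octave band.
flight_db = [128.2, 129.5, 127.8, 131.0, 130.1, 128.9, 129.7, 130.4]
p95_50 = bootstrap_level(flight_db)
```

Because the procedure is distribution-free, no normality assumption is made about the flight data, which is the contrast with the Normal Tolerance Limit approach the paper draws.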
The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.
Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi
2017-03-01
The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of the interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity refers to the probability (Ps) of obtaining a subtree of the inferred tree. We then show that for the tree to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and the computer program RESTA for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
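A bootstrap version of Pearson's test can be sketched by resampling tables from the independence model implied by the observed margins and comparing the resampled statistics to the observed one. The small-cell table below is invented for illustration; the authors' exact resampling scheme may differ.

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table."""
    table = np.asarray(table, float)
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    safe = np.where(expected > 0, expected, 1.0)   # guard empty margins
    return ((table - expected) ** 2 / safe).sum()

def bootstrap_chi2_test(table, n_boot=2000, seed=0):
    """Bootstrap p-value: resample tables under independence
    (cell probabilities = product of the observed margins)."""
    rng = np.random.default_rng(seed)
    table = np.asarray(table, float)
    n = int(table.sum())
    p = np.outer(table.sum(1), table.sum(0)).ravel() / n ** 2
    observed = chi2_stat(table)
    count = sum(chi2_stat(rng.multinomial(n, p).reshape(table.shape)) >= observed
                for _ in range(n_boot))
    return (count + 1) / (n_boot + 1)

# A 2x3 table with small cells, where the asymptotic test is questionable.
pval = bootstrap_chi2_test([[3, 1, 4], [2, 6, 1]])
```

Because the null distribution is simulated rather than taken from the asymptotic chi-square, the test keeps close to its nominal level even with small cell frequencies.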
User’s Manual for the Ride Motion Simulator
1989-08-01
…1800 psi). Step 16. Pressurize the system by moving the main pressure switch to "ON." Wait for the roll, pitch, and yaw error signals to go to "Zero…" … Carefully help the test subject dismount. Step 41. Flip the main pressure switch on the hydraulic control panel to "OFF." This will block hydraulic… …1.13, thus lowering the seat. Release the "Low Limit Override" switch. Step 5. Dismount the test subject. Step 6. Move the main pressure switch to the…
CORE SATURATION BLOCKING OSCILLATOR
Spinrad, R.J.
1961-10-17
A blocking oscillator which relies on core saturation regulation to control the output pulse width is described. In this arrangement an external magnetic loop is provided in which a saturable portion forms the core of a feedback transformer used with the thermionic or semi-conductor active element. A first stationary magnetic loop establishes a level of flux through the saturation portion of the loop. A second adjustable magnet moves the flux level to select a saturation point giving the desired output pulse width. (AEC)
On Directional Selectivity in Vertebrate Retina: An Experimental and Computational Study
1992-01-01
Borg-Graham, MIT Artificial Intelligence Laboratory. … preparation and b) a whole-cell patch … currents and b) by removing ATP from the electrodes which, in turn, blocks the inhibitory input over time. This finding implies that the necessary and
Portable flooring protects finished surfaces, is easily moved
NASA Technical Reports Server (NTRS)
Carmody, R. J.
1964-01-01
To protect curved, finished surface and provide support for workmen, portable flooring has been made from rigid plastic foam blocks, faced with aluminum strips. Held together by nylon webbing, the flooring can be rolled up for easy carrying.
A Bullet-Block Experiment that Explains the Chain Fountain
NASA Astrophysics Data System (ADS)
Pantaleone, J.; Smith, R.
2018-05-01
It is common in science for two phenomena to appear to be very different, but in fact follow from the same basic principles. Here we consider such a case, the connection between the chain fountain and a bullet-block collision experiment. When an upward moving bullet strikes a wooden block resting on a horizontal table, the block will rise to a higher height when the bullet strikes near the end of the block. This is because the quickly rotating block experiences an additional upward "reaction" force from its contact with the table. Such a reaction force also explains the chain fountain. When a chain falls from a pile in a container to the floor below, the chain rises up above the container. This rise occurs because the quickly rotating links in the container push off of the surface beneath them. We derive a model that accurately describes our measurements in the bullet-block experiment, and then use this same model to calculate an approximate expression for the distance the chain rises above the container. More extensive discussions of the chain fountain are available elsewhere.
Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows
NASA Astrophysics Data System (ADS)
Srivastav, R. K.; Srinivasan, K.; Sudheer, K.
2009-05-01
Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR and PARMA), along with disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; ii) nonparametric models (examples are bootstrap/kernel-based methods such as k-nearest neighbor (k-NN) and the matched block bootstrap (MABB)), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws, together with nonparametric disaggregation models; and iii) hybrid models, which blend parametric and nonparametric models advantageously to model streamflows effectively. Despite these developments in the stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and critical drought characteristics has posed a persistent challenge to the stochastic modeler. This is partly because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation), and the efficacy of the models is subsequently validated based on the accuracy of prediction of the water-use characteristics, which requires a large number of trial simulations and inspection of many plots and tables. Even then, accurate prediction of the storage and critical drought characteristics may not be ensured.
In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (a blend of a simple parametric PAR(1) model and the matched block bootstrap (MABB)) based on the explicit objectives of minimizing the relative bias and relative root mean square error in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained by searching over a multi-dimensional parameter space (involving simultaneous exploration of the parametric (PAR(1)) and nonparametric (MABB) components). This is achieved using an efficient evolutionary search based optimization tool, the non-dominated sorting genetic algorithm II (NSGA-II). This approach reduces the drudgery involved in manual selection of the hybrid model, in addition to accurately predicting the basic summary statistics, dependence structure, marginal distribution and water-use characteristics. The proposed optimization framework is used to model the multi-season streamflows of the River Beaver and River Weber in the USA. For both rivers, the proposed GA-based hybrid model (where both parametric and nonparametric components are explored simultaneously) yields a much better prediction of the storage capacity than the MLE-based hybrid models (where the hybrid model selection is done in two stages, probably resulting in a sub-optimal model). This framework can be further extended to include different linear/non-linear hybrid stochastic models at other temporal and spatial scales.
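The MABB component rests on the block bootstrap, which for a single synthetic replicate can be sketched as follows (this is the plain moving block bootstrap; the matched variant additionally matches block endpoints). The block length and toy seasonal series are assumptions for illustration.

```python
import numpy as np

def moving_block_bootstrap(series, block_len, seed=0):
    """One synthetic series built by concatenating randomly chosen
    overlapping blocks of the original, preserving short-range dependence."""
    rng = np.random.default_rng(seed)
    series = np.asarray(series, float)
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, n_blocks)
    pieces = [series[s:s + block_len] for s in starts]
    return np.concatenate(pieces)[:n]

# Toy monthly streamflow: seasonal cycle plus noise, 20 years of data.
rng = np.random.default_rng(2)
months = np.arange(240)
flow = 100 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 10, 240)
synthetic = moving_block_bootstrap(flow, block_len=12)
```

A block length of one seasonal cycle keeps within-year dependence intact while shuffling years, which is why block-based methods preserve the periodic correlation structure better than i.i.d. resampling.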
Confidence Interval Coverage for Cohen's Effect Size Statistic
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2006-01-01
Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and biased-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…
A Bootstrap Procedure of Propensity Score Estimation
ERIC Educational Resources Information Center
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
2014-01-01
Background Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling
2014-06-03
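The Bayesian reading of the bootstrap invoked here replaces resampling with Dirichlet(1,…,1) weights over patients (Rubin's Bayesian bootstrap). The sketch below draws posterior means of cost and effectiveness for a single trial arm with invented data; the paper's extension, reweighting these draws by external evidence, is omitted.

```python
import numpy as np

def bayesian_bootstrap_means(costs, effects, n_draws=2000, seed=0):
    """Posterior draws of an arm's mean cost and mean effectiveness using
    Bayesian-bootstrap (Dirichlet) weights instead of resampling."""
    rng = np.random.default_rng(seed)
    n = len(costs)
    w = rng.dirichlet(np.ones(n), size=n_draws)   # one weight vector per draw
    mean_cost = w @ np.asarray(costs, float)
    mean_eff = w @ np.asarray(effects, float)
    return mean_cost, mean_eff

# Hypothetical per-patient costs and QALYs for one trial arm.
rng = np.random.default_rng(3)
costs = rng.gamma(2.0, 500.0, 80)       # right-skewed costs, mean ~1000
effects = rng.normal(0.7, 0.1, 80)
mc, me = bayesian_bootstrap_means(costs, effects)
```

Repeating this for both arms and differencing the draws yields a posterior for incremental cost and effectiveness without any parametric model of their distributions, which is the appeal the abstract emphasizes.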
Access block in NSW hospitals, 1999-2001: does the definition matter?
Forero, Roberto; Mohsin, Mohammed; Bauman, Adrian E; Ieraci, Sue; Young, Lis; Phung, Hai N; Hillman, Kenneth M; McCarthy, Sally M; Hugelmeyer, C David
2004-01-19
To estimate the magnitude of access block and its trend over time in New South Wales hospitals, using different definitions of access block, and to explore its association with clinical and non-clinical factors. An epidemiological study using the Emergency Department Information System datasets (1 January 1999 to 31 December 2001) from a sample of 55 NSW hospitals. Prevalence of access block measured by four different definitions; strength of association between access block, type of hospital, year of presentation, mode and time of arrival, triage category (an indicator of urgency), age and sex. Rates of access block (for all four definitions) increased between 1999 and 2001 by 1%-2% per year. There were increases across all regions of NSW, but urban regions in particular. Patients presenting to Principal Referral hospitals and those who arrived at night were more likely to experience access block. After adjusting for triage category and year of presentation, the mode of arrival, time of arrival, type of hospital, age and sex were significantly associated with access block. Access block continues to increase across NSW, whatever the definition used. We recommend that hospitals in NSW and Australia move to the use of one standard definition of access block, as our study suggests there is no significant additional information emerging from the use of multiple definitions.
High performance advanced tokamak regimes in DIII-D for next-step experiments
NASA Astrophysics Data System (ADS)
Greenfield, C. M.; Murakami, M.; Ferron, J. R.; Wade, M. R.; Luce, T. C.; Petty, C. C.; Menard, J. E.; Petrie, T. W.; Allen, S. L.; Burrell, K. H.; Casper, T. A.; DeBoo, J. C.; Doyle, E. J.; Garofalo, A. M.; Gorelov, I. A.; Groebner, R. J.; Hobirk, J.; Hyatt, A. W.; Jayakumar, R. J.; Kessel, C. E.; La Haye, R. J.; Jackson, G. L.; Lohr, J.; Makowski, M. A.; Pinsker, R. I.; Politzer, P. A.; Prater, R.; Strait, E. J.; Taylor, T. S.; West, W. P.; DIII-D Team
2004-05-01
Advanced Tokamak (AT) research in DIII-D [K. H. Burrell for the DIII-D Team, in Proceedings of the 19th Fusion Energy Conference, Lyon, France, 2002 (International Atomic Energy Agency, Vienna, 2002) published on CD-ROM] seeks to provide a scientific basis for steady-state high performance operation in future devices. These regimes require high toroidal beta to maximize fusion output and high poloidal beta to maximize the self-driven bootstrap current. Achieving these conditions requires integrated, simultaneous control of the current and pressure profiles, and active magnetohydrodynamic stability control. The building blocks for AT operation are in hand. Resistive wall mode stabilization via plasma rotation and active feedback with nonaxisymmetric coils allows routine operation above the no-wall beta limit. Neoclassical tearing modes are stabilized by active feedback control of localized electron cyclotron current drive (ECCD). Plasma shaping and profile control provide further improvements. Under these conditions, bootstrap supplies most of the current. Steady-state operation requires replacing the remaining Ohmic current, mostly located near the half radius, with noninductive external sources. In DIII-D this current is provided by ECCD, and nearly stationary AT discharges have been sustained with little remaining Ohmic current. Fast wave current drive is being developed to control the central magnetic shear. Density control, with divertor cryopumps, of AT discharges with edge-localized-moding (ELMing) H-mode edges facilitates high current drive efficiency at reactor-relevant collisionalities. A sophisticated plasma control system allows integrated control of these elements. Close coupling between modeling and experiment is key to understanding the separate elements, their complex nonlinear interactions, and their integration into self-consistent high performance scenarios. Progress on this development, and its implications for next-step devices, will be illustrated by results of recent experiment and simulation efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven
Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
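A percentile bootstrap interval of the kind compared in this abstract can be sketched as follows; a plain Pearson correlation stands in for a rotated factor loading purely for illustration, and all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy bivariate sample standing in for two factor indicators
n = 200
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(scale=0.8, size=n)
data = np.column_stack([x, y])

def percentile_ci(data, stat, n_boot=2000, alpha=0.05, rng=rng):
    """Percentile bootstrap CI: resample rows with replacement,
    recompute the statistic, take empirical quantiles."""
    n = len(data)
    boots = np.array([stat(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

corr = lambda d: np.corrcoef(d[:, 0], d[:, 1])[0, 1]
ci_lo, ci_hi = percentile_ci(data, corr)
```

Bias-corrected and accelerated variants, as studied in the article, adjust the two quantile levels rather than the resampling itself.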
Bootstrap Estimation and Testing for Variance Equality.
ERIC Educational Resources Information Center
Olejnik, Stephen; Algina, James
The purpose of this study was to develop a single procedure for comparing population variances that could be used across distribution forms. Bootstrap methodology was used to estimate the variability of the sample variance statistic when the population distribution was normal, platykurtic, and leptokurtic. The data for the study were generated and…
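The abstract does not spell out the resampling scheme; one common construction for a bootstrap test of variance equality (shown here as an illustration, not necessarily the authors' procedure) pools the mean-centered samples so that the null hypothesis holds in the resampling world:

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_var_test(a, b, n_boot=4000, rng=rng):
    """Two-sided bootstrap p-value for H0: equal population variances.
    Mean-centered observations are pooled and resampled, so equal
    variances hold by construction in the resampling world."""
    obs = np.log(a.var(ddof=1) / b.var(ddof=1))
    pooled = np.concatenate([a - a.mean(), b - b.mean()])
    na, nb = len(a), len(b)
    stats = np.empty(n_boot)
    for i in range(n_boot):
        ra = pooled[rng.integers(0, len(pooled), na)]
        rb = pooled[rng.integers(0, len(pooled), nb)]
        stats[i] = np.log(ra.var(ddof=1) / rb.var(ddof=1))
    return float(np.mean(np.abs(stats) >= abs(obs)))

# Simulated samples with genuinely unequal spread
a = rng.normal(scale=1.0, size=60)
b = rng.normal(scale=2.0, size=60)
p = bootstrap_var_test(a, b)
```

Because the null distribution is built from the data themselves, the test needs no normality assumption, which is the motivation the abstract gives for bootstrapping.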
Bootstrapping the Syntactic Bootstrapper: Probabilistic Labeling of Prosodic Phrases
ERIC Educational Resources Information Center
Gutman, Ariel; Dautriche, Isabelle; Crabbé, Benoît; Christophe, Anne
2015-01-01
The "syntactic bootstrapping" hypothesis proposes that syntactic structure provides children with cues for learning the meaning of novel words. In this article, we address the question of how children might start acquiring some aspects of syntax before they possess a sizeable lexicon. The study presents two models of early syntax…
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
MS Wisoff in the Mir space station Base Block
1997-02-20
STS081-347-031 (12-22 Jan. 1997) --- Astronaut Peter J. K. (Jeff) Wisoff, is pictured with a small sampling of supplies moved from the Spacehab Double Module (DM) aboard the Space Shuttle Atlantis to Russia's Mir Space Station.
Cold cratonic roots and thermal blankets: How continents affect mantle convection
Trubitsyn, V.P.; Mooney, W.D.; Abbott, D.H.
2003-01-01
Two-dimensional convection models with moving continents show that continents profoundly affect the pattern of mantle convection. If the continents are wider than the wavelength of the convection cells (~3000 km, the thickness of the mantle), they cause neighboring deep mantle thermal upwellings to coalesce into a single focused upwelling. This focused upwelling zone will have a potential temperature anomaly of about 200 °C, much higher than the 100 °C temperature anomaly of upwelling zones generated beneath typical oceanic lithosphere. Extensive high-temperature melts (including flood basalts and late potassic granites) will be produced, and the excess temperature anomaly will induce continental uplift (as revealed in sea level changes) and the eventual breakup of the supercontinent. The mantle thermal anomaly will persist for several hundred million years after such a breakup. In contrast, small continental blocks (<1000 km diameter) do not induce focused mantle upwelling zones. Instead, small continental blocks are dragged to mantle downwelling zones, where they spend most of their time, and will migrate laterally with the downwelling. As a result of sitting over relatively cold mantle (downwellings), small continental blocks are favored to keep their cratonic roots. This may explain the long-term survival of small cratonic blocks (e.g., the Yilgarn and Pilbara cratons of western Australia, and the West African craton). The optimum size for long-term stability of a continental block is <3000 km. These results show that continents profoundly affect the pattern of mantle convection. These effects are illustrated in terms of the timing and history of supercontinent breakup, the production of high-temperature melts, and sea level changes. Such two-dimensional calculations can be further refined and tested by three-dimensional numerical simulations of mantle convection with moving continental and oceanic plates.
O'Scanaill, P; Keane, S; Wall, V; Flood, G; Buggy, D J
2018-04-01
Pectoral plane blocks (PECs) are increasingly used in analgesia for patients undergoing breast surgery, and were recently found to be at least equivalent to single-shot paravertebral anaesthesia. However, there are no data comparing PECs with the popular practice of continuous local anaesthetic wound infusion (LA infusion) analgesia for breast surgery. Therefore, we compared the efficacy and safety of PECs blocks with LA infusion, or a combination of both, in patients undergoing non-ambulatory breast-cancer surgery. This single-centre, prospective, randomised, double-blind trial allocated 45 women to receive PECs blocks (levobupivacaine 0.25%, 10 ml for PECs I and levobupivacaine 0.25%, 20 ml for PECs II; PECs group), an LA infusion catheter (levobupivacaine 0.1% at 10 ml h⁻¹ for 24 h; LA infusion group), or both (PECs and LA infusion group). The primary outcome measure was area under the curve of the pain verbal rating score whilst moving vs time (AUC) over 24 h. Secondary outcomes included total opioid consumption at 24 h. AUC moving was mean (SD) 71 (34) mm h⁻¹ vs 58 (41) vs 23 (20) in PECs, LA infusion, and both, respectively; P=0.002. AUC at rest was also significantly lower in patients receiving both. The total 24 h opioid consumption [median (25-75%)] was 14 mg (9-26) vs 11 (8-24) vs 9 (5-11); P=0.4. No adverse events were observed. The combination of both pre-incisional PECs blocks and postoperative LA infusion provides better analgesia over 24 h than either technique alone after non-ambulatory breast-cancer surgery. NCT 03024697. Copyright © 2018 British Journal of Anaesthesia. Published by Elsevier Ltd. All rights reserved.
Carving out the end of the world or (superconformal bootstrap in six dimensions)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Chi-Ming; Lin, Ying-Hsuan
2017-08-29
We bootstrap N=(1,0) superconformal field theories in six dimensions by analyzing the four-point function of flavor current multiplets. Assuming an E_8 flavor group, we present universal bounds on the central charge C_T and the flavor central charge C_J. Based on the numerical data, we conjecture that the rank-one E-string theory saturates the universal lower bound on C_J, and we numerically determine the spectrum of long multiplets in the rank-one E-string theory. We comment on the possibility of solving the higher-rank E-string theories by bootstrap and thereby probing M-theory on AdS_7 × S^4/Z_2.
Bootstrapping N=2 chiral correlators
NASA Astrophysics Data System (ADS)
Lemos, Madalena; Liendo, Pedro
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
High-resolution echocardiography
NASA Technical Reports Server (NTRS)
Nathan, R.
1979-01-01
High resolution computer aided ultrasound system provides two- and three-dimensional images of beating heart from many angles. System provides means for determining whether small blood vessels around the heart are blocked or if heart wall is moving normally without interference of dead and noncontracting muscle tissue.
Exploring the Replicability of a Study's Results: Bootstrap Statistics for the Multivariate Case.
ERIC Educational Resources Information Center
Thompson, Bruce
Conventional statistical significance tests do not inform the researcher regarding the likelihood that results will replicate. One strategy for evaluating result replication is to use a "bootstrap" resampling of a study's data so that the stability of results across numerous configurations of the subjects can be explored. This paper…
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
ERIC Educational Resources Information Center
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning
ERIC Educational Resources Information Center
Luntley, Michael
2017-01-01
This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…
Higher curvature gravities, unlike GR, cannot be bootstrapped from their (usual) linearizations
NASA Astrophysics Data System (ADS)
Deser, S.
2017-12-01
We show that higher curvature order gravities, in particular the propagating quadratic curvature models, cannot be derived by self-coupling from their linear, flat space, forms, except through an unphysical version of linearization; only GR can. Separately, we comment on an early version of the self-coupling bootstrap.
The new version of EPA’s positive matrix factorization (EPA PMF) software, 5.0, includes three error estimation (EE) methods for analyzing factor analytic solutions: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP)...
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
How to Bootstrap a Human Communication System
ERIC Educational Resources Information Center
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better serve efficiency improvement and related decision making. © The Author(s) 2015.
Weak percolation on multiplex networks
NASA Astrophysics Data System (ADS)
Baxter, Gareth J.; Dorogovtsev, Sergey N.; Mendes, José F. F.; Cellai, Davide
2014-04-01
Bootstrap percolation is a simple but nontrivial model. It has applications in many areas of science and has been explored on random networks for several decades. In single-layer (simplex) networks, it has been recently observed that bootstrap percolation, which is defined as an incremental process, can be seen as the opposite of pruning percolation, where nodes are removed according to a connectivity rule. Here we propose models of both bootstrap and pruning percolation for multiplex networks. We collectively refer to these two models with the concept of "weak" percolation, to distinguish them from the somewhat classical concept of ordinary ("strong") percolation. While the two models coincide in simplex networks, we show that they decouple when considering multiplexes, giving rise to a wealth of critical phenomena. Our bootstrap model constitutes the simplest example of a contagion process on a multiplex network and has potential applications in critical infrastructure recovery and information security. Moreover, we show that our pruning percolation model may provide a way to diagnose missing layers in a multiplex network. Finally, our analytical approach allows us to calculate critical behavior and characterize critical clusters.
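The incremental activation process that defines bootstrap percolation can be sketched on a single-layer random graph; all parameters below are invented, and the paper's multiplex generalization is not attempted here:

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_percolation(adj, seeds, k):
    """Iteratively activate any node with >= k active neighbours until
    no further activations occur; returns the final active set."""
    active = set(int(s) for s in seeds)
    changed = True
    while changed:
        changed = False
        for v in range(len(adj)):
            if v not in active and sum(u in active for u in adj[v]) >= k:
                active.add(v)
                changed = True
    return active

# Erdos-Renyi random graph G(n, p) as adjacency lists
n, p = 200, 0.05
adj = [[] for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < p:
            adj[i].append(j)
            adj[j].append(i)

seeds = rng.choice(n, size=40, replace=False)  # initially active nodes
final = bootstrap_percolation(adj, seeds, k=2)
```

Pruning percolation, the "opposite" process discussed in the abstract, would instead start from a fully active network and repeatedly remove nodes violating the connectivity rule.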
Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J
2018-05-01
Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.
NASA Astrophysics Data System (ADS)
Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Sanchez, R.; Tribaldos, V.; Geiger, J.
2018-02-01
The recently developed free-plasma-boundary version of the SIESTA MHD equilibrium code (Hirshman et al 2011 Phys. Plasmas 18 062504; Peraza-Rodriguez et al 2017 Phys. Plasmas 24 082516) is used for the first time to study scenarios with considerable bootstrap currents for the Wendelstein 7-X (W7-X) stellarator. Bootstrap currents in the range of tens of kA can lead to the formation of unwanted magnetic island chains or stochastic regions within the plasma and alter the boundary rotational transform due to the small shear in W7-X. The latter issue is of relevance since the island divertor operation of W7-X relies on a proper positioning of magnetic island chains at the plasma edge to control the particle and energy exhaust towards the divertor plates. Two scenarios are examined with the new free-plasma-boundary capabilities of SIESTA: one with a freely evolving bootstrap current, which illustrates the difficulties arising from the dislocation of the boundary islands, and a second in which off-axis electron cyclotron current drive (ECCD) is applied to compensate the effects of the bootstrap current and keep the island divertor configuration intact. SIESTA finds that off-axis ECCD is indeed able to keep the location and phase of the edge magnetic island chain unchanged, but it may also lead to an undesired stochastization of parts of the confined plasma if the EC deposition radial profile becomes too narrow.
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution; the second example is also synthetic but follows a non-Gaussian random field; and a third, empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
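The LU-decomposition resampling mentioned above draws spatially correlated realizations by multiplying white noise with the lower-triangular factor of the covariance matrix. A minimal unconditional sketch with an assumed exponential covariance model (the grid, sill, and range values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

# 1-D grid and exponential covariance model C(h) = sill * exp(-h / range)
x = np.linspace(0.0, 10.0, 100)
sill, corr_range = 1.0, 2.0
h = np.abs(x[:, None] - x[None, :])          # pairwise lag distances
C = sill * np.exp(-h / corr_range)

# LU (here Cholesky) decomposition: C = L @ L.T, with a tiny jitter
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

def correlated_resample(L, rng=rng):
    """One unconditional realization: z ~ N(0, I), field = L @ z,
    so Cov(field) = L @ L.T = C by construction."""
    return L @ rng.normal(size=L.shape[0])

fields = np.array([correlated_resample(L) for _ in range(500)])
emp_var = fields.var(axis=0).mean()          # should be close to the sill
```

Each realization reproduces the target covariance on average, which is what lets the resamples stand in for repeated samples of the spatially correlated attribute when re-estimating semivariograms.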
Reverse actin sliding triggers strong myosin binding that moves tropomyosin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekyarova, T.I.; Reedy, M.C.; Baumann, B.A.J.
2008-09-03
Actin/myosin interactions in vertebrate striated muscles are believed to be regulated by the 'steric blocking' mechanism whereby the binding of calcium to the troponin complex allows tropomyosin (TM) to change position on actin, acting as a molecular switch that blocks or allows myosin heads to interact with actin. Movement of TM during activation is initiated by interaction of Ca²⁺ with troponin, then completed by further displacement by strong binding cross-bridges. We report x-ray evidence that TM in insect flight muscle (IFM) moves in a manner consistent with the steric blocking mechanism. We find that both isometric contraction, at high [Ca²⁺], and stretch activation, at lower [Ca²⁺], develop similarly high x-ray intensities on the IFM fourth actin layer line because of TM movement, coinciding with x-ray signals of strong-binding cross-bridge attachment to helically favored 'actin target zones.' Vanadate (Vi), a phosphate analog that inhibits active cross-bridge cycling, abolishes all active force in IFM, allowing high [Ca²⁺] to elicit initial TM movement without cross-bridge attachment or other changes from relaxed structure. However, when stretched in high [Ca²⁺], Vi-'paralyzed' fibers produce force substantially above passive response at pCa ≈ 9, concurrent with full conversion from resting to active x-ray pattern, including x-ray signals of cross-bridge strong-binding and TM movement. This argues that myosin heads can be recruited as strong-binding 'brakes' by backward-sliding, calcium-activated thin filaments, and are as effective in moving TM as actively force-producing cross-bridges. Such recruitment of myosin as brakes may be the major mechanism resisting extension during lengthening contractions.
Sheridan, Rebecca; van Rooijen, Maaike; Giles, Oscar; Mushtaq, Faisal; Steenbergen, Bert; Mon-Williams, Mark; Waterman, Amanda
2017-10-01
Mathematics is often conducted with a writing implement. But is there a relationship between numerical processing and sensorimotor 'pen' control? We asked participants to move a stylus so it crossed an unmarked line at a location specified by a symbolic number (1-9), where number colour indicated whether the line ran left-right ('normal') or vice versa ('reversed'). The task could be simplified through the use of a 'mental number line' (MNL). Many modern societies use number lines in mathematical education and the brain's representation of number appears to follow a culturally determined spatial organisation (so better task performance is associated with this culturally normal orientation-the MNL effect). Participants (counter-balanced) completed two consistent blocks of trials, 'normal' and 'reversed', followed by a mixed block where line direction varied randomly. Experiment 1 established that the MNL effect was robust, and showed that the cognitive load associated with reversing the MNL not only affected response selection but also the actual movement execution (indexed by duration) within the mixed trials. Experiment 2 showed that an individual's motor abilities predicted performance in the difficult (mixed) condition but not the easier blocks. These results suggest that numerical processing is not isolated from motor capabilities-a finding with applied consequences.
Nogueira, Renato Luiz Maia; Osterne, Rafael Lima Verde; Abreu, Ricardo Teixeira; Araújo, Phelype Maia
2017-07-01
An alternative technique to reconstruct atrophic alveolar vertical bone after implant placement is presented. The technique consists of distraction osteogenesis or direct surgical repositioning of an implant-and-bone block segment after segmental osteotomies that can be used in esthetic or unesthetic cases. Initially, casts indicating the implant position are obtained and the future ideal prosthetic position is determined to guide the model surgery. After the model surgery, a new provisional prosthesis is fabricated, and an occlusal splint, which is used as a surgical guide and a device for distraction osteogenesis, is custom fabricated. Then, the surgery is performed. For mobilization of the implant-and-bone block segment, 2 vertical osteotomies are performed and then joined by a horizontal osteotomy. The implant-and-bone block segment is moved to the planned position. If a small movement is planned, then the implant-and-bone segment is stabilized; for larger movements, the implant-and-bone segment can be gradually moved to the final position by distraction osteogenesis. This technique has good predictability of the final position of the implant-and-bone segment and relatively fast esthetic rehabilitation. It can be considered for dental implants in regions of vertical bone atrophy. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Computer simulation of concentrated solid solution strengthening
NASA Technical Reports Server (NTRS)
Kuo, C. T. K.; Arsenault, R. J.
1976-01-01
The interaction forces between a straight edge dislocation and a random array of solute atoms were determined as the dislocation moved through a three-dimensional block containing the array. The yield stress at 0 K was obtained by determining the average maximum solute-dislocation interaction force encountered by the edge dislocation, and an expression relating the yield stress to the length of the dislocation and the solute concentration is provided. The magnitude of the solid-solution strengthening due to solute atoms can be determined directly from the numerical results, provided the dislocation line length that moves as a unit is specified.
Sloan, D.H.; Yockey, H.P.; Schmidt, F.H.
1959-04-14
An improvement in the mounting arrangement for an ion source within the vacuum tank of a calutron device is reported. The cathode and arc block of the source are independently supported from a stem passing through the tank wall. The arc block may be pivoted and moved longitudinally with respect to the stem to thereby align the arc chamber in the block with the cathode and magnetic field in the tank. With this arrangement the elements of the ion source are capable of precise adjustment with respect to one another, promoting increased source efficiency.
Continuous air monitor filter changeout apparatus
Rodgers, John C [Santa Fe, NM
2008-07-15
An apparatus and corresponding method for automatically changing out a filter cartridge in a continuous air monitor. The apparatus includes: a first container sized to hold filter cartridge replacements; a second container sized to hold used filter cartridges; a transport insert connectively attached to the first and second containers; a shuttle block, sized to hold the filter cartridges, located within the transport insert; a transport driver mechanism used to supply a motive force to move the shuttle block within the transport insert; and a control means for operating the transport driver mechanism.
NASA Astrophysics Data System (ADS)
Ülker, Erkan; Turanboy, Alparslan
2009-07-01
The block stone industry is one of the main commercial uses of rock. The economic potential of any block quarry depends on the recovery rate, which is defined as the total volume of useful rough blocks extractable from a fixed rock volume in relation to the total volume of moved material. The natural fracture system, the rock type(s) and the extraction method used directly influence the recovery rate. The major aims of this study are to establish a theoretical framework for optimising the extraction process in marble quarries for a given fracture system, and for predicting the recovery rate of the excavated blocks. We have developed a new approach by taking into consideration only the fracture structure for maximum block recovery in block quarries. The complete model uses a linear approach based on basic geometric features of discontinuities for 3D models, a tree structure (TS) for individual investigation and finally a genetic algorithm (GA) for the obtained cuboid volume(s). We tested our new model in a selected marble quarry in the town of İscehisar (AFYONKARAHİSAR—TURKEY).
Zhang, Hong; Ren, Lei; Kong, Vic; Giles, William; Zhang, You; Jin, Jian-Yue
2016-01-01
A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least square regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference between the recovered CBCT image and the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench-top CBCT system. In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach.
The detailed geometric features in the Catphan phantom were mostly recovered according to visual evaluation. The scatter related artifacts, such as cupping artifacts, were almost completely removed. The IPSF-SMOG is promising in reducing scatter artifacts and improving image quality while reducing radiation dose.
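The weighted least-squares fusion at the core of IPSF reduces, for a constant blocked signal, to a weighted mean of the paired observations. A minimal sketch in Python (the authors' actual weighting scheme is not given in the abstract; the function name and numbers here are illustrative only):

```python
def ipsf_estimate(observations, weights):
    """Weighted least-squares fusion of paired observations of a blocked
    value: for a constant-signal model, WLS reduces to a weighted mean."""
    if len(observations) != len(weights) or not observations:
        raise ValueError("need matching, non-empty observations/weights")
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, observations)) / total

# Two hypothetical observations of one blocked pixel from adjacent
# gantry angles, with the second angle weighted three times higher.
fused = ipsf_estimate([10.0, 20.0], [1.0, 3.0])  # -> 17.5
```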
An Efficient Moving Target Detection Algorithm Based on Sparsity-Aware Spectrum Estimation
Shen, Mingwei; Wang, Jie; Wu, Di; Zhu, Daiyin
2014-01-01
In this paper, an efficient direct data domain space-time adaptive processing (STAP) algorithm for moving target detection is proposed, which is achieved based on the distinct spectrum features of clutter and target signals in the angle-Doppler domain. To reduce the computational complexity, the high-resolution angle-Doppler spectrum is obtained by finding the sparsest coefficients in the angle domain using the reduced-dimension data within each Doppler bin. Moreover, we then present a knowledge-aided block-size detection algorithm that can discriminate between the moving targets and the clutter based on the extracted spectrum features. The feasibility and effectiveness of the proposed method are validated through both numerical simulations and raw data processing results. PMID:25222035
Pulling Econometrics Students up by Their Bootstraps
ERIC Educational Resources Information Center
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Accuracy assessment of percent canopy cover, cover type, and size class
H. T. Schreuder; S. Bain; R. C. Czaplewski
2003-01-01
Truth for vegetation cover percent and type is obtained from very large-scale photography (VLSP), stand structure as measured by size classes, and vegetation types from a combination of VLSP and ground sampling. We recommend using the Kappa statistic with bootstrap confidence intervals for overall accuracy, and similarly bootstrap confidence intervals for percent...
ERIC Educational Resources Information Center
Barner, David; Chow, Katherine; Yang, Shu-Ju
2009-01-01
We explored children's early interpretation of numerals and linguistic number marking, in order to test the hypothesis (e.g., Carey (2004). Bootstrapping and the origin of concepts. "Daedalus", 59-68) that children's initial distinction between "one" and other numerals (i.e., "two," "three," etc.) is bootstrapped from a prior distinction between…
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
ERIC Educational Resources Information Center
Hand, Michael L.
1990-01-01
Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
Donald B.K. English
2000-01-01
In this paper I use bootstrap procedures to develop confidence intervals for estimates of total industrial output generated per thousand tourist visits. Mean expenditures from replicated visitor expenditure data included weights to correct for response bias. Impacts were estimated with IMPLAN. Ninety percent interval endpoints were 6 to 16 percent above or below the...
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (log-Pearson III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the log-Boughton method, which is based on a fit of plotting positions.
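The bootstrap comparison described above can be sketched as follows: resample the observed flow series with replacement and score each estimator by mean squared error against a reference value. Everything below (the two toy estimators, the data, and the "true" low flow) is an illustrative stand-in for the paper's four low-flow methods:

```python
import random
import statistics

def bootstrap_mse(data, estimator, true_value, n_boot=500, seed=0):
    """Bootstrap MSE: resample the flow series with replacement and
    average the estimator's squared error over the resamples."""
    rng = random.Random(seed)
    n = len(data)
    sq_errors = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        sq_errors.append((estimator(resample) - true_value) ** 2)
    return statistics.mean(sq_errors)

# Toy estimators of a low-flow characteristic; the paper's methods
# (log-Pearson III, Weibull, Box-Cox, log-Boughton) would slot in here.
def sample_min(xs):
    return min(xs)

def p10(xs):
    ys = sorted(xs)
    return ys[int(0.1 * len(ys))]  # crude 10th-percentile estimate

flows = [12.0, 8.5, 15.2, 9.1, 7.8, 11.3, 10.0, 13.4, 6.9, 9.8]
true_low = 7.0  # hypothetical "true" low flow for the comparison
mse_min = bootstrap_mse(flows, sample_min, true_low)
mse_p10 = bootstrap_mse(flows, p10, true_low)
```

The estimator with the lower bootstrap MSE would be preferred, mirroring the paper's comparison logic.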
Paula, T O M; Marinho, C D; Amaral Júnior, A T; Peternelli, L A; Gonçalves, L S A
2013-06-27
The objective of this study was to determine the optimal number of repetitions to be used in competition trials of popcorn traits related to production and quality, including grain yield and expansion capacity. The experiments were conducted in 3 environments representative of the north and northwest regions of the State of Rio de Janeiro with 10 Brazilian genotypes of popcorn, consisting of 4 commercial hybrids (IAC 112, IAC 125, Zélia, and Jade), 4 improved varieties (BRS Ângela, UFVM-2 Barão de Viçosa, Beija-flor, and Viçosa), and 2 experimental populations (UNB2U-C3 and UNB2U-C4). The experimental design utilized was a randomized complete block design with 7 repetitions. The bootstrap method was employed to obtain samples of all of the possible combinations within the 7 blocks. Subsequently, the confidence intervals of the parameters of interest were calculated for all simulated data sets. The optimal number of repetitions for each trait was taken to be the number at which all of the estimates of the parameters in question fell within the confidence interval. The estimates of the number of repetitions varied according to the parameter estimated, the variable evaluated, and the environment cultivated, ranging from 2 to 7. Only the expansion capacity trait in the Colégio Agrícola environment (for residual variance and coefficient of variation) and the number of ears per plot in the Itaocara environment (for coefficient of variation) needed 7 repetitions to fall within the confidence interval. Thus, for the 3 studies conducted, we conclude that 6 repetitions are optimal for obtaining high experimental precision.
Benchmarking of a T-wave alternans detection method based on empirical mode decomposition.
Blanco-Velasco, Manuel; Goya-Esteban, Rebeca; Cruz-Roldán, Fernando; García-Alberola, Arcadi; Rojo-Álvarez, José Luis
2017-07-01
T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. The proposed test bed system is based on the following guidelines: (1) use of open source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Both sensitivity (Se) and specificity (Sp) are separately analyzed. A nonparametric hypothesis test, based on bootstrap resampling, is also used to determine whether the presence of the EMD block actually improves performance. The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding sensitivity, the EMD method also performs better in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. The proposed test setting, designed to analyze performance, guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias. Copyright © 2017 Elsevier B.V. All rights reserved.
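A paired bootstrap hypothesis test of the kind used to compare detector performance can be sketched as below. The per-record specificity values are made-up placeholders, not the paper's data:

```python
import random
import statistics

def bootstrap_diff_test(scores_a, scores_b, n_boot=2000, seed=1):
    """Paired bootstrap test: resample the per-record score differences
    (A minus B) with replacement and return the fraction of bootstrap
    means <= 0, a one-sided p-value for 'method A beats method B'."""
    rng = random.Random(seed)
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    count = 0
    for _ in range(n_boot):
        resample = [diffs[rng.randrange(n)] for _ in range(n)]
        if statistics.mean(resample) <= 0:
            count += 1
    return count / n_boot

# Hypothetical per-record specificities for EMD-based vs plain SM detection.
spec_emd = [0.96, 0.95, 0.97, 0.94, 0.96]
spec_sm = [0.72, 0.70, 0.75, 0.71, 0.73]
p_value = bootstrap_diff_test(spec_emd, spec_sm)  # small p -> EMD helps
```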
Microplate model for the present-day deformation of Tibet
Thatcher, W.
2007-01-01
Site velocities from 349 Global Positioning System (GPS) stations are used to construct an 11-element quasi-rigid block model of the Tibetan Plateau and its surroundings. Rigid rotations of five major blocks are well determined, and average translation velocities of six smaller blocks can be constrained. Where data are well distributed the velocity field can be explained well by rigid block motion and fault slip across block boundaries. Residual misfits average 1.6 mm/yr compared to typical one standard deviation velocity uncertainties of 1.3 mm/yr. Any residual internal straining of the blocks is small and heterogeneous. However, residual substructure might well represent currently unresolved motions of smaller blocks. Although any smaller blocks must move at nearly the same rate as the larger blocks within which they lie, undetected relative motions between them could be significant, particularly where there are gaps in GPS coverage. Predicted relative motions between major blocks agree with the observed sense of slip and along-strike partitioning of motion across major faults. However, predicted slip rates across Tibet's major strike-slip faults are low, only 5-12 mm/yr, a factor of 2-3 smaller than most rates estimated from fault offset features dated by radiometric methods as approximately 2000 to 100,000 years old. Previous work has suggested that both GPS data and low fault slip rates are incompatible with rigid block motions of Tibet. The results reported here overcome these objections.
Clifford removes bolts on new Russian gyrodyne to be transferred to Mir
1996-04-22
STS076-323-034 (22 - 31 March 1996) --- Astronaut Michael R. (Rich) Clifford, mission specialist, prepares to move a gyrodyne from the Space Shuttle Atlantis onto Russia's Mir Space Station. The gyrodyne was later installed in the Base Block Module onboard Mir.
2013-06-11
Serina Diniega, JPL Systems Engineer, describes the discovery that Martian gullies that end in pits rather than fan deltas are likely caused by blocks of frozen carbon dioxide (dry ice) sliding down slopes on a cushion of carbon dioxide gas. The pits are formed as the "dry ice" sublimates away.
Fonteyne, Margot; Vercruysse, Jurgen; De Leersnyder, Fien; Besseling, Rut; Gerich, Ad; Oostra, Wim; Remon, Jean Paul; Vervaet, Chris; De Beer, Thomas
2016-09-07
This study focuses on the twin screw granulator of a continuous from-powder-to-tablet production line. Whereas powder dosing into the granulation unit is possible from a container of preblended material, a truly continuous process uses several feeders (each one dosing an individual ingredient) and relies on a continuous blending step prior to granulation. The aim of the current study was to investigate the in-line blending capacity of this twin screw granulator, equipped with conveying elements only. The feasibility of in-line NIR (SentroPAT, Sentronic GmbH, Dresden, Germany) spectroscopy for evaluating the blend uniformity of powders after the granulator was tested. Anhydrous theophylline was used as a tracer molecule and was blended with lactose monohydrate. Theophylline and lactose were each fed from a different feeder into the twin screw granulator barrel. Both homogeneous mixtures and mixing experiments with induced errors were investigated. The in-line spectroscopic analyses showed that the twin screw granulator is a useful tool for in-line blending under different conditions. The blend homogeneity was evaluated by means of a novel statistical method, the moving F-test, in which the variance between two blocks of collected NIR spectra is evaluated. The α- and β-errors of the moving F-test are controlled by choosing an appropriate block size of spectra. The moving F-test proved to be an appropriate calibration- and maintenance-free method for blend homogeneity evaluation during continuous mixing. Copyright © 2016 Elsevier B.V. All rights reserved.
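The moving F-test idea, comparing the variances of two adjacent blocks of measurements as a window slides along the series, can be sketched as follows. The block size, critical value, and data are illustrative, and the authors' exact test construction may differ:

```python
import statistics

def moving_f_test(values, block_size, f_crit):
    """Slide two adjacent blocks over the series and flag positions where
    the variance ratio of the blocks exceeds a critical value f_crit."""
    flags = []
    for i in range(len(values) - 2 * block_size + 1):
        a = values[i:i + block_size]
        b = values[i + block_size:i + 2 * block_size]
        va, vb = statistics.variance(a), statistics.variance(b)
        # Ratio of larger to smaller variance; guard against zero variance.
        f = max(va, vb) / max(min(va, vb), 1e-12)
        flags.append(f > f_crit)
    return flags

# Toy tracer-concentration series: stable blend, then an induced upset.
calm = [1.0, 1.01, 0.99, 1.0, 1.02, 0.98, 1.0, 1.01]
noisy = [5.0, -3.0, 4.0, -2.0]
flags = moving_f_test(calm + noisy, block_size=4, f_crit=10.0)
```

A flagged position indicates that the two blocks' variances differ enough to signal a blend-homogeneity problem.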
NASA Astrophysics Data System (ADS)
Myrow, P.; Chen, J.
2013-12-01
A wide variety of unusual penecontemporaneous deformation structures exist in grainstone and flat-pebble conglomerate beds of the Upper Cambrian strata, western Colorado, including slide scarps, thrusted beds, irregular blocks and internally deformed beds. Slide scarps are characterized by concave-up, sharp surfaces that truncate one or more underlying beds. Thrusted beds record movement of a part of a bed onto itself along a moderately to steeply inclined (generally 25°-40°) ramp. The hanging-wall lenses in some cases show fault-bend geometries, with either intact or mildly deformed bedding. Irregularly bedded to internally deformed blocks isolated on generally flat upper bedding surfaces are similar in composition to the underlying beds. These features represent parts of beds that were detached, moved up onto, and some distance across, the laterally adjacent undisturbed bed surfaces. The blocks moved either at the sediment-water interface or intrastratally at shallow depths within overlying muddy deposits. Finally, internally deformed beds have large blocks, fitted fabrics of highly irregular fragments, and contorted lamination, which represent heterogeneous deformation, such as brecciation and liquefaction. The various deformation structures were most probably triggered by earthquakes, considering the nature of deformation (regional distribution of liquefaction structures, and the brittle segmentation and subsequent transportation of semi-consolidated beds) and the reactivation of Mesoproterozoic, crustal-scale shear zones in the central Rockies during the Late Cambrian. Features produced by initial brittle deformation are unusual relative to most reported seismites, and may represent poorly recognized to unrecognized seismogenic structures in the rock record.
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location, and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Empirical single sample quantification of bias and variance in Q-ball imaging.
Hainline, Allison E; Nath, Vishwesh; Parvathaneni, Prasanna; Blaber, Justin A; Schilling, Kurt G; Anderson, Adam W; Kang, Hakmook; Landman, Bennett A
2018-02-06
The bias and variance of high angular resolution diffusion imaging methods have not been thoroughly explored in the literature; the simulation extrapolation (SIMEX) and bootstrap techniques may be used to estimate the bias and variance of high angular resolution diffusion imaging metrics. The SIMEX approach is well established in the statistics literature and uses simulation of increasingly noisy data to extrapolate back to a hypothetical case with no noise. The bias of calculated metrics can then be computed by subtracting the SIMEX estimate from the original pointwise measurement. The SIMEX technique has been studied in the context of diffusion imaging to accurately capture the bias in fractional anisotropy measurements in DTI. Herein, we extend the application of SIMEX and bootstrap approaches to characterize bias and variance in metrics obtained from a Q-ball imaging reconstruction of high angular resolution diffusion imaging data. The results demonstrate that SIMEX and bootstrap approaches provide consistent estimates of the bias and variance of generalized fractional anisotropy, respectively. The RMSE for the generalized fractional anisotropy estimates shows a 7% decrease in white matter and an 8% decrease in gray matter when compared with the observed generalized fractional anisotropy estimates. On average, the bootstrap technique results in SD estimates that are approximately 97% of the true variation in white matter, and 86% in gray matter. Both SIMEX and bootstrap methods are flexible, estimate population characteristics based on single scans, and may be extended for bias and variance estimation on a variety of high angular resolution diffusion imaging metrics. © 2018 International Society for Magnetic Resonance in Medicine.
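A minimal SIMEX sketch: re-add noise to the data at increasing levels, track how a metric responds, and extrapolate back to the hypothetical noise-free case at level -1. For brevity this uses a linear extrapolant and the sample variance as the metric; the paper works with Q-ball metrics and richer extrapolation models, so treat all names and numbers here as illustrative:

```python
import random
import statistics

def simex_estimate(measure, data, sigma, lambdas=(0.5, 1.0, 1.5, 2.0),
                   n_rep=200, seed=2):
    """SIMEX sketch: add noise of variance lambda*sigma^2, average the
    metric over replicates, fit a line in lambda, and extrapolate to
    lambda = -1 (the hypothetical noise-free case)."""
    rng = random.Random(seed)
    xs, ys = [0.0], [measure(data)]
    for lam in lambdas:
        reps = []
        for _ in range(n_rep):
            noisy = [v + rng.gauss(0.0, sigma * lam ** 0.5) for v in data]
            reps.append(measure(noisy))
        xs.append(lam)
        ys.append(statistics.mean(reps))
    # Ordinary least-squares line y = a + b*lambda, evaluated at -1.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a - b

# Synthetic data: signal (std 2) plus measurement noise (known std 1).
rng0 = random.Random(11)
data = [rng0.gauss(0.0, 2.0) + rng0.gauss(0.0, 1.0) for _ in range(300)]
naive = statistics.pvariance(data)              # inflated by noise
corrected = simex_estimate(statistics.pvariance, data, sigma=1.0)
```

Because added noise inflates the variance linearly in lambda, the extrapolated value approximates the noise-free signal variance.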
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more kriging predictions (9/10) compared to GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
NASA Astrophysics Data System (ADS)
Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.
2017-04-01
A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance, and extreme monthly values. A 20-year moving time window, shifted in successive 1-year steps, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonally windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using maximum likelihood estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly in the last decades. Such increases did not occur simultaneously across the region. Some locations exhibited continuous increases only at minimum QT.
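Fitting a Gamma distribution to windowed streamflow and attaching bootstrap confidence intervals to its parameters can be sketched as below. For brevity the fit uses method-of-moments in place of full MLE, and the data are synthetic:

```python
import random
import statistics

def gamma_moments(xs):
    """Method-of-moments Gamma fit (a simple stand-in for MLE):
    shape k = mean^2/var, scale theta = var/mean."""
    m, v = statistics.mean(xs), statistics.pvariance(xs)
    return m * m / v, v / m

def bootstrap_ci(xs, stat, level=0.95, n_boot=1000, seed=3):
    """Percentile bootstrap confidence interval for any statistic."""
    rng = random.Random(seed)
    n = len(xs)
    vals = sorted(stat([xs[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    lo = vals[int((1 - level) / 2 * n_boot)]
    hi = vals[int((1 + level) / 2 * n_boot) - 1]
    return lo, hi

# Synthetic monthly discharge drawn from a Gamma(shape=2, scale=3).
rng = random.Random(7)
monthly_flow = [rng.gammavariate(2.0, 3.0) for _ in range(120)]
shape_hat, scale_hat = gamma_moments(monthly_flow)
shape_lo, shape_hi = bootstrap_ci(monthly_flow, lambda s: gamma_moments(s)[0])
```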
Bootstrapping rapidity anomalous dimensions for transverse-momentum resummation
Li, Ye; Zhu, Hua Xing
2017-01-11
The soft function relevant for transverse-momentum resummation in Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. Furthermore, an intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
H. T. Schreuder; M. S. Williams
2000-01-01
In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
ERIC Educational Resources Information Center
Ural, A. Engin; Yuret, Deniz; Ketrez, F. Nihan; Kocbas, Dilara; Kuntay, Aylin C.
2009-01-01
The syntactic bootstrapping mechanism of verb learning was evaluated against child-directed speech in Turkish, a language with rich morphology, nominal ellipsis and free word order. Machine-learning algorithms were run on transcribed caregiver speech directed to two Turkish learners (one hour every two weeks between 0;9 to 1;10) of different…
ERIC Educational Resources Information Center
Seco, Guillermo Vallejo; Izquierdo, Marcelino Cuesta; Garcia, M. Paula Fernandez; Diez, F. Javier Herrero
2006-01-01
The authors compare the operating characteristics of the bootstrap-F approach, a direct extension of the work of Berkovits, Hancock, and Nevitt, with Huynh's improved general approximation (IGA) and the Brown-Forsythe (BF) multivariate approach in a mixed repeated measures design when normality and multisample sphericity assumptions do not hold.…
Sample-based estimation of tree species richness in a wet tropical forest compartment
Steen Magnussen; Raphael Pelissier
2007-01-01
Petersen's capture-recapture ratio estimator and the well-known bootstrap estimator are compared across a range of simulated low-intensity simple random sampling with fixed-area plots of 100 m² in a rich wet tropical forest compartment with 93 tree species in the Western Ghats of India. Petersen's ratio estimator was uniformly superior to the bootstrap...
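Petersen's ratio estimator applied to species richness treats two plot samples as capture and recapture occasions: the richness estimate is s1 * s2 / s12, where s1 and s2 are the species counts in each sample and s12 is the number of shared species. A sketch with hypothetical species lists:

```python
def petersen_richness(sample1, sample2):
    """Petersen capture-recapture estimate of total species richness
    from two independent species lists: S_hat = s1 * s2 / s12."""
    s1, s2 = set(sample1), set(sample2)
    overlap = len(s1 & s2)
    if overlap == 0:
        raise ValueError("no shared species; estimator undefined")
    return len(s1) * len(s2) / overlap

# Hypothetical species lists from two sets of plots (genus names only
# for illustration): 4 and 3 species with 2 shared -> estimate 6.0.
plot_a = ["Dipterocarpus", "Hopea", "Syzygium", "Knema"]
plot_b = ["Syzygium", "Knema", "Vateria"]
richness = petersen_richness(plot_a, plot_b)
```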
Common Ground between Form and Content: The Pragmatic Solution to the Bootstrapping Problem
ERIC Educational Resources Information Center
Oller, John W.
2005-01-01
The frame of reference for this article is second or foreign language (L2 or FL) acquisition, but the pragmatic bootstrapping hypothesis applies to language processing and acquisition in any context or modality. It is relevant to teaching children to read. It shows how connections between target language surface forms and their content can be made…
2006-06-13
…unweighted pair group method with arithmetic mean (UPGMA) using random tie breaking and uncorrected pairwise distances in MacVector 7.0 (Oxford Molecular). Numbers on branches denote the UPGMA bootstrap percentage using a highly stringent number (1000) of replications (Felsenstein, 1985). All bootstrap values are 50%, as shown.
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
…processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA) clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially…
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion of the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentiles method, we found wide variability in the prevalence of the different phenotypes (3–100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35–3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results in which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Biases in estimates of severe renal failure prevalence and its association with covariates were minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
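Bootstrap imputation of condition status from model-derived probabilities can be sketched as repeated Bernoulli draws per patient, with the quantity of interest averaged across replicates. The probabilities below are illustrative placeholders, not the study's model output:

```python
import random
import statistics

def imputed_prevalence(probs, n_boot=1000, seed=4):
    """Impute each patient's condition status by a Bernoulli draw from
    the model-derived probability; average the resulting prevalence
    across bootstrap replicates."""
    rng = random.Random(seed)
    prevalences = []
    for _ in range(n_boot):
        draws = [1 if rng.random() < p else 0 for p in probs]
        prevalences.append(sum(draws) / len(draws))
    return statistics.mean(prevalences)

# Hypothetical model probabilities of severe renal failure for 100
# patients; the expected prevalence is the mean probability (0.15).
probs = [0.9] * 10 + [0.1] * 30 + [0.05] * 60
prevalence = imputed_prevalence(probs)
```

Averaging over draws, rather than thresholding each probability, is what avoids the categorization bias the abstract describes.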
Reference interval computation: which method (not) to choose?
Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C
2012-07-11
When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If no reliable RI data are available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by the bootstrapping, parametric, and Box-Cox-transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For the other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way to calculate RIs, provided it satisfies a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
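The nonparametric bootstrap RI calculation can be sketched as averaging the 2.5th and 97.5th sample percentiles over resamples of the reference group. The crude percentile definition and the synthetic reference data are illustrative only:

```python
import random
import statistics

def percentile(xs, q):
    """Crude order-statistic percentile (for illustration only)."""
    ys = sorted(xs)
    idx = min(len(ys) - 1, max(0, int(q * len(ys))))
    return ys[idx]

def bootstrap_reference_interval(values, n_boot=500, seed=5):
    """Nonparametric bootstrap RI: average the 2.5th and 97.5th sample
    percentiles over resamples of the reference group."""
    rng = random.Random(seed)
    n = len(values)
    los, his = [], []
    for _ in range(n_boot):
        rs = [values[rng.randrange(n)] for _ in range(n)]
        los.append(percentile(rs, 0.025))
        his.append(percentile(rs, 0.975))
    return statistics.mean(los), statistics.mean(his)

# Synthetic reference group for a marker centered at 100 with SD 10.
rng = random.Random(12)
reference_values = [rng.gauss(100.0, 10.0) for _ in range(120)]
ri_lo, ri_hi = bootstrap_reference_interval(reference_values)
```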
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution
Imai, Mutsumi; Kita, Sotaro
2014-01-01
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
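The tree bootstrap described here resamples recruitment trees rather than individual respondents: seeds are drawn with replacement, and within each chosen tree, each respondent's recruits are drawn with replacement recursively. The following is a toy sketch of that scheme (trees stored as hypothetical `(attribute, recruits)` tuples), not the authors' implementation.

```python
import random

def resample_tree(node, rng):
    """Recursively resample a recruitment tree: at each respondent,
    draw their recruits with replacement and recurse into the draws."""
    value, children = node  # node = (attribute_value, [child_nodes])
    if not children:
        return (value, [])
    drawn = rng.choices(children, k=len(children))
    return (value, [resample_tree(c, rng) for c in drawn])

def tree_bootstrap_replicate(forest, rng):
    """One replicate: resample the seeds with replacement, then
    resample recruits within each chosen tree."""
    return [resample_tree(t, rng) for t in rng.choices(forest, k=len(forest))]

def attribute_mean(forest):
    """Mean of a 0/1 attribute over every respondent in the forest."""
    def collect(node):
        value, children = node
        vals = [value]
        for c in children:
            vals.extend(collect(c))
        return vals
    vals = [v for t in forest for v in collect(t)]
    return sum(vals) / len(vals)

# Toy forest: two seeds; each tuple is (attribute, recruits)
forest = [(1, [(0, []), (1, [(1, [])])]),
          (0, [(0, [])])]
rng = random.Random(0)
estimates = [attribute_mean(tree_bootstrap_replicate(forest, rng))
             for _ in range(200)]
```

Note that the resampling depends only on the tree structure, which is why, as the abstract points out, the same replicates can serve for any attribute measured on the respondents.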
Correlation functions of warped CFT
NASA Astrophysics Data System (ADS)
Song, Wei; Xu, Jianfei
2018-04-01
Warped conformal field theory (WCFT) is a two-dimensional quantum field theory whose local symmetry algebra consists of a Virasoro algebra and a U(1) Kac-Moody algebra. In this paper, we study correlation functions for primary operators in WCFT. Like conformal symmetry, warped conformal symmetry is very constraining. The forms of the two- and three-point functions are determined by the global warped conformal symmetry, while the four-point function can be determined up to an arbitrary function of the cross ratio. The warped conformal bootstrap equations are constructed by formulating the notion of crossing symmetry. In the large central charge limit, four-point functions can be decomposed into global warped conformal blocks, which can be solved exactly. Furthermore, we revisit the scattering problem in warped AdS spacetime (WAdS) and give a prescription on how to match the bulk result to a WCFT retarded Green's function. Our result is consistent with the conjectured holographic dualities between WCFT and WAdS.
Exact finite volume expectation values of local operators in excited states
NASA Astrophysics Data System (ADS)
Pozsgay, B.; Szécsényi, I. M.; Takács, G.
2015-04-01
We present a conjecture for the exact expression of finite volume expectation values in excited states in integrable quantum field theories, which is an extension of an earlier conjecture to the case of general diagonal factorized scattering with bound states and a nontrivial bootstrap structure. The conjectured expression is a spectral expansion which uses the exact form factors and the excited state thermodynamic Bethe Ansatz as building blocks. The conjecture is proven for the case of the trace of the energy-moment tensor. Concerning its validity for more general operators, we provide numerical evidence using the truncated conformal space approach. It is found that the expansion fails to be well-defined for small values of the volume in cases when the singularity structure of the TBA equations undergoes a non-trivial rearrangement under some critical value of the volume. Despite these shortcomings, the conjectured expression is expected to be valid for all volumes for most of the excited states, and as an expansion above the critical volume for the rest.
A bootstrap lunar base: Preliminary design review 2
NASA Technical Reports Server (NTRS)
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Spheres, charges, instantons, and bootstrap: A five-dimensional odyssey
NASA Astrophysics Data System (ADS)
Chang, Chi-Ming; Fluder, Martin; Lin, Ying-Hsuan; Wang, Yifan
2018-03-01
We combine supersymmetric localization and the conformal bootstrap to study five-dimensional superconformal field theories. To begin, we classify the admissible counter-terms and derive a general relation between the five-sphere partition function and the conformal and flavor central charges. Along the way, we discover a new superconformal anomaly in five dimensions. We then propose a precise triple factorization formula for the five-sphere partition function, that incorporates instantons and is consistent with flavor symmetry enhancement. We numerically evaluate the central charges for the rank-one Seiberg and Morrison-Seiberg theories, and find strong evidence for their saturation of bootstrap bounds, thereby determining the spectra of long multiplets in these theories. Lastly, our results provide new evidence for the F-theorem and possibly a C-theorem in five-dimensional superconformal theories.
Method and apparatus for setting precise nozzle/belt and nozzle/edge dam block gaps
Carmichael, Robert J.; Dykes, Charles D.; Woodrow, Ronald
1989-05-16
A pair of guide pins are mounted on sideplate extensions of the caster, and mating roller pairs are mounted on the nozzle assembly. The nozzle is advanced toward the caster so that the roller pairs engage the guide pins. Both guide pins are remotely adjustable in the vertical direction by hydraulic cylinders acting through eccentrics; this moves the nozzle vertically. The guide pin on the inboard side of the caster is similarly horizontally adjustable. The nozzle roller pair that engages the inboard guide pin is flanged so that the nozzle moves horizontally with the inboard guide pin.
House moves to block Internet censorship.
Mirken, B
1995-08-18
Congress has created conflicting amendments to the Communications Decency Act, an amendment to the telecommunications deregulation bill. The Senate amendment contains sweeping language barring objectionable communications online. Observers fear this will block online distribution of AIDS prevention information as well as bar responses by activists to drug companies and government officials. The House amendments seek to protect online services from liability if they restrict access to objectionable materials, while another amendment seeks to modify obscenity laws to criminalize only some forms of online speech. These conflicts are to be resolved in a House-Senate conference committee whose meeting date is yet to be decided.
How They (Should Have) Built the Pyramids
NASA Astrophysics Data System (ADS)
Gallagher, Gregory; West, Joseph; Waters, Kevin
2014-03-01
A novel ``polygon method'' is proposed for moving large stone blocks. The method is implemented by attaching rods of analytically chosen radii to the block by means of rope. The chosen rods are placed on each side of the square-prism block in order to transform the square prism into a prism of a higher-order polygon, i.e. octagon, dodecagon, etc. Experimental results are presented and compared to other methods proposed by the authors, including a dragging method and a rail method based on dragging the block on rails made from arbitrarily chosen rod-shaped ``tracks,'' and to independent work by another group which utilized wooden attachments providing a cylindrical shape. It is found that the polygon method, when used on small-scale stone blocks across level open ground, has an equivalent coefficient of friction of order 0.1. For full-scale pyramid blocks, the wooden ``rods'' would need to be of order 30 cm in diameter, certainly within reason given the diameter of wooden masts used on ships in that region during the relevant time period in Egypt. This project also inspired a ``spin-off'' project in which the behavior of rolling polygons is investigated and preliminary data are presented.
ERIC Educational Resources Information Center
Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio
2009-01-01
A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
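The nonparametric bootstrap of Pearson's r mentioned in this abstract amounts to resampling (x, y) pairs with replacement and taking empirical quantiles of the recomputed correlations. A self-contained sketch on invented scores follows; the real study's clustering of students within units is ignored here.

```python
import random
import statistics

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def bootstrap_r_interval(x, y, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap: resample (x, y) pairs with replacement,
    recompute r, and take the empirical alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        rs.append(pearson_r([x[i] for i in idx], [y[i] for i in idx]))
    rs.sort()
    return rs[int(alpha / 2 * n_boot)], rs[int((1 - alpha / 2) * n_boot)]

# Hypothetical expectancy scores and intention scores with a
# moderate positive association built in
rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(100)]
y = [0.5 * a + rng.gauss(0, 1) for a in x]
lo, hi = bootstrap_r_interval(x, y)
```

The null hypothesis of no association is rejected at level alpha when the interval (lo, hi) excludes zero, which is the interval-based test the abstract alludes to.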
Bootstrapping a five-loop amplitude using Steinmann relations
Caron-Huot, Simon; Dixon, Lance J.; McLeod, Andrew; ...
2016-12-05
Here, the analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons
2001-07-01
parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. 2 Data...experimental methods, used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric ...triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
ERIC Educational Resources Information Center
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
ERIC Educational Resources Information Center
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
ERIC Educational Resources Information Center
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data, without the necessity to assume a noise model. These methods have previously been combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise-induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph-based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path-finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest-path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph-based algorithm. The BootGraph shows very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. 
By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be of the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables. (2) The resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters, and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. 
Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
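A parametric-bootstrap CI for one of the cases mentioned above (the Clayton copula) can be sketched as follows. For simplicity the sketch estimates the dependence parameter by inverting Kendall's tau (for Clayton, tau = theta/(theta+2)) rather than by maximum likelihood, and all numbers are illustrative, not from the study.

```python
import random

def clayton_sample(theta, n, rng):
    """Draw n pairs from a Clayton copula by conditional inversion."""
    pairs = []
    for _ in range(n):
        u, w = rng.random(), rng.random()
        v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
        pairs.append((u, v))
    return pairs

def kendall_tau(pairs):
    """O(n^2) Kendall's tau for continuous data (no tie handling)."""
    n = len(pairs)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            d = (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1])
            s += 1 if d > 0 else -1
    return 2 * s / (n * (n - 1))

def clayton_theta(pairs):
    """Moment estimate by inverting tau = theta / (theta + 2)."""
    tau = kendall_tau(pairs)
    return 2 * tau / (1 - tau)

def parametric_bootstrap_ci(theta_fit, n, n_boot=200, seed=3):
    """Percentile CI: simulate from the fitted copula and refit."""
    rng = random.Random(seed)
    est = sorted(clayton_theta(clayton_sample(theta_fit, n, rng))
                 for _ in range(n_boot))
    return est[int(0.025 * n_boot)], est[int(0.975 * n_boot)]

# Medium sample size (n = 100), fitted dependence parameter theta = 2
lo, hi = parametric_bootstrap_ci(theta_fit=2.0, n=100)
```

The interval width produced this way shrinks with n, which mirrors the study's finding that the choice between bootstrapping and MCMC matters most at small and medium sample sizes.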
Signal-Conditioning Block of a 1 × 200 CMOS Detector Array for a Terahertz Real-Time Imaging System
Yang, Jong-Ryul; Lee, Woo-Jae; Han, Seong-Tae
2016-01-01
A signal conditioning block of a 1 × 200 Complementary Metal-Oxide-Semiconductor (CMOS) detector array is proposed to be employed with a real-time 0.2 THz imaging system for inspecting large areas. The plasmonic CMOS detector array whose pixel size including an integrated antenna is comparable to the wavelength of the THz wave for the imaging system, inevitably carries wide pixel-to-pixel variation. To make the variant outputs from the array uniform, the proposed signal conditioning block calibrates the responsivity of each pixel by controlling the gate bias of each detector and the voltage gain of the lock-in amplifiers in the block. The gate bias of each detector is modulated to 1 MHz to improve the signal-to-noise ratio of the imaging system via the electrical modulation by the conditioning block. In addition, direct current (DC) offsets of the detectors in the array are cancelled by initializing the output voltage level from the block. Real-time imaging using the proposed signal conditioning block is demonstrated by obtaining images at the rate of 19.2 frame-per-sec of an object moving on the conveyor belt with a scan width of 20 cm and a scan speed of 25 cm/s. PMID:26950128
Signal-Conditioning Block of a 1 × 200 CMOS Detector Array for a Terahertz Real-Time Imaging System.
Yang, Jong-Ryul; Lee, Woo-Jae; Han, Seong-Tae
2016-03-02
A signal conditioning block of a 1 × 200 Complementary Metal-Oxide-Semiconductor (CMOS) detector array is proposed to be employed with a real-time 0.2 THz imaging system for inspecting large areas. The plasmonic CMOS detector array whose pixel size including an integrated antenna is comparable to the wavelength of the THz wave for the imaging system, inevitably carries wide pixel-to-pixel variation. To make the variant outputs from the array uniform, the proposed signal conditioning block calibrates the responsivity of each pixel by controlling the gate bias of each detector and the voltage gain of the lock-in amplifiers in the block. The gate bias of each detector is modulated to 1 MHz to improve the signal-to-noise ratio of the imaging system via the electrical modulation by the conditioning block. In addition, direct current (DC) offsets of the detectors in the array are cancelled by initializing the output voltage level from the block. Real-time imaging using the proposed signal conditioning block is demonstrated by obtaining images at the rate of 19.2 frame-per-sec of an object moving on the conveyor belt with a scan width of 20 cm and a scan speed of 25 cm/s.
A spring-block analogy for the dynamics of stock indexes
NASA Astrophysics Data System (ADS)
Sándor, Bulcsú; Néda, Zoltán
2015-06-01
A spring-block chain placed on a running conveyor belt is considered for modeling stylized facts observed in the dynamics of stock indexes. Individual stocks are modeled by the blocks, while stock-stock correlations are introduced via simple elastic forces acting in the springs. The dragging effect of the moving belt corresponds to the expected economic growth. The spring-block system produces collective behavior and avalanche-like phenomena, similar to the ones observed in stock markets. An artificial index is defined for the spring-block chain, and its dynamics is compared with that measured for the Dow Jones Industrial Average. For certain parameter regions the model reproduces qualitatively well the dynamics of the logarithmic index, the logarithmic returns, the distribution of the logarithmic returns, the avalanche-size distribution and the distribution of the investment horizons. A noticeable success of the model is that it is able to account for the gain-loss asymmetry observed in the inverse statistics. Our approach has mainly a pedagogical value, bridging between a complex socio-economic phenomenon and a basic (mechanical) model in physics.
Guggenheim, S. Frederic
1986-01-01
A multi-port fluid valve apparatus is used to control the flow of fluids through a plurality of valves and includes a web, which preferably is a stainless steel endless belt. The belt has an aperture therethrough and is progressed, under motor drive and control, so that its aperture is moved from one valve mechanism to another. Each of the valve mechanisms comprises a pair of valve blocks which are held in fluid-tight relationship against the belt. Each valve block consists of a block having a bore through which the fluid flows, a first seal surrounding the bore and a second seal surrounding the first seal, with the distance between the first and second seals being greater than the size of the belt aperture. In order to open a valve, the motor progresses the belt aperture to where it is aligned with the two bores of a pair of valve blocks, such alignment permitting a flow of the fluid through the valve. The valve is closed by movement of the belt aperture and its replacement, within the pair of valve blocks, by a solid portion of the belt.
Inta, Ra; Lai, Joseph C S; Fu, Eugene W; Evans, Theodore A
2007-08-22
Drywood termites are able to assess wood size using vibratory signals, although the exact mechanism behind this assessment ability is not known. Important vibratory characteristics such as the modal frequencies of a wooden block depend on its geometry and boundary conditions; however, they are also dependent on the material characteristics of the block, such as mass, density and internal damping. We report here on choice experiments that tested the ability of the drywood termite Cryptotermes secundus to assess wooden block size using a solid wooden block paired with a composite block, the latter made of either wood and aluminium or wood and rubber. Each composite block was constructed to match mass or low-frequency vibratory modes (i.e. fundamental frequency) of the solid wooden block. The termites always chose the blocks with more wood; they moved to the solid wooden blocks usually within a day and then tunnelled further into the solid wooden block by the end of the experiment. Termites offered composite blocks of wood and rubber matched for mass were the slowest to show a preference for the solid wooden block and this preference was the least definitive of any treatment, which indicated that mass and/or damping may play a role in food assessment. This result clearly shows that the termites were not fooled by composite blocks matched for mass or frequency, which implies that they probably employ more than a single simple measure in their food assessment strategy. This implies a degree of sophistication in their ability to assess their environment hitherto unknown. The potential importance of alternative features in the vibrational signals is discussed.
Reliability of dose volume constraint inference from clinical data.
Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M
2017-04-21
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
Reliability of dose volume constraint inference from clinical data
NASA Astrophysics Data System (ADS)
Lutz, C. M.; Møller, D. S.; Hoffmann, L.; Knap, M. M.; Alber, M.
2017-04-01
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an ‘ideal’ cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a ‘non-ideal’ cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85 % were only achieved by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
Dynamics of Disordered PI-PtBS Diblock Copolymer
NASA Astrophysics Data System (ADS)
Watanabe, Hiroshi
2009-03-01
Viscoelastic (G^*) and dielectric (ɛ'') data were examined for a LCST-type diblock copolymer composed of polyisoprene (PI; M = 53K) and poly(p-tert-butyl styrene) (PtBS; M = 42K) blocks, disordered at T ≤ 120 °C. Only PI had the type-A dipole parallel to the chain backbone. Thus, the ɛ'' data reflected the global motion of the PI block, while the G^* data detected the motion of the copolymer chain as a whole. Comparison of these data indicated that the PI block relaxed much faster than the PtBS block at low T and the dynamic heterogeneity due to PtBS was effectively quenched to give a frictional nonuniformity for the PI block relaxation. The ɛ'' data were thermo-rheologically complex at low T, partly due to this nonuniformity. However, the block connectivity could have also led to the complexity. For testing this effect, the ɛ'' data were reduced at the iso-frictional state defined with respect to bulk PI. In this state, the ɛ'' data of the copolymer at low and high T, respectively, were close to the data for the star-branched and linear bulk PI. Thus, the PI block appeared to be effectively tethered in space at low T, thereby behaving similarly to the star arm, while the PI block tended to move cooperatively with the PtBS block at high T to behave similarly to the linear PI, which led to the complexity of the ɛ'' data. The PtBS block also exhibited the complexity (noted from the G^* data), which was well correlated with the complexity of the PI block.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, L; Yan, H; Jia, X
2014-06-01
Purpose: A moving blocker based strategy has shown promising results for scatter correction in cone-beam computed tomography (CBCT). Different parameters of the system design affect its performance in scatter estimation and image reconstruction accuracy. The goal of this work is to optimize the geometric design of the moving blocker system. Methods: In the moving blocker system, a blocker consisting of lead strips is inserted between the x-ray source and the imaged object and moves back and forth along the rotation axis during CBCT acquisition. A CT image of an anthropomorphic pelvic phantom was used in the simulation study. Scatter signal was simulated by Monte Carlo calculation with various combinations of the lead strip width and the gap between neighboring lead strips, ranging from 4 mm to 80 mm (projected at the detector plane). Scatter signal in the unblocked region was estimated by cubic B-spline interpolation from the blocked region. Scatter estimation accuracy was quantified as relative root mean squared error by comparing the interpolated scatter to the Monte Carlo simulated scatter. CBCT was reconstructed by total variation minimization from the unblocked region, under various combinations of the lead strip width and gap. Reconstruction accuracy in each condition was quantified by the CT number error relative to a CBCT reconstructed from unblocked full projection data. Results: Scatter estimation error varied from 0.5% to 2.6% as the lead strip width and the gap varied from 4 mm to 80 mm. CT number error in the reconstructed CBCT images varied from 12 to 44. The highest reconstruction accuracy is achieved when the blocker lead strip width is 8 mm and the gap is 48 mm. Conclusions: Accurate scatter estimation can be achieved over a large range of combinations of lead strip width and gap. However, image reconstruction accuracy is greatly affected by the geometric design of the blocker.
Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi
2015-07-01
Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE), which is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.
Comulada, W. Scott
2015-01-01
Stata’s mi commands provide powerful tools to conduct multiple imputation in the presence of ignorable missing data. In this article, I present Stata code to extend the capabilities of the mi commands to address two areas of statistical inference where results are not easily aggregated across imputed datasets. First, mi commands are restricted to covariate selection. I show how to address model fit to correctly specify a model. Second, the mi commands readily aggregate model-based standard errors. I show how standard errors can be bootstrapped for situations where model assumptions may not be met. I illustrate model specification and bootstrapping on frequency counts for the number of times that alcohol was consumed in data with missing observations from a behavioral intervention. PMID:26973439
Heptagons from the Steinmann cluster bootstrap
Dixon, Lance J.; Drummond, James; Harrington, Thomas; ...
2017-02-28
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar $\mathcal{N} = 4$ supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal $\bar{Q}$ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
Imaging with New Classic and Vision at the NPOI
NASA Astrophysics Data System (ADS)
Jorgensen, Anders
2018-04-01
The Navy Precision Optical Interferometer (NPOI) is unique among interferometric observatories for its ability to position telescopes in an equally-spaced array configuration. This configuration is optimal for interferometric imaging because it allows the use of bootstrapping to track fringes on long baselines with signal-to-noise ratio less than one. When combined with coherent integration techniques this can produce visibilities with acceptable SNR on baselines long enough to resolve features on the surfaces of stars. The stellar surface imaging project at NPOI combines the bootstrapping array configuration of the NPOI array, real-time fringe tracking, baseline- and wavelength bootstrapping with Earth rotation to provide dense coverage in the UV plane at a wide range of spatial frequencies. In this presentation, we provide an overview of the project and an update of the latest status and results from the project.
A 2d Block Model For Landslide Simulation: An Application To The 1963 Vajont Case
NASA Astrophysics Data System (ADS)
Tinti, S.; Zaniboni, F.; Manucci, A.; Bortolucci, E.
A 2D block model to study the motion of a sliding mass is presented. The slide is partitioned into a matrix of blocks whose bases are quadrilaterals. The blocks move on a specified sliding surface and follow a trajectory that is computed by the model. The forces acting on the blocks are gravity, basal friction, buoyancy in case of underwater motion, and interaction with neighbouring blocks. At any time step, the position of the blocks on the sliding surface is determined in curvilinear (local) co-ordinates by computing the position of the vertices of the quadrilaterals and the position of the block centre of mass. Mathematically, the topology of the system is invariant during the motion, which means that the number of blocks is constant and that each block always has the same neighbours. Physically, this means that blocks are allowed to change form, but not to penetrate each other, coalesce, or split. The change of form is compensated by a change of height, under the computational assumption that the block volume is constant during motion: consequently, lateral expansion or contraction yields height reduction or increment of the blocks, respectively. This model is superior to the analogous 1D model, in which the mass is partitioned into a chain of interacting blocks: 1D models require the a priori specification of the sliding path, that is, of the trajectory of the blocks, which the 2D block model supplies as one of its outputs. In continuation of previous studies on the catastrophic slide of Vajont, which occurred in 1963 in northern Italy and caused more than 2000 victims, the 2D block model has been applied to the Vajont case. The results are compared to the outcome of the 1D model and, more importantly, to the observational data concerning the deposit position and morphology. The agreement between simulation and data is found to be quite good.
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a- service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Sample Reuse in Statistical Remodeling.
1987-08-01
as the jackknife and bootstrap, is an expansion of the functional, T(Fn), or of its distribution function or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report Frangos and Schucany demonstrated the small sample superiority of that approach over the proposals that take...higher order terms of an Edgeworth expansion into account. In a second report Frangos and Schucany (1987b) examined the small sample performance of
Innovation cascades: artefacts, organization and attributions
2016-01-01
Innovation cascades inextricably link the introduction of new artefacts, transformations in social organization, and the emergence of new functionalities and new needs. This paper describes a positive feedback dynamic, exaptive bootstrapping, through which these cascades proceed, and the characteristics of the relationships in which the new attributions that drive this dynamic are generated. It concludes by arguing that the exaptive bootstrapping dynamic is the principal driver of our current Innovation Society. PMID:26926284
ERIC Educational Resources Information Center
Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao
2016-01-01
We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…
Brunetti, Riccardo; Del Gatto, Claudia; Cavallina, Clarissa; Farina, Benedetto; Delogu, Franco
2018-05-01
The Corsi Block Tapping Task is a widespread test used to assess spatial working memory. Previous research hypothesized that the discrepancy found in some cases between the traditional and the digital (touchscreen) version of the Corsi Block Tapping Task may be due to a direct motor resonance between the experimenter's and the participant's hand movements. However, we hypothesize that this discrepancy might be due to extra movement-related information included in the traditional version but lacking in the digital one. We investigated the effects of such task-irrelevant information using eCorsi, a touchscreen version of the task. In Experiment 1, we manipulated timing in sequence presentation, creating three conditions. In the Congruent condition, the inter-stimulus intervals reflected the physical distance at which the stimuli were spatially placed: the longer the spatial distance, the longer the temporal interval. In the Incongruent condition, the timing changed randomly. Finally, in the Isochronous condition, every stimulus appeared after a fixed interval, independently of its spatial position. The results showed a performance enhancement in the Congruent condition, suggesting an incidental spatio-temporal binding. In Experiment 2, we added straight lines between the locations in the sequences: in the Trajectories condition participants saw trajectories from one spatial position to the other during sequence presentation, while a condition without such trajectories served as control. Results showed better performance in the Trajectories condition. We suggest that the timing and trajectory information play a significant role in the discrepancies found between the traditional and the touchscreen version of the Corsi Block Tapping Task, without the necessity of explanations involving direct motor resonance (e.g. seeing an actual hand moving) as a causal factor.
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Guerrero, Erick G; Fenwick, Karissa; Kong, Yinfei
2017-11-14
Leadership style and specific organizational climates have emerged as critical mechanisms to implement targeted practices in organizations. Drawing from relevant theories, we propose that climate for implementation of cultural competence reflects how transformational leadership may enhance the organizational implementation of culturally responsive practices in health care organizations. Using multilevel data from 427 employees embedded in 112 addiction treatment programs collected in 2013, confirmatory factor analysis showed adequate fit statistics for our measure of climate for implementation of cultural competence (Cronbach's alpha = .88) and three outcomes: knowledge (Cronbach's alpha = .88), services (Cronbach's alpha = .86), and personnel (Cronbach's alpha = .86) practices. Results from multilevel path analyses indicate a positive relationship between employee perceptions of transformational leadership and climate for implementation of cultural competence (standardized indirect effect = .057, bootstrap p < .001). We also found a positive indirect effect between transformational leadership and each of the culturally competent practices: knowledge (standardized indirect effect = .006, bootstrap p = .004), services (standardized indirect effect = .019, bootstrap p < .001), and personnel (standardized indirect effect = .014, bootstrap p = .005). Findings contribute to implementation science. They build on leadership theory and offer evidence of the mediating role of climate in the implementation of cultural competence in addiction health service organizations.
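The bootstrap inference on indirect effects reported above can be illustrated with a percentile-bootstrap sketch for a simple leadership → climate → practice chain. The simulated data, path coefficients, and single-predictor regressions below are hypothetical simplifications: the study's models are multilevel, and a full mediation analysis would also adjust path b for the predictor.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 427  # matches the study's employee count, but the data are simulated
leadership = rng.normal(size=n)
climate = 0.3 * leadership + rng.normal(size=n)   # path a = 0.3 (assumed)
practice = 0.4 * climate + rng.normal(size=n)     # path b = 0.4 (assumed)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]   # slope of M on X (path a)
    b = np.polyfit(m, y, 1)[0]   # slope of Y on M (path b, unadjusted sketch)
    return a * b

# Percentile bootstrap of the indirect effect a*b.
boot = np.empty(2000)
for i in range(2000):
    idx = rng.integers(0, n, size=n)
    boot[i] = indirect_effect(leadership[idx], climate[idx], practice[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```

A confidence interval excluding zero corresponds to the small bootstrap p-values quoted in the abstract.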
Nixon, Richard M; Wonderling, David; Grieve, Richard D
2010-03-01
Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
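The two non-parametric approaches compared in the paper can be demonstrated on a single skewed cost sample; the lognormal data and sample size below are illustrative assumptions, not taken from the trial:

```python
import numpy as np

rng = np.random.default_rng(0)
costs = rng.lognormal(mean=7.0, sigma=1.0, size=200)   # highly skewed costs

# Central limit theorem: SE of the mean = s / sqrt(n).
clt_se = costs.std(ddof=1) / np.sqrt(len(costs))

# Non-parametric bootstrap: SD of means over resamples with replacement.
boot_means = [rng.choice(costs, size=len(costs), replace=True).mean()
              for _ in range(2000)]
boot_se = np.std(boot_means, ddof=1)

print(f"CLT SE: {clt_se:.2f}  bootstrap SE: {boot_se:.2f}")
```

As the paper's simulations suggest for moderate-to-large samples, the two standard errors agree closely even though the data are skewed; the CLT version is the easier of the two to implement.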
Learn to Avoid or Overcome Leadership Obstacles
ERIC Educational Resources Information Center
D'Auria, John
2015-01-01
Leadership is increasingly recognized as an important factor in moving schools forward, yet we have been relatively random in how we prepare and support leaders. Four obstacles often block or diminish their effectiveness. Avoiding or overcoming each of these requires an underlying set of skills and knowledge that we believe can be learned and…
ERIC Educational Resources Information Center
DiJulio, Betsy
2012-01-01
"Gateway to the Future" pairs a painting of a gateway constructed from children's building blocks with an ink drawing of a personal symbol on a collaged background. The main objective of this lesson is to create a metaphoric artwork about moving from the present through a symbolic portal to the future. So, space--foreground, middle ground, and…
IRM Concepts: Building Blocks for the 1990s.
ERIC Educational Resources Information Center
Owen, Darrell E.
1989-01-01
Presents a conceptual overview of information resources management (IRM) by synthesizing concepts put forward during the 1980s and charts opportunities to move these concepts into practice. It is argued that the reorganization required by IRM is justified by better use of resources, better decision making, and an improved corporate structure. (21…
Code of Federal Regulations, 2014 CFR
2014-10-01
..., or special instructions. (b) After August 1, 1977, each railroad must have in effect an operating..., within yard limits must move prepared to stop within one-half the range of vision but not exceeding 20 m.p.h. unless the main track is known to be clear by block signal indications. (3) Within yard limits...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., or special instructions. (b) After August 1, 1977, each railroad must have in effect an operating..., within yard limits must move prepared to stop within one-half the range of vision but not exceeding 20 m.p.h. unless the main track is known to be clear by block signal indications. (3) Within yard limits...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., or special instructions. (b) After August 1, 1977, each railroad must have in effect an operating..., within yard limits must move prepared to stop within one-half the range of vision but not exceeding 20 m.p.h. unless the main track is known to be clear by block signal indications. (3) Within yard limits...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., or special instructions. (b) After August 1, 1977, each railroad must have in effect an operating..., within yard limits must move prepared to stop within one-half the range of vision but not exceeding 20 m.p.h. unless the main track is known to be clear by block signal indications. (3) Within yard limits...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., or special instructions. (b) After August 1, 1977, each railroad must have in effect an operating..., within yard limits must move prepared to stop within one-half the range of vision but not exceeding 20 m.p.h. unless the main track is known to be clear by block signal indications. (3) Within yard limits...
A Comparison of the Language Features of Basic and HyperCard.
ERIC Educational Resources Information Center
Henry, M. J.; Southerly, T. W.
This paper examines the structure of the Applesoft BASIC programming language and the Macintosh authoring language, HyperCard, and scrutinizes the language structures as the building blocks for moving along a chain of cognitive outcomes that culminates in the acquisition of problem solving skills which allow the programmer to learn new formal…
Jafari, Masoumeh; Salimifard, Maryam; Dehghani, Maryam
2014-07-01
This paper presents an efficient method for identification of nonlinear Multi-Input Multi-Output (MIMO) systems in the presence of colored noises. The method studies the multivariable nonlinear Hammerstein and Wiener models, in which the nonlinear memory-less block is approximated based on arbitrary vector-based basis functions. The linear time-invariant (LTI) block is modeled by an autoregressive moving average with exogenous input (ARMAX) model, which can effectively describe the moving average noises as well as the autoregressive and the exogenous dynamics. According to the multivariable nature of the system, a pseudo-linear-in-the-parameter model is obtained which includes two different kinds of unknown parameters, a vector and a matrix. Therefore, the standard least squares algorithm cannot be applied directly. To overcome this problem, a Hierarchical Least Squares Iterative (HLSI) algorithm is used to simultaneously estimate the vector and the matrix of unknown parameters as well as the noises. The efficiency of the proposed identification approach is investigated through three nonlinear MIMO case studies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hong; Kong, Vic; Ren, Lei
2016-01-15
Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least squares regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench-top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting method) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the Catphan phantom were mostly recovered according to visual evaluation. The scatter related artifacts, such as cupping artifacts, were almost completely removed. Conclusions: The IPSF-SMOG approach is promising in reducing scatter artifacts and improving image quality while reducing radiation dose.
An electrostatic Particle-In-Cell code on multi-block structured meshes
NASA Astrophysics Data System (ADS)
Meierbachtol, Collin S.; Svyatskiy, Daniil; Delzanno, Gian Luca; Vernon, Louis J.; Moulton, J. David
2017-12-01
We present an electrostatic Particle-In-Cell (PIC) code on multi-block, locally structured, curvilinear meshes called Curvilinear PIC (CPIC). Multi-block meshes are essential to capture complex geometries accurately and with good mesh quality, something that would not be possible with single-block structured meshes that are often used in PIC and for which CPIC was initially developed. Despite the structured nature of the individual blocks, multi-block meshes resemble unstructured meshes in a global sense and introduce several new challenges, such as the presence of discontinuities in the mesh properties and coordinate orientation changes across adjacent blocks, and polyjunction points where an arbitrary number of blocks meet. In CPIC, these challenges have been met by an approach that features: (1) a curvilinear formulation of the PIC method: each mesh block is mapped from the physical space, where the mesh is curvilinear and arbitrarily distorted, to the logical space, where the mesh is uniform and Cartesian on the unit cube; (2) a mimetic discretization of Poisson's equation suitable for multi-block meshes; and (3) a hybrid (logical-space position/physical-space velocity), asynchronous particle mover that mitigates the performance degradation created by the necessity to track particles as they move across blocks. The numerical accuracy of CPIC was verified using two standard plasma-material interaction tests, which demonstrate good agreement with the corresponding analytic solutions. Compared to PIC codes on unstructured meshes, which have also been used for their flexibility in handling complex geometries but whose performance suffers from issues associated with data locality and indirect data access patterns, PIC codes on multi-block structured meshes may offer the best compromise for capturing complex geometries while also maintaining solution accuracy and computational efficiency.
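The physical-to-logical mapping in feature (1) can be illustrated, for the simplest case, by a bilinear map from the logical unit square to a single quadrilateral block. CPIC's actual curvilinear mappings are more general; the function and block below are a hypothetical sketch:

```python
import numpy as np

def logical_to_physical(corners, xi, eta):
    """Bilinear map from logical coordinates (xi, eta) in [0, 1]^2 to a
    physical quadrilateral block given by its four corner points, listed
    counter-clockwise starting from the (xi, eta) = (0, 0) corner."""
    p00, p10, p11, p01 = (np.asarray(c, dtype=float) for c in corners)
    return ((1 - xi) * (1 - eta) * p00 + xi * (1 - eta) * p10
            + xi * eta * p11 + (1 - xi) * eta * p01)

# A distorted quadrilateral block in physical space (illustrative).
block = [(0.0, 0.0), (2.0, 0.2), (2.3, 1.8), (-0.1, 1.5)]

# The logical-space center (0.5, 0.5) maps to the average of the corners.
center = logical_to_physical(block, 0.5, 0.5)
print(center)
```

Particles tracked in logical space see a uniform Cartesian mesh on each block, which is what makes the hybrid logical-position/physical-velocity mover of feature (3) convenient.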
An electrostatic Particle-In-Cell code on multi-block structured meshes
Meierbachtol, Collin S.; Svyatskiy, Daniil; Delzanno, Gian Luca; ...
2017-09-14
We present an electrostatic Particle-In-Cell (PIC) code on multi-block, locally structured, curvilinear meshes called Curvilinear PIC (CPIC). Multi-block meshes are essential to capture complex geometries accurately and with good mesh quality, something that would not be possible with single-block structured meshes that are often used in PIC and for which CPIC was initially developed. In spite of the structured nature of the individual blocks, multi-block meshes resemble unstructured meshes in a global sense and introduce several new challenges, such as the presence of discontinuities in the mesh properties and coordinate orientation changes across adjacent blocks, and polyjunction points where an arbitrary number of blocks meet. In CPIC, these challenges have been met by an approach that features: (1) a curvilinear formulation of the PIC method: each mesh block is mapped from the physical space, where the mesh is curvilinear and arbitrarily distorted, to the logical space, where the mesh is uniform and Cartesian on the unit cube; (2) a mimetic discretization of Poisson's equation suitable for multi-block meshes; and (3) a hybrid (logical-space position/physical-space velocity), asynchronous particle mover that mitigates the performance degradation created by the necessity to track particles as they move across blocks. The numerical accuracy of CPIC was verified using two standard plasma–material interaction tests, which demonstrate good agreement with the corresponding analytic solutions. Compared to PIC codes on unstructured meshes, which have also been used for their flexibility in handling complex geometries but whose performance suffers from issues associated with data locality and indirect data access patterns, PIC codes on multi-block structured meshes may offer the best compromise for capturing complex geometries while also maintaining solution accuracy and computational efficiency.
Regional melt-pond fraction and albedo of thin Arctic first-year drift ice in late summer
NASA Astrophysics Data System (ADS)
Divine, D. V.; Granskog, M. A.; Hudson, S. R.; Pedersen, C. A.; Karlsen, T. I.; Divina, S. A.; Renner, A. H. H.; Gerland, S.
2015-02-01
The paper presents a case study of the regional (≈150 km) morphological and optical properties of a relatively thin, 70-90 cm modal thickness, first-year Arctic sea ice pack in an advanced stage of melt. The study combines in situ broadband albedo measurements representative of the four main surface types (bare ice, dark melt ponds, bright melt ponds and open water) and images acquired by a helicopter-borne camera system during ice-survey flights. The data were collected during the 8-day ICE12 drift experiment carried out by the Norwegian Polar Institute in the Arctic, north of Svalbard at 82.3° N, from 26 July to 3 August 2012. A set of > 10 000 classified images covering about 28 km2 revealed a homogeneous melt across the study area with melt-pond coverage of ≈ 0.29 and open-water fraction of ≈ 0.11. A decrease in pond fractions observed in the 30 km marginal ice zone (MIZ) occurred in parallel with an increase in open-water coverage. The moving block bootstrap technique applied to sequences of classified sea-ice images and albedo of the four surface types yielded a regional albedo estimate of 0.37 (0.35; 0.40) and regional sea-ice albedo of 0.44 (0.42; 0.46). Random sampling from the set of classified images allowed assessment of the aggregate scale of at least 0.7 km2 for the study area. For the current setup configuration it implies a minimum set of 300 images to process in order to gain adequate statistics on the state of the ice cover. Variance analysis also emphasized the importance of longer series of in situ albedo measurements conducted for each surface type when performing regional upscaling. The uncertainty in the mean estimates of surface type albedo from in situ measurements contributed up to 95% of the variance of the estimated regional albedo, with the remaining variance resulting from the spatial inhomogeneity of sea-ice cover.
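The moving block bootstrap used for the regional estimates can be sketched on a synthetic autocorrelated sequence standing in for per-image pond fractions; the AR(1) series, its length, and the block length below are illustrative assumptions:

```python
import numpy as np

def moving_block_bootstrap(series, block_len, n_boot, rng):
    """Resample a series by concatenating randomly chosen overlapping blocks,
    preserving short-range autocorrelation within each block."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts_max = n - block_len + 1           # admissible block start indices
    stats = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, starts_max, size=n_blocks)
        sample = np.concatenate([series[s:s + block_len] for s in starts])[:n]
        stats[b] = sample.mean()
    return stats

# Synthetic AR(1) sequence of per-image melt-pond fractions around 0.29.
rng = np.random.default_rng(1)
x = np.empty(500)
x[0] = 0.29
for t in range(1, 500):
    x[t] = 0.29 + 0.8 * (x[t - 1] - 0.29) + rng.normal(0, 0.02)

boot = moving_block_bootstrap(x, block_len=25, n_boot=2000, rng=rng)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean pond fraction: {x.mean():.3f}  95% CI: ({lo:.3f}, {hi:.3f})")
```

Because whole blocks are resampled rather than individual images, the interval reflects the spatial correlation between neighbouring images, which an ordinary i.i.d. bootstrap would understate.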
Preventing messaging queue deadlocks in a DMA environment
Blocksome, Michael A; Chen, Dong; Gooding, Thomas; Heidelberger, Philip; Parker, Jeff
2014-01-14
Embodiments of the invention may be used to manage message queues in a parallel computing environment to prevent message queue deadlock. A direct memory access (DMA) controller of a compute node may determine when a messaging queue is full. In response, the DMA may generate an interrupt. An interrupt handler may stop the DMA and swap all descriptors from the full messaging queue into a larger queue (or enlarge the original queue). The interrupt handler then restarts the DMA. Alternatively, the interrupt handler stops the DMA, allocates a memory block to hold queue data, and then moves descriptors from the full messaging queue into the allocated memory block. The interrupt handler then restarts the DMA. During a normal messaging advance cycle, a messaging manager attempts to inject the descriptors in the memory block into other messaging queues until the descriptors have all been processed.
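The "enlarge the original queue" variant can be caricatured in a few lines. The class and method names are invented for illustration; the real mechanism involves DMA hardware descriptors and interrupt handlers, not Python objects:

```python
from collections import deque

class InjectionFifo:
    """Toy sketch of the queue-growth idea: when a fixed-capacity descriptor
    queue fills, a 'full' interrupt enlarges it so producers never deadlock
    waiting for space. All names here are illustrative, not from the patent."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()

    def inject(self, descriptor):
        if len(self.queue) >= self.capacity:
            self._on_full_interrupt()
        self.queue.append(descriptor)

    def _on_full_interrupt(self):
        # Stand-in for: stop the DMA, enlarge the queue, restart the DMA.
        self.capacity *= 2

fifo = InjectionFifo(capacity=4)
for d in range(10):
    fifo.inject(d)            # grows the queue instead of blocking on "full"
print(len(fifo.queue), fifo.capacity)
```

The essential property is that `inject` never blocks: the capacity check plays the role of the DMA's queue-full condition, and the handler trades memory for forward progress.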
NASA Astrophysics Data System (ADS)
Sanny, Teuku A.
2017-07-01
The objective of this study is to determine the boundary between the Lembang and Cimandiri faults and to characterize the surrounding area. For the detailed study we used three methodologies: (1) surface deformation modeling using the Boundary Element Method (BEM), (2) Controlled Source Audio-Magnetotellurics (CSAMT), and (3) refraction seismic tomography. Based on the surface deformation modeling with BEM, the Lembang fault has a dominant displacement in the east direction. The eastward displacement at the northern fault block is smaller than that at the southern fault block, which indicates that the fault blocks move left-laterally relative to each other. From this we know that the Lembang fault in this area has a left-lateral strike-slip component. The western part of the Lembang fault moves in the west direction, unlike the eastern part, which moves in the east direction. The stress distribution map of the Lembang fault shows a difference between its eastern and western segments. Displacement distribution maps along the x- and y-directions of the Lembang fault show a lineament oriented in the northeast-southwest direction, directly on Tangkuban Perahu Mountain. The displacement pattern of the Cimandiri fault indicates that it is divided into two segments: the eastern segment has a left-lateral strike-slip component, while the western segment has a right-lateral strike-slip component. Based on the displacement distribution map along the y-direction, a lineament oriented in the northwest-southeast direction is observed at the western segment of the Cimandiri fault. The displacement along the x- and y-directions between the Lembang and Cimandiri faults is nearly zero, indicating that the two faults are not connected to each other. Refraction seismic tomography characterizes the Cimandiri fault as a normal fault. Based on the CSAMT method, the Lembang fault is a normal fault whose differing dips form a graben structure.
Bootstrapping and Maintaining Trust in the Cloud
2016-03-16
of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web ...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data
Reduced Power Laser Designation Systems
2008-06-20
200KΩ, Ri = 60KΩ, and R2 = R4 = 2KΩ yields an overall transimpedance gain of 200K x 30 x 30 = 180MV/A. Figure 3. Three stage photodiode amplifier ...transistor circuit for bootstrap buffering of the input stage, comparing the noise performance of the candidate amplifier designs, selecting the two...transistor bootstrap design as the circuit of choice, and comparing the performance of this circuit against that of a basic transconductance amplifier
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow identification of environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results show that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
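The bootstrap-based probabilistic sensitivity analysis the abstract describes can be sketched in a few lines. This is not the authors' model: the patient-level (cost, QALY) numbers and the `bootstrap_icers` function are invented for illustration. Resampling patient pairs with replacement propagates sampling uncertainty into the incremental cost-effectiveness ratio (ICER) without assuming a theoretical distribution.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def bootstrap_icers(control, treatment, n_boot=2000, seed=1):
    """Probabilistic sensitivity analysis sketch: resample patient-level
    (cost, effect) pairs with replacement and recompute the ICER for each
    bootstrap replicate."""
    rng = random.Random(seed)
    icers = []
    for _ in range(n_boot):
        c = [rng.choice(control) for _ in control]
        t = [rng.choice(treatment) for _ in treatment]
        d_cost = mean([p[0] for p in t]) - mean([p[0] for p in c])
        d_eff = mean([p[1] for p in t]) - mean([p[1] for p in c])
        if d_eff > 0:
            icers.append(d_cost / d_eff)
    return sorted(icers)

# hypothetical per-patient (cost, QALY) pairs, purely illustrative
control = [(900, 0.68), (1000, 0.70), (1100, 0.72), (950, 0.69), (1050, 0.71)]
treatment = [(1400, 0.78), (1500, 0.80), (1600, 0.82), (1450, 0.79), (1550, 0.81)]
icers = bootstrap_icers(control, treatment)
median_icer = icers[len(icers) // 2]
```

The sorted replicate ICERs support percentile intervals and acceptability curves, the outputs the abstract argues should become standard.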
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R; Fedonkin, M A
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in the development of computers, the first generation making possible the calculations necessary for constructing the second one, etc. In the biogenetic upstart of an energy flow, non-metals in the lower periods of the Periodic Table of Elements would have constituted the most primitive systems, their operation being enhanced and later supplanted by elements in the higher periods that demand more energy. This bootstrapping process would put the development of the metabolisms based on the second period elements carbon, nitrogen and oxygen at the end of the evolutionary process rather than at, or even before, the biogenetic event.
Research on target tracking in coal mine based on optical flow method
NASA Astrophysics Data System (ADS)
Xue, Hongye; Xiao, Qingwei
2015-03-01
To recognize, track and count the bolting machines in coal mine video images, a real-time target tracking method based on Lucas-Kanade sparse optical flow is proposed in this paper. In the method, we judge whether the moving target deviates from its trajectory, and predict and correct the target's position. The method solves the problem of failing to track, or losing, the target because of weak light, uneven illumination and occlusion. Using the VC++ platform and the OpenCV library, we implement the recognition and tracking. The validity of the method is verified by the experimental results.
Cost-utility analysis of a preventive home visit program for older adults in Germany.
Brettschneider, Christian; Luck, Tobias; Fleischer, Steffen; Roling, Gudrun; Beutner, Katrin; Luppa, Melanie; Behrens, Johann; Riedel-Heller, Steffi G; König, Hans-Helmut
2015-04-03
Most older adults want to live independently in a familiar environment instead of moving to a nursing home. Preventive home visits based on multidimensional geriatric assessment can be one strategy to support this preference and might additionally reduce health care costs, due to the avoidance of costly nursing home admissions. The purpose of this study was to analyse the cost-effectiveness of preventive home visits from a societal perspective in Germany. This study is part of a multi-centre, non-blinded, randomised controlled trial aiming at the reduction of nursing home admissions. Participants were older than 80 years and living at home. Up to three home visits were conducted to identify self-care deficits and risk factors, to present recommendations and to implement solutions. The control group received usual care. A cost-utility analysis using quality-adjusted life years (QALY) based on the EQ-5D was performed. Resource utilization was assessed by means of the interview version of a patient questionnaire. A cost-effectiveness acceptability curve controlled for prognostic variables was constructed and a sensitivity analysis to control for the influence of the mode of QALY calculation was performed. 278 individuals (intervention group: 133; control group: 145) were included in the analysis. During 18 months follow-up mean adjusted total cost (mean: +4,401 EUR; bootstrapped standard error: 3,019.61 EUR) and number of QALY (mean: 0.0061 QALY; bootstrapped standard error: 0.0388 QALY) were higher in the intervention group, but differences were not significant. For preventive home visits the probability of an incremental cost-effectiveness ratio <50,000 EUR per QALY was only 15%. The results were robust with respect to the mode of QALY calculation. The evaluated preventive home visits programme is unlikely to be cost-effective. ClinicalTrials.gov Identifier: NCT00644826.
Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe
2015-07-15
Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. 
All rights reserved.
Chen, Gang; Taylor, Paul A.; Shin, Yong-Wook; Reynolds, Richard C.; Cox, Robert W.
2016-01-01
It has been argued that naturalistic conditions in FMRI studies provide a useful paradigm for investigating perception and cognition through a synchronization measure, inter-subject correlation (ISC). However, one analytical stumbling block has been the fact that the ISC values associated with each single subject are not independent, and our previous paper (Chen et al., 2016) used simulations and analyses of real data to show that the methodologies adopted in the literature do not have the proper control for false positives. In the same paper, we proposed nonparametric subject-wise bootstrapping and permutation testing techniques for one and two groups, respectively, which account for the correlation structure, and these greatly outperformed the prior methods in controlling the false positive rate (FPR); that is, subject-wise bootstrapping (SWB) worked relatively well for both cases with one and two groups, and subject-wise permutation (SWP) testing was virtually ideal for group comparisons. Here we seek to explicate and adopt a parametric approach through linear mixed-effects (LME) modeling for studying the ISC values, building on the previous correlation framework, with the benefit that the LME platform offers wider adaptability, more powerful interpretations, and quality control checking capability than nonparametric methods. We describe both theoretical and practical issues involved in the modeling and the manner in which LME with crossed random effects (CRE) modeling is applied. A data-doubling step further allows us to conveniently track the subject index, and achieve easy implementations. We pit the LME approach against the best nonparametric methods, and find that the LME framework achieves proper control for false positives. The new LME methodologies are shown to be both efficient and robust, and they will be added as an additional option and settings in an existing open source program, 3dLME, in AFNI (http://afni.nimh.nih.gov). PMID:27751943
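The subject-wise bootstrap (SWB) idea discussed above can be sketched minimally. This is illustrative only, not the authors' 3dLME/AFNI implementation: the 4-subject ISC matrix and function are invented. The point is that entry (i, j) of the ISC matrix is a correlation between subjects i and j, so pairs sharing a subject are dependent, and resampling whole subjects rather than individual pairs respects that structure.

```python
import random

def subject_wise_bootstrap(isc, n_boot=2000, seed=0):
    """Subject-wise bootstrap sketch: resample subjects with replacement and
    average the ISC values of the resulting (distinct-subject) pairs."""
    rng = random.Random(seed)
    n = len(isc)
    means = []
    for _ in range(n_boot):
        subs = [rng.randrange(n) for _ in range(n)]
        vals = [isc[a][b] for i, a in enumerate(subs)
                for b in subs[i + 1:] if a != b]
        if vals:  # skip degenerate replicates with a single repeated subject
            means.append(sum(vals) / len(vals))
    means.sort()
    m = len(means)
    return means[m // 2], (means[int(0.025 * m)], means[int(0.975 * m)])

# hypothetical symmetric 4-subject ISC matrix (diagonal unused)
isc = [[1.00, 0.32, 0.28, 0.35],
       [0.32, 1.00, 0.30, 0.27],
       [0.28, 0.30, 1.00, 0.33],
       [0.35, 0.27, 0.33, 1.00]]
median_isc, (lo, hi) = subject_wise_bootstrap(isc)
```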
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inners, J.D.; Sevon, W.D.; Moore, M.E.
1993-03-01
Imposing hilltop rock-cities developed from widely jointed outcrops of Olean conglomerate (Lower Pennsylvanian) create picturesque scenery on the Allegheny High Plateau in Warren Co., Pa. At least six such rock cities 2 to 5 acres in extent are associated with the Late Wisconsinan glacial border in the northern half of the county. Farther to the south, jumbled Olean and Knapp (Lower Mississippian) joint blocks occur on steep slopes below valley-wall cliffs. The rock cities and accumulations of displaced joint blocks are largely relics of Late Wisconsinan periglacial mass-wasting. Frost splitting initiated opening of bedrock joints to form buildings. Gravity, soil wedging, and possibly gelifluction then widened the fissures into streets. Gelifluction moved blocks downslope and oriented their long axes parallel with slope (Warren Rocks). Forward toppling of high, unstable blocks contributed to mass-movement on some steep slopes (Rimrock). Today, rock cities and downslope blocks are stable in areas of gentle (less than 10 percent) slopes, but toppling, solifluction, creep, and debris flows cause continued slow movement of large blocks on moderately steep to steep (greater than 30 percent) slopes. Blocks of Olean and Knapp conglomerate have both stratabound pitting and intricate honeycomb weathering. Deep pitting is controlled largely by variations in silica cementation. Honeycomb weathering is most evident in sandy layers and results from patterns of iron-oxide impregnation. Both are Holocene surface-weathering processes.
Bootstrap Current for the Edge Pedestal Plasma in a Diverted Tokamak Geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, S.; Chang, C. S.; Ku, S.
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with ion contribution and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in plateau-collisional regime, there is significant deviation of numerical results from the existing analytic formulas, mainly due to large effective collisionality of the passing and the boundary layer trapped particles in edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high.
On the other hand, when the aspect ratio is close to unity, the collisional edge bootstrap current can be significantly greater than that from the Sauter formula. Rapid toroidal rotation of the magnetic field lines at the high field side of a tight aspect-ratio tokamak is believed to be the cause of the different behavior. A new analytic fitting formula, as a simple modification to the Sauter formula, is obtained to bring the analytic expression to a better agreement with the edge kinetic simulation results.
Communication and Trust: Change at the Onset of Appointment to the Superintendency
ERIC Educational Resources Information Center
Zepeda, Sally J.; Mayers, R. Stewart
2013-01-01
In this case, a new superintendent has communicated what he perceives as a needed change affecting students and teachers on several campuses and a seemingly short timeline for making the decision to move off a block schedule. The abrupt nature of the announcement, combined with the circumstances surrounding the superintendent's recent hiring has…
Evidence of cue synergism in termite corpse response behavior
Michael D. Ulyshen; Thomas G. Shelton
2012-01-01
Subterranean termites of the genus Reticulitermes are known to build walls and tubes and move considerable amounts of soil into wood but the causes of this behavior remain largely unexplored. In laboratory assays, we tested the hypothesis that Reticulitermes virginicus (Banks) would carry more sand into wooden blocks containing corpses compared to corpse-free controls...
Theories of Intelligence, Learning, and Motivation as a Basic Educational Praxis
ERIC Educational Resources Information Center
Van Hook, Steven R.
2008-01-01
This article begins with an examination of the early building blocks of intelligence and learning through signs and symbols, such as examined by Vygotsky and Freire. Then the inquiry moves into methods of achieving resonance as praxis of learning as expanded on by Freire, and connecting with students by addressing their multiple intelligences as…
ERIC Educational Resources Information Center
Kissel, Julie M.
2017-01-01
The purpose of the study was to understand the institutional forces that constructed and shaped the function, nature of funding, and governance of Washtenaw Community College (WCC). This qualitative, historical case study built on organizational theory used archival research to identify themes and the institutional building blocks for the junior…
Clerget 100 hp heavy-oil engine
NASA Technical Reports Server (NTRS)
Leglise, Pierre
1931-01-01
A complete technical description of the Clerget heavy-oil engine is presented along with the general characteristics. The general characteristics are: 9 cylinders, bore 120 mm, stroke 130 mm, four-stroke cycle engine, rated power limited to 100 hp at 1800 rpm; weight 228 kg; propeller with direct drive and air cooling. Moving parts, engine block, and lubrication are all presented.
Kuiken, T A; Dumanian, G A; Lipschutz, R D; Miller, L A; Stubblefield, K A
2004-12-01
A novel method for the control of a myoelectric upper limb prosthesis was achieved in a patient with bilateral amputations at the shoulder disarticulation level. Four independently controlled nerve-muscle units were created by surgically anastomosing residual brachial plexus nerves to dissected and divided aspects of the pectoralis major and minor muscles. The musculocutaneous nerve was anastomosed to the upper pectoralis major; the median nerve was transferred to the middle pectoralis major region; the radial nerve was anastomosed to the lower pectoralis major region; and the ulnar nerve was transferred to the pectoralis minor muscle which was moved out to the lateral chest wall. After five months, three nerve-muscle units were successful (the musculocutaneous, median and radial nerves) in that a contraction could be seen, felt and a surface electromyogram (EMG) could be recorded. Sensory reinnervation also occurred on the chest in an area where the subcutaneous fat was removed. The patient was fitted with a new myoelectric prosthesis using the targeted muscle reinnervation. The patient could simultaneously control two degrees-of-freedom with the experimental prosthesis, the elbow and either the terminal device or wrist. Objective testing showed a doubling of blocks moved with a box and blocks test and a 26% increase in speed with a clothes pin moving test. Subjectively the patient clearly preferred the new prosthesis. He reported that it was easier and faster to use, and felt more natural.
Collective strategy for obstacle navigation during cooperative transport by ants.
McCreery, Helen F; Dix, Zachary A; Breed, Michael D; Nagpal, Radhika
2016-11-01
Group cohesion and consensus have primarily been studied in the context of discrete decisions, but some group tasks require making serial decisions that build on one another. We examine such collective problem solving by studying obstacle navigation during cooperative transport in ants. In cooperative transport, ants work together to move a large object back to their nest. We blocked cooperative transport groups of Paratrechina longicornis with obstacles of varying complexity, analyzing groups' trajectories to infer what kind of strategy the ants employed. Simple strategies require little information, but more challenging, robust strategies succeed with a wider range of obstacles. We found that transport groups use a stochastic strategy that leads to efficient navigation around simple obstacles, and still succeeds at difficult obstacles. While groups navigating obstacles preferentially move directly toward the nest, they change their behavior over time; the longer the ants are obstructed, the more likely they are to move away from the nest. This increases the chance of finding a path around the obstacle. Groups rapidly changed directions and rarely stalled during navigation, indicating that these ants maintain consensus even when the nest direction is blocked. Although some decisions were aided by the arrival of new ants, at many key points, direction changes were initiated within the group, with no apparent external cause. This ant species is highly effective at navigating complex environments, and implements a flexible strategy that works for both simple and more complex obstacles. © 2016. Published by The Company of Biologists Ltd.
Seasonal comparisons of sea ice concentration estimates derived from SSM/I, OKEAN, and RADARSAT data
Belchansky, Gennady I.; Douglas, David C.
2002-01-01
The Special Sensor Microwave Imager (SSM/I) microwave satellite radiometer and its predecessor SMMR are primary sources of information for global sea ice and climate studies. However, comparisons of SSM/I, Landsat, AVHRR, and ERS-1 synthetic aperture radar (SAR) have shown substantial seasonal and regional differences in their estimates of sea ice concentration. To evaluate these differences, we compared SSM/I estimates of sea ice coverage derived with the NASA Team and Bootstrap algorithms to estimates made using RADARSAT, and OKEAN-01 satellite sensor data. The study area included the Barents Sea, Kara Sea, Laptev Sea, and adjacent parts of the Arctic Ocean, during October 1995 through October 1999. Ice concentration estimates from spatially and temporally near-coincident imagery were calculated using independent algorithms for each sensor type. The OKEAN algorithm implemented the satellite's two-channel active (radar) and passive microwave data in a linear mixture model based on the measured values of brightness temperature and radar backscatter. The RADARSAT algorithm utilized a segmentation approach of the measured radar backscatter, and the SSM/I ice concentrations were derived at National Snow and Ice Data Center (NSIDC) using the NASA Team and Bootstrap algorithms. Seasonal and monthly differences between SSM/I, OKEAN, and RADARSAT ice concentrations were calculated and compared. Overall, total sea ice concentration estimates derived independently from near-coincident RADARSAT, OKEAN-01, and SSM/I satellite imagery demonstrated mean differences of less than 5.5% (S.D.<9.5%) during the winter period. Differences between the SSM/I NASA Team and the SSM/I Bootstrap concentrations were no more than 3.1% (S.D.<5.4%) during this period. RADARSAT and OKEAN-01 data both yielded higher total ice concentrations than the NASA Team and the Bootstrap algorithms. The Bootstrap algorithm yielded higher total ice concentrations than the NASA Team algorithm. 
Total ice concentrations derived from OKEAN-01 and SSM/I satellite imagery were highly correlated during winter, spring, and fall, with mean differences of less than 8.1% (S.D.<15%) for the NASA Team algorithm, and less than 2.8% (S.D.<13.8%) for the Bootstrap algorithm. Respective differences between SSM/I NASA Team and SSM/I Bootstrap total concentrations were less than 5.3% (S.D.<6.9%). Monthly mean differences between SSM/I and OKEAN differed annually by less than 6%, with smaller differences primarily in winter. The NASA Team and Bootstrap algorithms underestimated the total sea ice concentrations relative to the RADARSAT ScanSAR no more than 3.0% (S.D.<9%) and 1.2% (S.D.<7.5%) during cold months, and no more than 12% and 7% during summer, respectively. ScanSAR tended to estimate higher ice concentrations for ice concentrations greater than 50%, when compared to SSM/I during all months. ScanSAR underestimated total sea ice concentration by 2% compared to the OKEAN-01 algorithm during cold months, and gave an overestimation by 2% during spring and summer months. Total NASA Team and Bootstrap sea ice concentration estimates derived from coincident SSM/I and OKEAN-01 data demonstrated mean differences of no more than 5.3% (S.D.<7%), 3.1% (S.D.<5.5%), 2.0% (S.D.<5.5%), and 7.3% (S.D.<10%) for fall, winter, spring, and summer periods, respectively. Large disagreements were observed between the OKEAN and NASA Team results in spring and summer for estimates of the first-year (FY) and multiyear (MY) age classes. The OKEAN-01 algorithm and data tended to estimate, on average, lower concentrations of young or FY ice and higher concentrations of total and MY ice for all months and seasons. Our results contribute to the growing body of documentation about the levels of disparity obtained when seasonal sea ice concentrations are estimated using various types of satellite data and algorithms.
A cluster bootstrap for two-loop MHV amplitudes
Golden, John; Spradlin, Marcus
2015-02-02
We apply a bootstrap procedure to two-loop MHV amplitudes in planar N=4 super-Yang-Mills theory. We argue that the mathematically most complicated part (the Λ²B₂ coproduct component) of the n-particle amplitude is uniquely determined by a simple cluster algebra property together with a few physical constraints (dihedral symmetry, analytic structure, supersymmetry, and well-defined collinear limits). Finally, we present a concise, closed-form expression which manifests these properties for all n.
Wrappers for Performance Enhancement and Oblivious Decision Graphs
1995-09-01
always select all relevant features. We test different search engines to search the space of feature subsets and introduce compound operators to speed...distinct instances from the original dataset appearing in the test set is thus 0.632m. The ε0i accuracy estimate is derived by using bootstrap sample...i for training and the rest of the instances for testing. Given a number b, the number of bootstrap samples, let ε0i be the accuracy estimate for
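The .632 bootstrap accuracy estimate referenced in the snippet can be sketched as follows. This is illustrative only: the toy 1-nearest-neighbour classifier and dataset are invented, not the report's wrapper system. The estimator blends the (optimistic) resubstitution accuracy with the (pessimistic) accuracy on instances left out of each bootstrap sample, weighted 0.368/0.632.

```python
import random

def one_nn(train, x):
    """Classify x by the label of the nearest training value (toy 1-NN)."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

def accuracy_632(data, n_boot=100, seed=0):
    """Efron's .632 estimator: 0.368 * resubstitution accuracy plus
    0.632 * mean accuracy on instances absent from each bootstrap sample."""
    rng = random.Random(seed)
    resub = sum(one_nn(data, x) == y for x, y in data) / len(data)
    oob = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in data]
        held_out = [d for d in data if d not in sample]
        if held_out:
            oob.append(sum(one_nn(sample, x) == y for x, y in held_out)
                       / len(held_out))
    eps0 = sum(oob) / len(oob)
    return 0.368 * resub + 0.632 * eps0

# toy, well-separated one-dimensional dataset (illustrative)
data = [(0.1, "a"), (0.2, "a"), (0.3, "a"), (0.4, "a"),
        (1.1, "b"), (1.2, "b"), (1.3, "b"), (1.4, "b")]
est = accuracy_632(data)
```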
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
NASA Technical Reports Server (NTRS)
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-01-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify. In many studies the impact of such measurement errors is overlooked. In this study we present a new approach to estimating measurement errors in the basic attributes of CMEs. It is a computer-intensive approach because it requires repeating the original data analysis procedure several times using replicate datasets; this is commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are in the vast majority small and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs, they are larger than the acceleration itself.
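Bootstrapping a velocity error out of a height-time plot can be sketched as below. The data points are invented (not from the SOHO/LASCO catalog), and the functions are illustrative: each replicate resamples the measured (time, height) points with replacement and refits a line, and the spread of refitted slopes estimates the error in the derived linear velocity.

```python
import random

def fit_velocity(times, heights):
    """Least-squares slope of a height-time plot (the CME linear velocity)."""
    n = len(times)
    tm, hm = sum(times) / n, sum(heights) / n
    num = sum((t - tm) * (h - hm) for t, h in zip(times, heights))
    den = sum((t - tm) ** 2 for t in times)
    return num / den

def bootstrap_velocity(times, heights, n_boot=1000, seed=0):
    """Resample height-time points with replacement and refit; the standard
    deviation of the refitted slopes estimates the velocity error."""
    rng = random.Random(seed)
    pts = list(zip(times, heights))
    slopes = []
    for _ in range(n_boot):
        s = [rng.choice(pts) for _ in pts]
        ts = [p[0] for p in s]
        if len(set(ts)) > 1:  # need two distinct times to fit a slope
            slopes.append(fit_velocity(ts, [p[1] for p in s]))
    m = sum(slopes) / len(slopes)
    sd = (sum((x - m) ** 2 for x in slopes) / (len(slopes) - 1)) ** 0.5
    return fit_velocity(times, heights), sd

# hypothetical height-time measurements: hours vs. height in solar radii
times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
heights = [2.0, 4.1, 5.9, 8.2, 10.0, 12.1, 13.9]
velocity, velocity_err = bootstrap_velocity(times, heights)
```

More height-time points per event shrink `velocity_err`, matching the abstract's observation that velocity errors depend mostly on the number of measured points.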
Theodoratou, Evropi; Farrington, Susan M; Tenesa, Albert; McNeill, Geraldine; Cetnarskyj, Roseanne; Korakakis, Emmanouil; Din, Farhat V N; Porteous, Mary E; Dunlop, Malcolm G; Campbell, Harry
2014-01-01
Colorectal cancer (CRC) accounts for 9.7% of all cancer cases and for 8% of all cancer-related deaths. Established risk factors include personal or family history of CRC as well as lifestyle and dietary factors. We investigated the relationship between CRC and demographic, lifestyle, food and nutrient risk factors through a case-control study that included 2062 patients and 2776 controls from Scotland. Forward and backward stepwise regression was applied and the stability of the models was assessed in 1000 bootstrap samples. The variables that were automatically selected to be included by the forward or backward stepwise regression and whose selection was verified by bootstrap sampling in the current study were family history, dietary energy, 'high-energy snack foods', eggs, juice, sugar-sweetened beverages and white fish (associated with an increased CRC risk) and NSAIDs, coffee and magnesium (associated with a decreased CRC risk). Application of forward and backward stepwise regression in this CRC study identified some already established as well as some novel potential risk factors. Bootstrap findings suggest that examination of the stability of regression models by bootstrap sampling is useful in the interpretation of study findings. 'High-energy snack foods' and high-energy drinks (including sugar-sweetened beverages and fruit juices) as risk factors for CRC have not been reported previously and merit further investigation as such snacks and beverages are important contributors in European and North American diets.
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
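The summary-histogram test above can be sketched as follows. This is a hedged illustration, not the authors' implementation: it assumes individual histograms are stored as equal-length bin-count lists, uses only the Euclidean distance (the statistic the study found most suitable), and builds the null distribution by resampling individual histograms from the pooled set.

```python
import math
import random

def euclidean(p, q):
    """Euclidean distance between two normalized histograms."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def summary_hist(hists):
    """Sum individual histograms bin-wise, then normalize to a density."""
    total = [sum(col) for col in zip(*hists)]
    s = sum(total)
    return [x / s for x in total]

def bootstrap_hist_test(hists_a, hists_b, n_boot=2000, seed=0):
    """Bootstrap significance level for the distance between two summary
    histograms: resample individual histograms from the pooled set to build
    the null distribution of the distance statistic. Returns the observed
    distance and the fraction of resampled distances at least as large."""
    rng = random.Random(seed)
    observed = euclidean(summary_hist(hists_a), summary_hist(hists_b))
    pooled = hists_a + hists_b
    count = 0
    for _ in range(n_boot):
        sa = [rng.choice(pooled) for _ in range(len(hists_a))]
        sb = [rng.choice(pooled) for _ in range(len(hists_b))]
        if euclidean(summary_hist(sa), summary_hist(sb)) >= observed:
            count += 1
    return observed, count / n_boot
```

Swapping `euclidean` for a Jeffries-Matusita or Kuiper distance function changes only the statistic, which is the comparison the study performs.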
Image analysis of representative food structures: application of the bootstrap method.
Ramírez, Cristian; Germain, Juan C; Aguilera, José M
2009-08-01
Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of bootstrap samples taken as sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between the CV(image) and CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of whole image) became necessary. This suggests that a large dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
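The sub-area bootstrap can be sketched as below. This is an assumed minimal version (hypothetical function name, binary list-of-lists image, element "count" as the measured quantity instead of full morphometry): random square sub-areas are drawn, and the coefficient of variation and standard error of the measurements across samples indicate whether the sampling area is large enough.

```python
import random
import statistics

def subarea_cv(image, size, n_samples=200, seed=0):
    """CV(Bn) and SE(Bn) of element counts over random square sub-areas
    of side `size` drawn from a 2-D binary image (list of lists of 0/1)."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    counts = []
    for _ in range(n_samples):
        r = rng.randrange(h - size + 1)  # top-left corner of the sub-area
        c = rng.randrange(w - size + 1)
        counts.append(sum(image[r + i][c + j]
                          for i in range(size) for j in range(size)))
    mean = statistics.mean(counts)
    sd = statistics.stdev(counts)
    return sd / mean, sd / (n_samples ** 0.5)
```

Repeating the call with increasing `size` reproduces the study's core observation: CV(Bn) and SE(Bn) fall as the sampling area grows.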
A simple theory of molecular organization in fullerene-containing liquid crystals
NASA Astrophysics Data System (ADS)
Peroukidis, S. D.; Vanakaras, A. G.; Photinos, D. J.
2005-10-01
Systematic efforts to synthesize fullerene-containing liquid crystals have produced a variety of successful model compounds. We present a simple molecular theory, based on the interconverting shape approach [Vanakaras and Photinos, J. Mater. Chem. 15, 2002 (2005)], that relates the self-organization observed in these systems to their molecular structure. The interactions are modeled by dividing each molecule into a number of submolecular blocks to which specific interactions are assigned. Three types of blocks are introduced, corresponding to fullerene units, mesogenic units, and nonmesogenic linkage units. The blocks are constrained to move on a cubic three-dimensional lattice and molecular flexibility is allowed by retaining a number of representative conformations within the block representation of the molecule. Calculations are presented for a variety of molecular architectures including twin mesogenic branch monoadducts of C60, twin dendromesogenic branch monoadducts, and conical (badminton shuttlecock) multiadducts of C60. The dependence of the phase diagrams on the interaction parameters is explored. In spite of its many simplifications and the minimal molecular modeling used (three types of chemically distinct submolecular blocks with only repulsive interactions), the theory accounts remarkably well for the phase behavior of these systems.
NASA Astrophysics Data System (ADS)
Bourke, Mary; Nield, Jo; Diniega, Serina; Hansen, Candy; McElwaine, Jim
2016-04-01
The seasonal sublimation of CO2 ice is an active driver of present-day surface change on Mars. Diniega et al (2013) proposed that a discrete type of Martian gully, found on southern hemisphere dunes, was formed by the movement of CO2 seasonal ice blocks. These 'Linear Gullies' consist primarily of long (100 m - 2.5 km) grooves with near-uniform width (a few to 10 m) and typical depth of <2 m. They are near-linear throughout most of their length but sometimes contain zones of low-to-high sinuosity. They are commonly bounded by levees. The groove is generally prefaced by a small alcove that originates at the dune brink. We present the results of a set of field experiments that were undertaken at the Coral Pink sand dunes, Utah. These are sister experiments to those undertaken in Arizona (Bourke et al, 2016). The experiments were undertaken on an active barchan dune with a 16 m long lee slope (30.3°). Ambient air temperature was 30°C and relative humidity was 25%; sand surface temperatures were 26.5°C. A CO2 ice block (60x205x210 mm) was placed at the dune brink and, with a gentle nudge, it moved downslope. The dynamics of the block movement were recorded using a pair of high resolution video cameras. Geomorphological observations were noted and topographic change was quantified using a Leica P20 terrestrial laser scanner with a resolution of 0.8 mm at 10 m, and change detection limits less than 3 mm. The block run was repeated a total of 10 times and launched from the same location at the dune brink. The experiment ran for 45 minutes. The block size was reduced to 45 x 190 x 195 mm by the end of the run series. The resultant geomorphology shows that the separate block runs occupied different tracks, leading to a triangular plan form with a maximum width of 3.5 m. This is different from the findings in Arizona, where a narrower track span was recorded (1.7 m) (Bourke et al, 2016).
Similar block dynamics were observed at both sites (blocks moved straight, swiveled and bounced downslope). Distinctive pits with arcuate rims on their downslope edge were formed where blocks bounced on the surface. The pits are spaced almost equidistantly. Despite a longer slope (16 m as opposed to 8 m at Grand Falls), no depositional apron was formed. Levee development was less consistent than at the Arizona site, but a pronounced unpaired levee formed towards the base of the lee slope. These data show that sublimating blocks of CO2 ice leave signatures of transport paths and are capable of eroding and transporting sediment. Diniega, S. et al (2013) A new dry hypothesis for the formation of Martian linear gullies. Icarus 225(1), 526-537. Bourke, M.C. et al (2016) The geomorphic effect of sublimating CO2 blocks on dune lee slopes at Grand Falls, Arizona. LPSC
A sparse representation-based approach for copy-move image forgery detection in smooth regions
NASA Astrophysics Data System (ADS)
Abdessamad, Jalila; ElAdel, Asma; Zaied, Mourad
2017-03-01
Copy-move image forgery is the act of cloning a restricted region in the image and pasting it once or multiple times within that same image. This procedure intends to cover a certain feature, probably a person or an object, in the processed image or emphasize it through duplication. Consequences of this malicious operation can be unexpectedly harmful. Hence, the present paper proposes a new approach that automatically detects Copy-move Forgery (CMF). In particular, this work broaches a widely common open issue in CMF research literature that is detecting CMF within smooth areas. Indeed, the proposed approach represents the image blocks as a sparse linear combination of pre-learned bases (a mixture of texture and color-wise small patches) which allows a robust description of smooth patches. The reported experimental results demonstrate the effectiveness of the proposed approach in identifying the forged regions in CM attacks.
Increasing available FIFO space to prevent messaging queue deadlocks in a DMA environment
Blocksome, Michael A [Rochester, MN; Chen, Dong [Croton On Hudson, NY; Gooding, Thomas [Rochester, MN; Heidelberger, Philip [Cortlandt Manor, NY; Parker, Jeff [Rochester, MN
2012-02-07
Embodiments of the invention may be used to manage message queues in a parallel computing environment to prevent message queue deadlock. A direct memory access controller of a compute node may determine when a messaging queue is full. In response, the DMA may generate an interrupt. An interrupt handler may stop the DMA and swap all descriptors from the full messaging queue into a larger queue (or enlarge the original queue). The interrupt handler then restarts the DMA. Alternatively, the interrupt handler stops the DMA, allocates a memory block to hold queue data, and then moves descriptors from the full messaging queue into the allocated memory block. The interrupt handler then restarts the DMA. During a normal messaging advance cycle, a messaging manager attempts to inject the descriptors in the memory block into other messaging queues until the descriptors have all been processed.
Ability of crime, demographic and business data to forecast areas of increased violence.
Bowen, Daniel A; Mercer Kollar, Laura M; Wu, Daniel T; Fraser, David A; Flood, Charles E; Moore, Jasmine C; Mays, Elizabeth W; Sumner, Steven A
2018-05-24
Identifying geographic areas and time periods of increased violence is of considerable importance in prevention planning. This study compared the performance of multiple data sources to prospectively forecast areas of increased interpersonal violence. We used 2011-2014 data from a large metropolitan county on interpersonal violence (homicide, assault, rape and robbery) and forecasted violence at the level of census block-groups and over a one-month moving time window. Inputs to a Random Forest model included historical crime records from the police department, demographic data from the US Census Bureau, and administrative data on licensed businesses. Among 279 block groups, a model utilizing all data sources was found to prospectively improve the identification of the top 5% most violent block-group months (positive predictive value = 52.1%; negative predictive value = 97.5%; sensitivity = 43.4%; specificity = 98.2%). Predictive modelling with simple inputs can help communities more efficiently focus violence prevention resources geographically.
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6% of the variance in glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95% confidence interval. The proposed approach was then used for the online fault diagnosis in the abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95% confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach used only small sample sets from normal fermentation experiments to establish the model, and then required only online-recorded fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for the fault diagnosis in the fermentation process of glutamate with small sample sets.
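The fault-flagging step can be sketched as below. This is a simplified illustration under stated assumptions: the GAM fitting itself is omitted (the `fitted` values and training `residuals` are taken as given), and the 95% band is formed by residual resampling rather than whatever replicate scheme the authors used. All names are hypothetical.

```python
import random

def bootstrap_fault_band(fitted, residuals, n_boot=1000, seed=0):
    """95% bootstrap band around a fitted production curve: resample
    training residuals around each fitted value, take percentile limits."""
    rng = random.Random(seed)
    lows, highs = [], []
    for f in fitted:
        reps = sorted(f + rng.choice(residuals) for _ in range(n_boot))
        lows.append(reps[int(0.025 * n_boot)])
        highs.append(reps[int(0.975 * n_boot)])
    return lows, highs

def diagnose(observed, lows, highs):
    """A fault is flagged wherever the observation falls outside the band."""
    return [not (lo <= y <= hi) for y, lo, hi in zip(observed, lows, highs)]
```

Feeding a stream of online measurements through `diagnose` reproduces the paper's idea of detecting both the onset of a fault and its end, when values re-enter the band.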
Wang, Jung-Han; Abdel-Aty, Mohamed; Wang, Ling
2017-07-01
Many studies have used different methods, for example empirical Bayes before-after methods, to obtain accurate estimates of crash modification factors (CMFs). All of them make different assumptions about the crash count that would have occurred without treatment. Another major assumption is that multiple sites share the same true CMF. Under this assumption, the CMF at an individual intersection is randomly drawn from a normally distributed population of CMFs at all intersections. Since CMFs are non-zero values, the population of all CMFs might not follow a normal distribution, and even if it does, the true mean of CMFs at some intersections may differ from that at others. Therefore, a bootstrap method based on before-after empirical Bayes theory was proposed to estimate CMFs without making distributional assumptions. This bootstrap procedure has the added benefit of producing a measure of CMF stability. Furthermore, based on the bootstrapped CMF, a new CMF precision rating method was proposed to evaluate the reliability of CMFs. This study chose 29 urban four-legged intersections as treated sites, and their controls were changed from stop-controlled to signal-controlled. Meanwhile, 124 urban four-legged stop-controlled intersections were selected as reference sites. At first, different safety performance functions (SPFs) were applied to five crash categories, and it was found that each crash category had a different optimal SPF form. Then, the CMFs of these five crash categories were estimated using the bootstrap empirical Bayes method. The results of the bootstrapped method showed that signalization significantly decreased Angle+Left-Turn crashes, and its CMF had the highest precision. In contrast, the CMF for Rear-End crashes was unreliable. For KABCO, KABC, and KAB crashes, the CMFs were shown to be reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
Crow, Thomas; Cross, Dorthie; Powers, Abigail; Bradley, Bekh
2014-10-01
Abuse and neglect in childhood are well-established risk factors for later psychopathology. Past research has suggested that childhood emotional abuse may be particularly harmful to psychological development. The current cross-sectional study employed multiple regression techniques to assess the effects of childhood trauma on adulthood depression and emotion dysregulation in a large sample of mostly low-income African Americans recruited in an urban hospital. Bootstrap analyses were used to test emotion dysregulation as a potential mediator between emotional abuse in childhood and current depression. Childhood emotional abuse significantly predicted depressive symptoms even when accounting for all other childhood trauma types, and we found support for a complementary mediation of this relationship by emotion dysregulation. Our findings highlight the importance of emotion dysregulation and childhood emotional abuse in relation to adult depression. Moving forward, clinicians should consider the particular importance of emotional abuse in the development of depression, and future research should seek to identify mechanisms through which emotional abuse increases risk for depression and emotion dysregulation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Robust analysis of semiparametric renewal process models
Lin, Feng-Chang; Truong, Young K.; Fine, Jason P.
2013-01-01
A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as in standard intensity models. We consider partial likelihood-based inferences under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identical data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown dependence structure of the gap times. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach. PMID:24550568
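Block bootstrapping, adapted above to the partial likelihood (and recommended in the thesis abstract heading this collection as the Moving Block Bootstrap), can be sketched in its generic form. This is a minimal illustration with hypothetical names, resampling overlapping blocks of a dependent series for an arbitrary statistic rather than the paper's partial-likelihood estimator:

```python
import random

def moving_block_bootstrap(series, block_len, n_boot=500, stat=None, seed=0):
    """Moving block bootstrap: resample overlapping blocks of consecutive
    observations so that short-range dependence within each block is
    preserved, then recompute the statistic on each resampled series."""
    rng = random.Random(seed)
    stat = stat or (lambda xs: sum(xs) / len(xs))  # default: the mean
    n = len(series)
    # All overlapping blocks of length block_len.
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    reps = []
    for _ in range(n_boot):
        sample = []
        while len(sample) < n:
            sample.extend(rng.choice(blocks))
        reps.append(stat(sample[:n]))  # trim to the original length
    return reps
```

The spread of `reps` gives a variance estimate that respects within-block dependence, which is exactly what naive i.i.d. resampling of gap times would miss.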
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra were collected from six different sampling points and analysed in combination with the moving window F test method in order to assess the uniformity of the blending process. The method was validated against changes in citric acid content determined by HPLC. The results of the moving window F test showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common with the moving block standard deviation (MBSD). This method could be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
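The moving window F test's core computation can be sketched as follows. This is an assumed simplification (hypothetical function name, a single scalar measurement per revolution instead of full NIR spectra, and a user-supplied reference variance): each window's sample variance is divided by the homogeneous-blend reference variance, and a ratio above the critical F value, the statistically defined threshold the abstract highlights, flags segregation.

```python
def moving_window_f(values, window, ref_var):
    """For each window of consecutive measurements, compute the F ratio of
    the window variance to a reference (homogeneous-blend) variance."""
    out = []
    for i in range(len(values) - window + 1):
        w = values[i:i + window]
        m = sum(w) / window
        var = sum((x - m) ** 2 for x in w) / (window - 1)  # sample variance
        out.append(var / ref_var)
    return out
```

Comparing each ratio against the critical value of the F distribution for (window-1, reference) degrees of freedom gives the statistically defined threshold, in contrast to the empirical cutoff used by MBSD.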
A bird's eye view of "Understanding volcanoes in the Vanuatu arc"
NASA Astrophysics Data System (ADS)
Vergniolle, S.; Métrich, N.
2016-08-01
The Vanuatu intra-oceanic arc, located between 13 and 22°S in the southwest Pacific Ocean (Fig. 1), is one of the most seismically active regions, with some 39 earthquakes of magnitude 7+ in the past 43 years (Baillard et al., 2015). Active deformation in both the Vanuatu subduction zone and the back-arc North-Fiji basin accommodates the variation of convergence rates, which are ca. 90-120 mm/yr along most of the arc (Taylor et al., 1995; Pelletier et al., 1998). The convergence rate slows to 25-43 mm/yr (Baillard et al., 2015) in the central segment, where the D'Entrecasteaux ridge - an Eocene-Oligocene island arc complex on the Australian subducting plate - collides and is subducted beneath the fore-arc (Taylor et al., 2005). Hence, the Vanuatu arc is segmented into three blocks that move independently: the north block rotates counter-clockwise in association with rapid back-arc spreading (~80 mm/year), the central block translates eastward, and the south block rotates clockwise (Calmant et al., 2003; Bergeot et al., 2009). (See Fig. 1.)
What's your number? The effects of trial order on the one-target advantage.
Bested, Stephen R; Khan, Michael A; Lawrence, Gavin P; Tremblay, Luc
2018-05-01
When moving our upper-limb towards a single target, movement times are typically shorter than when movement to a second target is required. This is known as the one-target advantage. Most studies that have demonstrated the one-target advantage have employed separate trial blocks for the one- and two-segment movements. To test if the presence of the one-target advantage depends on advance knowledge of the number of segments, the present study investigated whether the one-target advantage would emerge under different trial orders/sequences. One- and two-segment responses were organized in blocked (i.e., 1-1-1, 2-2-2), alternating (i.e., 1-2-1-2-1-2), and random (i.e., 1-1-2-1-2-2) trial sequences. Similar to previous studies, where only blocked schedules have typically been utilized, the one-target advantage emerged during the blocked and alternate conditions, but not in the random condition. This finding indicates that the one-target advantage is contingent on participants knowing the number of movement segments prior to stimulus onset. Copyright © 2018 Elsevier B.V. All rights reserved.
A novel resource sharing algorithm based on distributed construction for radiant enclosure problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finzell, Peter; Bryden, Kenneth M.
This study demonstrates a novel approach to solving inverse radiant enclosure problems based on distributed construction. Specifically, the problem of determining the temperature distribution needed on the heater surfaces to achieve a desired design surface temperature profile is recast as a distributed construction problem in which a shared resource, temperature, is distributed by computational agents moving blocks. The sharing of blocks between agents enables them to achieve their desired local state, which in turn achieves the desired global state. Each agent uses the current state of its local environment and a simple set of rules to determine when to exchange blocks, each block representing a discrete unit of temperature change. This algorithm is demonstrated using the established two-dimensional inverse radiation enclosure problem. The temperature profile on the heater surfaces is adjusted to achieve a desired temperature profile on the design surfaces. The resource sharing algorithm was able to determine the needed temperatures on the heater surfaces to obtain the desired temperature distribution on the design surfaces in the nine cases examined.
Method for welding an article and terminating the weldment within the perimeter of the article
NASA Technical Reports Server (NTRS)
Snyder, John H. (Inventor); Smashey, Russell W. (Inventor); Boerger, Eric J. (Inventor); Borne, Bruce L. (Inventor)
2000-01-01
An article is welded, as in weld repair of a defect, by positioning a weld lift-off block at a location on the surface of the article adjacent to the intended location of the end of the weldment on the surface of the article. The weld lift-off block has a wedge shape including a base contacting the surface of the article, and an upper face angled upwardly from the base from a base leading edge. A weld pool is formed on the surface of the article by directly heating the surface of the article using a heat source. The heat source is moved relative to the surface of the article and onto the upper surface of the weld lift-off block by crossing the leading edge of the wedge, without discontinuing the direct heating of the article by the heat source. The heating of the article with the heat source is discontinued only after the heat source is directly heating the upper face of the weld lift-off block, and not the article.
Rosenblatt, Steven David; Crane, Benjamin Thomas
2015-01-01
A moving visual field can induce the feeling of self-motion or vection. Illusory motion from static repeated asymmetric patterns creates a compelling visual motion stimulus, but it is unclear if such illusory motion can induce a feeling of self-motion or alter self-motion perception. In these experiments, human subjects reported the perceived direction of self-motion for sway translation and yaw rotation at the end of a period of viewing set visual stimuli coordinated with varying inertial stimuli. This tested the hypothesis that illusory visual motion would influence self-motion perception in the horizontal plane. Trials were arranged into 5 blocks based on stimulus type: moving star field with yaw rotation, moving star field with sway translation, illusory motion with yaw, illusory motion with sway, and static arrows with sway. Static arrows were used to evaluate the effect of cognitive suggestion on self-motion perception. Each trial had a control condition; the illusory motion controls were altered versions of the experimental image, which removed the illusory motion effect. For the moving visual stimulus, controls were carried out in a dark room. With the arrow visual stimulus, controls were a gray screen. In blocks containing a visual stimulus there was an 8s viewing interval with the inertial stimulus occurring over the final 1s. This allowed measurement of the visual illusion perception using objective methods. When no visual stimulus was present, only the 1s motion stimulus was presented. Eight women and five men (mean age 37) participated. To assess for a shift in self-motion perception, the effect of each visual stimulus on the self-motion stimulus (cm/s) at which subjects were equally likely to report motion in either direction was measured. Significant effects were seen for moving star fields for both translation (p = 0.001) and rotation (p<0.001), and arrows (p = 0.02). 
For the visual motion stimuli, inertial motion perception was shifted in the direction consistent with the visual stimulus. Arrows had a small effect on self-motion perception driven by a minority of subjects. There was no significant effect of illusory motion on self-motion perception for either translation or rotation (p>0.1 for both). Thus, although a true moving visual field can induce self-motion, results of this study show that illusory motion does not.
Forensic Analysis of Digital Image Tampering
2004-12-01
analysis of when each method fails, which Chapter 4 discusses. Finally, a test image containing an invisible watermark using LSB steganography is... Figure 2.2 – Example of invisible watermark using Steganography Software F5; Figure 2.3 – Example of copy-move image forgery [12]; Figure 3.11 – Algorithm for JPEG Block Technique; Figure 3.12 – "Forged" Image with Result
An engine awaits processing in the new engine shop at KSC
NASA Technical Reports Server (NTRS)
1998-01-01
In the Space Shuttle Main Engine Processing Facility (SSMEPF), a new Block 2A engine sits on the transport cradle before being moved to the workstand. The engine is scheduled to fly on the Space Shuttle Endeavour during the STS-88 mission in December 1998. The SSMEPF officially opened on July 6, replacing the Shuttle Main Engine Shop.
Give Teams a Running Start: Take Steps to Build Shared Vision, Trust, and Collaboration Skills
ERIC Educational Resources Information Center
Kise, Jane A. G.
2012-01-01
Consider for a moment how launching a professional learning community is similar to starting a race. Athletes know the danger of false starts--moving before the starting signal. Until recently, a false start meant that all racers returned to the blocks to begin again, their adrenalin gone, their concentration broken. Because these effects could…
Three Methods of Rational Emotive Behavior Therapy That Make My Psychotherapy Effective.
ERIC Educational Resources Information Center
Ellis, Albert
This paper discusses three serious cognitive-emotive errors clients make when they are confronted with situations that block their important goals and how to act against self-defeating errors and move on to greater mental health and self-actualization. Three of the main ways in which clients think, feel, and act against their best interests are:…
Kulhánek, Tomáš; Ježek, Filip; Mateják, Marek; Šilar, Jan; Kofránek, Jiří
2015-08-01
This work presents our experience of teaching modeling and simulation to graduate students in the field of biomedical engineering. We emphasize the acausal, object-oriented modeling technique and have moved from teaching the block-oriented tool MATLAB Simulink to the acausal, object-oriented Modelica language, which can express the structure of the system rather than a process of computation. However, the block-oriented approach is also allowed in the Modelica language, and students have a tendency to express the process of computation. Using exemplar acausal domains and approaches allows students to understand the modeled problems much more deeply. The causality of the computation is derived automatically by the simulation tool.
Blackbody emission from light interacting with an effective moving dispersive medium.
Petev, M; Westerberg, N; Moss, D; Rubino, E; Rimoldi, C; Cacciatori, S L; Belgiorno, F; Faccio, D
2013-07-26
Intense laser pulses excite a nonlinear polarization response that may create an effective flowing medium and, under appropriate conditions, a blocking horizon for light. Here, we analyze in detail the interaction of light with such laser-induced flowing media, fully accounting for the medium dispersion properties. An analytical model based on a first Born approximation is found to be in excellent agreement with numerical simulations based on Maxwell's equations and shows that when a blocking horizon is formed, the stimulated medium scatters light with a blackbody emission spectrum. Based on these results, diamond is proposed as a promising candidate medium for future studies of Hawking emission from artificial, dispersive horizons.
Construction and testing of a Scanning Laser Radar (SLR), phase 2
NASA Technical Reports Server (NTRS)
Flom, T.; Coombes, H. D.
1971-01-01
The scanning laser radar overall system is described. Block diagrams and photographs of the hardware are included with the system description. Detailed descriptions of all the subsystems that make up the scanning laser radar system are included. Block diagrams, photographs, and detailed optical and electronic schematics are used to help describe such subsystem hardware as the laser, beam steerer, receiver optics and detector, control and processing electronics, visual data displays, and the equipment used on the target. Tests were performed on the scanning laser radar to determine its acquisition and tracking performance and to determine its range and angle accuracies while tracking a moving target. The tests and test results are described.
The effects of catecholamines and adrenoceptor blocking drugs on the canine peripheral lymph flow.
De Micheli, P; Glässer, A H
1975-01-01
1. Blood flow through the femoral artery, lymph flow in a lymphatic vessel in the femoral triangle, and metatarsal distal venous pressure were measured simultaneously in a moving canine hind limb. 2. Low intra-arterial doses of adrenaline and noradrenaline increased lymph flow even in the presence of marked arterial vasoconstriction. In contrast, isoprenaline increased arterial blood flow without affecting lymph flow rate. 3. Phenoxybenzamine, dihydroergotoxine, and nicergoline did not inhibit the lymphatic flow increase induced by adrenaline at doses active on arterial or venous vascular alpha-adrenoceptors. 4. Propranolol given intra-arterially to animals pretreated with alpha-adrenoceptor blocking agents restored the vasoconstrictor effect of adrenaline (reversal of adrenaline reversal). PMID:238702
2003-07-18
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the first stage of a Delta II rocket is moved into the mobile service tower. The rocket is being erected to launch the Space InfraRed Telescope Facility (SIRTF). Consisting of an 0.85-meter telescope and three cryogenically cooled science instruments, SIRTF is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
2003-07-18
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the first stage of a Delta II rocket is nearly erect for its move into the mobile service tower. The rocket is being erected to launch the Space InfraRed Telescope Facility (SIRTF). Consisting of an 0.85-meter telescope and three cryogenically cooled science instruments, SIRTF is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
Lithospheric Decoupling and Rotations: Hints from Ethiopian Rift
NASA Astrophysics Data System (ADS)
Muluneh, A. A.; Cuffaro, M.; Doglioni, C.; Kidane, T.
2014-12-01
Plates move relative to the mantle because some torques are acting on them. The shear in the low-velocity zone (LVZ) at the base of the lithosphere is the expression of these torques. The decoupling is allowed by the low viscosity in the LVZ, which is likely a few orders of magnitude lower than previously estimated. The viscosity value in the LVZ controls the degree of coupling/decoupling between the lithosphere and the underlying mantle. Lateral variations in viscosity within the LVZ may explain the velocity gradient among tectonic plates, such as the one determining the Ethiopian Rift (ER) separating Africa from Somalia. While the mechanisms of the torques acting on the lithosphere remain not fully understood (thermally driven mantle convection, or the combination of mantle convection with astronomical forces such as the Earth's rotation and tidal drag), the stresses are transmitted across the different mechanical layers (e.g., from the brittle upper crust down to the viscous-plastic ductile lower crust and upper mantle). Differential basal shear traction at the base of the lithosphere beneath the two sides of the East African Rift System (EARS) is assumed to drive and sustain rifting. In our analysis, the differential torques acting on the lithospheric/crustal blocks drive kinematics and block rotations. Since the ER involves the whole lithosphere, we do not expect a large amount of rotation. Rotation can be the result of the motion of the whole plate on the sphere along the tectonic equator, or of the second-order sub-rotation of a single plate. Further rotation may occur along oblique plate boundaries (e.g., the left-lateral transtensional setting at the ER). The small amount of vertical-axis rotation of blocks in the northern ER could be related to the presence of local, shallower decollement layers.
The shallow brittle-ductile transition (BDT) zone and the differential tilting of crustal blocks in the northern ER hint at a possible detachment surface between the flow in the lower crust and the brittle crust above. Our study suggests that the kinematics of crustal blocks in the ER is controlled by the interaction of the Africa and Somalia plates at different scales and layers.
Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.
Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K
2011-01-01
We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameter uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residual rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residual-rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the uncertainty of the body diffusion parameters. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
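The wild-bootstrap step described above can be illustrated with a minimal sketch. Assumptions: a simple monoexponential diffusion signal model and plain Rademacher residual weights stand in for the paper's body diffusion model and its unscented-transform residual rescaling; `b_vals`, the true parameters, and the noise level are made up for the demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(b, s0, adc):
    # Monoexponential diffusion signal (hypothetical stand-in for the
    # non-linear body diffusion model discussed in the abstract)
    return s0 * np.exp(-b * adc)

rng = np.random.default_rng(0)
b_vals = np.array([0.0, 50, 100, 200, 400, 600, 800]) / 1000.0  # scaled b-values
signal = model(b_vals, 100.0, 1.0) + rng.normal(0.0, 1.0, b_vals.size)

popt, _ = curve_fit(model, b_vals, signal, p0=(90.0, 0.5))
residuals = signal - model(b_vals, *popt)

# Wild bootstrap: flip residual signs at random (Rademacher weights),
# refit, and collect parameter replicates to estimate uncertainty.
reps = []
for _ in range(500):
    w = rng.choice([-1.0, 1.0], size=residuals.size)
    y_star = model(b_vals, *popt) + w * residuals
    p_star, _ = curve_fit(model, b_vals, y_star, p0=popt)
    reps.append(p_star)
reps = np.array(reps)
adc_std = reps[:, 1].std()
print(f"ADC = {popt[1]:.3f} +/- {adc_std:.3f}")
```

Because the wild bootstrap perturbs only the fitted residuals, no repeated acquisitions are needed, which is the clinical motivation given in the abstract.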
Bootstrapping the (A1, A2) Argyres-Douglas theory
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro
2018-03-01
We apply bootstrap techniques in order to constrain the CFT data of the (A1, A2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.
López, Erick B; Yamashita, Takashi
2017-02-01
This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009-2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the frequencies of intake of five kinds of vegetables. Acculturation was measured by the degree of English language use at home. A path model with a bootstrapping technique was employed for the mediation analysis. A significant partial mediation relationship was identified. Greater acculturation was associated with higher income and, in turn, greater vegetable consumption [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)]. At the same time, greater acculturation was associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. Findings regarding income as a mediator of the acculturation-dietary behavior relationship inform unique intervention programs and policy changes to address health disparities by race/ethnicity.
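The mediation analysis with a bias-corrected bootstrap interval can be sketched as follows. This is not the NHANES analysis itself: the simulated `x`, `m`, `y` (standing in for acculturation, income, and vegetable intake) and the simple least-squares paths are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)                       # "acculturation"
m = 0.4 * x + rng.normal(size=n)             # "income" (mediator)
y = 0.5 * m - 0.3 * x + rng.normal(size=n)   # "vegetable consumption"

def indirect(x, m, y):
    # a-path: M ~ X ; b-path: Y ~ M + X ; indirect effect = a * b
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

obs = indirect(x, m, y)
boot = np.array([indirect(x[idx], m[idx], y[idx])
                 for idx in (rng.integers(0, n, n) for _ in range(2000))])

# Bias-corrected (BC) bootstrap interval: shift the percentile
# endpoints by the median-bias factor z0.
z0 = norm.ppf(np.mean(boot < obs))
lo, hi = (norm.cdf(2 * z0 + norm.ppf(q)) for q in (0.025, 0.975))
ci = np.quantile(boot, [lo, hi])
print(f"indirect = {obs:.3f}, 95% BC CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

As in the abstract, mediation is supported when the BC interval for the indirect effect excludes zero.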
Transport barriers in bootstrap-driven tokamaks
NASA Astrophysics Data System (ADS)
Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.
2018-05-01
Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
How to bootstrap a human communication system.
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs.
Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. PMID:26396090
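The idea of bias-correcting DEA efficiency scores with a bootstrap can be sketched in a few lines. Caveats: this toy uses naive resampling of decision-making units rather than the smoothed Simar-Wilson procedure usually meant by "Bootstrap-DEA" (naive resampling is known to be inconsistent for frontier models), and the two-input/one-output "hospital" data are fabricated for the demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_eff(x_o, y_o, X, Y):
    # Input-oriented CCR efficiency of one unit against a reference set.
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-x_o[:, None], X.T]              # sum_j lam_j x_ij <= theta x_io
    A_out = np.c_[np.zeros((y_o.size, 1)), -Y.T]  # sum_j lam_j y_rj >= y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(x_o.size), -y_o],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

rng = np.random.default_rng(4)
n = 20
X = rng.uniform(1, 10, (n, 2))                                     # two inputs
Y = X.sum(axis=1, keepdims=True) * rng.uniform(0.5, 1.0, (n, 1))   # one output

theta = np.array([dea_eff(X[i], Y[i], X, Y) for i in range(n)])

# Naive bootstrap bias correction: resample the reference set, re-score
# each unit, and subtract the estimated (upward) bias.
B = 50
boot = np.empty((B, n))
for b in range(B):
    idx = rng.integers(0, n, n)
    boot[b] = [dea_eff(X[i], Y[i], X[idx], Y[idx]) for i in range(n)]
theta_bc = theta - (boot.mean(axis=0) - theta)
print(f"mean efficiency: raw {theta.mean():.3f}, bias-corrected {theta_bc.mean():.3f}")
```

Consistent with the finding in the abstract, the bias-corrected scores come out lower than the uncorrected ones, because resampled frontiers can only shrink the feasible reference set.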
Impact of bootstrap current and Landau-fluid closure on ELM crashes and transport
NASA Astrophysics Data System (ADS)
Chen, J. G.; Xu, X. Q.; Ma, C. H.; Lei, Y. A.
2018-05-01
Results presented here are from 6-field Landau-fluid simulations using shifted circular cross-section tokamak equilibria in the BOUT++ framework. Linear benchmark results imply that the collisional and collisionless Landau resonance closures make little difference to the linear growth-rate spectra, which are quite close to the results with the flux-limited Spitzer-Härm parallel flux. Both linear and nonlinear simulations show that the plasma current profile plays a dual role in the peeling-ballooning modes: it can drive the low-n peeling modes and stabilize the high-n ballooning modes. For fixed total pressure and current, as the pedestal current decreases due to the bootstrap current, which becomes smaller when the density (collisionality) increases, the operational point is shifted downwards vertically in the Jped - α diagram, resulting in threshold changes of the different modes. The bootstrap current can slightly increase the radial turbulence-spreading range and enhance the energy and particle transport by increasing the perturbed amplitude and broadening the cross-phase frequency distribution.
NASA Astrophysics Data System (ADS)
Yang, Yong-Tai
2013-11-01
Interactions at plate boundaries induce stresses that constitute critical controls on the structural evolution of intraplate regions. However, the traditional tectonic model for the East Asian margin during the Mesozoic, invoking successive episodes of paleo-Pacific oceanic subduction, does not provide an adequate context for important Late Cretaceous dynamics across East Asia, including: continental-scale orogenic processes, significant sinistral strike-slip faulting, and several others. The integration of numerous documented field relations requires a new tectonic model, as proposed here. The Okhotomorsk continental block, currently residing below the Okhotsk Sea in Northeast Asia, was located in the interior of the Izanagi Plate before the Late Cretaceous. It moved northwestward with the Izanagi Plate and collided with the South China Block at about 100 Ma. The indentation of the Okhotomorsk Block within East Asia resulted in the formation of a sinistral strike-slip fault system in South China, formation of a dextral strike-slip fault system in North China, and regional northwest-southeast shortening and orogenic uplift in East Asia. Northeast-striking mountain belts over 500 km wide extended from Southeast China to Southwest Japan and South Korea. The peak metamorphism at about 89 Ma of the Sanbagawa high-pressure metamorphic belt in Southwest Japan was probably related to the continental subduction of the Okhotomorsk Block beneath the East Asian margin. Subsequently, the north-northwestward change of motion direction of the Izanagi Plate led to the northward movement of the Okhotomorsk Block along the East Asian margin, forming a significant sinistral continental transform boundary similar to the San Andreas fault system in California. Sanbagawa metamorphic rocks in Southwest Japan were rapidly exhumed through the several-kilometer wide ductile shear zone at the lower crust and upper mantle level. 
Accretionary complexes successively accumulated along the East Asian margin during the Jurassic-Early Cretaceous were subdivided into narrow and subparallel belts by the upper crustal strike-slip fault system. The departure of the Okhotomorsk Block from the northeast-striking Asian margin resulted in the occurrence of an extensional setting and formation of a wide magmatic belt to the west of the margin. In the Campanian, the block collided with the Siberian margin, in Northeast Asia. At about 77 Ma, a new oceanic subduction occurred to the south of the Okhotomorsk Block, ending its long-distance northward motion. Based on the new tectonic model, the abundant Late Archean to Early Proterozoic detrital zircons in the Cretaceous sandstones in Kamchatka, Southwest Japan, and Taiwan are interpreted to have been sourced from the Okhotomorsk Block basement which possibly formed during the Late Archean and Early Proterozoic. The new model suggests a rapidly northward-moving Okhotomorsk Block at an average speed of 22.5 cm/yr during 89-77 Ma. It is hypothesized that the Okhotomorsk-East Asia collision during 100-89 Ma slowed down the northwestward motion of the Izanagi Plate, while slab pull forces produced from the subducting Izanagi Plate beneath the Siberian margin redirected the plate from northwestward to north-northwestward motion at about 90-89 Ma.
Morse, J; Terrasini, N; Wehbe, M; Philippona, C; Zaouter, C; Cyr, S; Hemmerling, T M
2014-06-01
This study focuses on a recently developed robotic nerve block system and its impact on learning regional anaesthesia skills. We compared success rates, learning curves, performance times, and inter-subject performance variability of robot-assisted vs manual ultrasound (US)-guided nerve block needle guidance. The hypothesis of this study is that robot assistance will result in faster skill acquisition than manual needle guidance. Five co-authors with different experience with nerve blocks and the robotic system performed both manual and robot-assisted, US-guided nerve blocks on two different nerves of a nerve phantom. Ten trials were performed for each of the four procedures. Time taken to move from a shared starting position till the needle was inserted into the target nerve was defined as the performance time. A successful block was defined as the insertion of the needle into the target nerve. Average performance times were compared using analysis of variance. P<0.05 was considered significant. Data presented as mean (standard deviation). All blocks were successful. There were significant differences in performance times between co-authors to perform the manual blocks, either superficial (P=0.001) or profound (P=0.0001); no statistical difference between co-authors was noted for the robot-assisted blocks. Linear regression indicated that the average decrease in time between consecutive trials for robot-assisted blocks of 1.8 (1.6) s was significantly (P=0.007) greater than the decrease for manual blocks of 0.3 (0.3) s. Robot assistance of nerve blocks allows for faster learning of needle guidance over manual positioning and reduces inter-subject performance variability.
A system architecture for a planetary rover
NASA Technical Reports Server (NTRS)
Smith, D. B.; Matijevic, J. R.
1989-01-01
Each planetary mission requires a complex space vehicle which integrates several functions to accomplish the mission and science objectives. A Mars Rover is one of these vehicles, and extends the normal spacecraft functionality with two additional functions: surface mobility and sample acquisition. All functions are assembled into a hierarchical and structured format to understand the complexities of interactions between functions during different mission times. It can graphically show data flow between functions, and most importantly, the necessary control flow to avoid ambiguous results. Diagrams are presented organizing the functions into a structured, block format where each block represents a major function at the system level. As such, there are six blocks representing telecomm, power, thermal, science, mobility and sampling under a supervisory block called Data Management/Executive. Each block is a simple collection of state machines arranged into a hierarchical order very close to the NASREM model for Telerobotics. Each layer within a block represents a level of control for a set of state machines that do the three primary interface functions: command, telemetry, and fault protection. This latter function is expanded to include automatic reactions to the environment as well as internal faults. Lastly, diagrams are presented that trace the system operations involved in moving from site to site after site selection. The diagrams clearly illustrate both the data and control flows. They also illustrate inter-block data transfers and a hierarchical approach to fault protection. This systems architecture can be used to determine functional requirements, interface specifications and be used as a mechanism for grouping subsystems (i.e., collecting groups of machines, or blocks consistent with good and testable implementations).
Self-Organization in 2D Traffic Flow Model with Jam-Avoiding Drive
NASA Astrophysics Data System (ADS)
Nagatani, Takashi
1995-04-01
A stochastic cellular automaton (CA) model is presented to investigate the traffic jam formed by self-organization in two-dimensional (2D) traffic flow. The CA model is an extended version of the 2D asymmetric exclusion model that takes into account jam-avoiding drive. Each site contains a car moving up, a car moving right, or is empty. An up car can shift right with probability p_ja if it is blocked ahead by other cars. It is shown that three phases (the low-density phase, the intermediate-density phase, and the high-density phase) appear in the traffic flow. The intermediate-density phase is characterized by the rightward shifting of up cars. The jamming transition to the high-density jamming phase occurs at a higher density of cars than without jam-avoiding drive. The jamming transition point p_2c increases with the shifting probability p_ja. In the deterministic limit p_ja = 1, it is found that a new jamming transition occurs from the low-density synchronized-shifting phase to the high-density moving phase with increasing density of cars. In the synchronized-shifting phase, up cars do not move up but shift right, synchronizing with the motion of the right cars. We show that jam-avoiding drive has an important effect on the dynamical jamming transition.
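A minimal sketch of such a CA update rule, assuming synchronous sub-steps (right cars first, then up cars) and first-claimant conflict resolution on a periodic lattice; these implementation details are choices of this sketch, not specifics from the paper.

```python
import numpy as np

EMPTY, UP, RIGHT = 0, 1, 2

def step(grid, p_ja, rng):
    # One update: right cars advance east, then up cars advance north;
    # a blocked up car sidesteps east with probability p_ja.
    # Periodic boundaries; first claimant wins target-cell conflicts.
    n = grid.shape[0]
    for kind, ahead in ((RIGHT, lambda i, j: (i, (j + 1) % n)),
                        (UP, lambda i, j: ((i - 1) % n, j))):
        moves = {}
        for i, j in zip(*np.nonzero(grid == kind)):
            ti, tj = ahead(i, j)
            if grid[ti, tj] == EMPTY:
                tgt = (ti, tj)
            elif kind == UP and grid[i, (j + 1) % n] == EMPTY and rng.random() < p_ja:
                tgt = (i, (j + 1) % n)   # jam-avoiding sidestep
            else:
                continue                 # blocked: stay put
            moves.setdefault(tgt, (i, j))
        for (ti, tj), (i, j) in moves.items():
            grid[i, j], grid[ti, tj] = EMPTY, kind
    return grid

rng = np.random.default_rng(2)
g = rng.choice([EMPTY, UP, RIGHT], size=(20, 20), p=[0.7, 0.15, 0.15])
n_cars = np.count_nonzero(g)
for _ in range(100):
    g = step(g, p_ja=0.5, rng=rng)
print("cars before/after:", n_cars, np.count_nonzero(g))  # cars are conserved
```

Varying the car density and `p_ja` in such a sketch is how the jammed and free-flowing regimes described in the abstract can be explored numerically.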
Classifier performance prediction for computer-aided diagnosis using a limited dataset.
Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir
2008-04-01
In a practical classifier design problem, the true population is generally unknown and the available sample is finite-sized. A common approach is to use a resampling technique to estimate the performance of the classifier that will be trained with the available sample. We conducted a Monte Carlo simulation study to compare the ability of the different resampling techniques in training the classifier and predicting its performance under the constraint of a finite-sized sample. The true population for the two classes was assumed to be multivariate normal distributions with known covariance matrices. Finite sets of sample vectors were drawn from the population. The true performance of the classifier is defined as the area under the receiver operating characteristic curve (AUC) when the classifier designed with the specific sample is applied to the true population. We investigated methods based on the Fukunaga-Hayes and the leave-one-out techniques, as well as three different types of bootstrap methods, namely, the ordinary, 0.632, and 0.632+ bootstrap. The Fisher's linear discriminant analysis was used as the classifier. The dimensionality of the feature space was varied from 3 to 15. The sample size n2 from the positive class was varied between 25 and 60, while the number of cases from the negative class was either equal to n2 or 3n2. Each experiment was performed with an independent dataset randomly drawn from the true population. Using a total of 1000 experiments for each simulation condition, we compared the bias, the variance, and the root-mean-squared error (RMSE) of the AUC estimated using the different resampling techniques relative to the true AUC (obtained from training on a finite dataset and testing on the population). 
Our results indicated that, under the study conditions, there can be a large difference in the RMSE obtained using different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Under this type of conditions, the 0.632 and 0.632+ bootstrap methods have the lowest RMSE, indicating that the difference between the estimated and the true performances obtained using the 0.632 and 0.632+ bootstrap will be statistically smaller than those obtained using the other three resampling methods. Of the three bootstrap methods, the 0.632+ bootstrap provides the lowest bias. Although this investigation is performed under some specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited dataset.
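The 0.632 bootstrap discussed above blends the optimistic apparent (training) error with the pessimistic out-of-bag bootstrap error. A sketch, with a nearest-class-mean classifier standing in for Fisher's linear discriminant and fabricated two-class Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(3)

def fit(X, y):
    # Nearest-class-mean classifier (a simple stand-in for Fisher's LDA)
    return X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)

def err(model, X, y):
    m0, m1 = model
    pred = (np.linalg.norm(X - m1, axis=1) < np.linalg.norm(X - m0, axis=1)).astype(int)
    return float(np.mean(pred != y))

n, d = 50, 5                                  # cases per class, feature dim
X = np.vstack([rng.normal(0.0, 1, (n, d)), rng.normal(1.0, 1, (n, d))])
y = np.repeat([0, 1], n)

apparent = err(fit(X, y), X, y)               # optimistic training error

# Leave-one-out bootstrap error: test each bootstrap replicate on the
# cases left out of that bootstrap sample (the out-of-bag cases).
oob = []
for _ in range(200):
    idx = rng.integers(0, 2 * n, 2 * n)
    out = np.setdiff1d(np.arange(2 * n), idx)
    if out.size == 0 or np.unique(y[idx]).size < 2:
        continue
    oob.append(err(fit(X[idx], y[idx]), X[out], y[out]))
e_oob = float(np.mean(oob))

# The 0.632 estimator blends the two error estimates
e632 = 0.368 * apparent + 0.632 * e_oob
print(f"apparent={apparent:.3f}  oob={e_oob:.3f}  0.632={e632:.3f}")
```

The 0.632+ variant favored in the abstract additionally adapts the 0.632 weight upward when overfitting is severe; the fixed-weight version above is the simpler base case.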
Wang, Honggang; Zhou, Yue; Zhang, Zhengfeng
2016-05-01
Minimally invasive transforaminal lumbar interbody fusion (misTLIF) can potentially lead to dorsal root ganglion (DRG) injury, which may cause postoperative dysesthesia (POD). The purpose of this retrospective study was to describe the uncommon complication of POD in misTLIF. Between January 2010 and December 2014, 539 patients were treated with misTLIF in the investigator group. POD was defined as dysesthetic pain or burning dysesthesia in a proper DRG-innervated region, whether spontaneous or evoked. Non-steroidal anti-inflammatory drugs, a central non-opioid analgesic agent, neuropathic pain drugs, and/or intervertebral foramen block were used selectively to treat POD. There were five cases of POD (5/539, 0.9 %), consisting of one patient with recurrent lumbar disc herniation (1/36, 3 %), one with far-lateral lumbar disc herniation (1/34, 3 %), and three with lumbar spondylolisthesis (3/201, 1 %). Two cases of DRG injury were confirmed by revision surgery. After treatment by drug administration plus DRG block, all patients had pain relief, with durations from 22 to 50 days. A gradual migration of the pain toward the distal end of the DRG-innervated region was found to mark the beginning of the end of symptoms. Although POD is a unique and rare complication of misTLIF that may be misdiagnosed as nerve root injury, combination drug therapy and DRG block are effective for pain relief. The appearance of a gradual migration of the pain toward the distal end of the DRG-innervated region during recovery may be used as a sign of good prognosis.
Utilization of the Building-Block Approach in Structural Mechanics Research
NASA Technical Reports Server (NTRS)
Rouse, Marshall; Jegley, Dawn C.; McGowan, David M.; Bush, Harold G.; Waters, W. Allen
2005-01-01
In the last 20 years NASA has worked in collaboration with industry to develop enabling technologies needed to make aircraft safer and more affordable, extend their lifetime, improve their reliability, better understand their behavior, and reduce their weight. To support these efforts, research programs starting with ideas and culminating in full-scale structural testing were conducted at the NASA Langley Research Center. Each program contained development efforts that (a) started with selecting the material system and manufacturing approach; (b) moved on to experimentation and analysis of small samples to characterize the system and quantify behavior in the presence of defects like damage and imperfections; (c) progressed on to examining larger structures to examine buckling behavior, combined loadings, and built-up structures; and (d) finally moved to complicated subcomponents and full-scale components. Each step along the way was supported by detailed analysis, including tool development, to prove that the behavior of these structures was well-understood and predictable. This approach for developing technology became known as the "building-block" approach. In the Advanced Composites Technology Program and the High Speed Research Program the building-block approach was used to develop a true understanding of the response of the structures involved through experimentation and analysis. The philosophy that if the structural response couldn't be accurately predicted, it wasn't really understood, was critical to the progression of these programs. To this end, analytical techniques including closed-form and finite elements were employed and experimentation used to verify assumptions at each step along the way. This paper presents a discussion of the utilization of the building-block approach described previously in structural mechanics research and development programs at NASA Langley Research Center. 
Specific examples that illustrate the use of this approach are included from recent research and development programs for both subsonic and supersonic transports.
Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task.
Khurshid, Rebecca P; Fitter, Naomi T; Fedalei, Elizabeth A; Kuchenbecker, Katherine J
2017-01-01
The multifaceted human sense of touch is fundamental to direct manipulation, but technical challenges prevent most teleoperation systems from providing even a single modality of haptic feedback, such as force feedback. This paper postulates that ungrounded grip-force, fingertip-contact-and-pressure, and high-frequency acceleration haptic feedback will improve human performance of a teleoperated pick-and-place task. Thirty subjects used a teleoperation system consisting of a haptic device worn on the subject's right hand, a remote PR2 humanoid robot, and a Vicon motion capture system to move an object to a target location. Each subject completed the pick-and-place task 10 times under each of the eight haptic conditions obtained by turning on and off grip-force feedback, contact feedback, and acceleration feedback. To understand how object stiffness affects the utility of the feedback, half of the subjects completed the task with a flexible plastic cup, and the others used a rigid plastic block. The results indicate that the addition of grip-force feedback with gain switching enabled subjects to hold both the flexible and rigid objects more stably, and it also allowed subjects who manipulated the rigid block to hold the object more delicately and to better control the motion of the remote robot's hand. Contact feedback improved the ability of subjects who manipulated the flexible cup to move the robot's arm in space, but it deteriorated this ability for subjects who manipulated the rigid block. Contact feedback also caused subjects to hold the flexible cup less stably, but the rigid block more securely. Finally, adding acceleration feedback slightly improved subjects' performance when setting the object down, as originally hypothesized; interestingly, it also allowed subjects to feel vibrations produced by the robot's motion, causing them to be more careful when completing the task. 
This study supports the utility of grip-force and high-frequency acceleration feedback in teleoperation systems and motivates further improvements to fingertip-contact-and-pressure feedback.
Steady rotation of the Cascade arc
Wells, Ray E.; McCaffrey, Robert
2013-01-01
Displacement of the Miocene Cascade volcanic arc (northwestern North America) from the active arc is in the same sense and at nearly the same rate as the present clockwise block motions calculated from GPS velocities in a North American reference frame. Migration of the ancestral arc over the past 16 m.y. can be explained by clockwise rotation of upper-plate blocks at 1.0°/m.y. over a linear melting source moving westward 1–4.5 km/m.y. due to slab rollback. Block motion and slab rollback are in opposite directions in the northern arc, but both are westerly in the southern extensional arc, where rollback may be enhanced by proximity to the edge of the Juan de Fuca slab. Similarities between post–16 Ma arc migration, paleomagnetic rotation, and modern GPS block motions indicate that the secular block motions from decadal GPS can be used to calculate long-term strain rates and earthquake hazards. Northwest-directed Basin and Range extension of 140 km is predicted behind the southern arc since 16 Ma, and 70 km of shortening is predicted in the northern arc. The GPS rotation poles overlie a high-velocity slab of the Siletzia terrane dangling into the mantle beneath Idaho (United States), which may provide an anchor for the rotations.
Lopez, A M; Sala-Blanch, X; Castillo, R; Hadzic, A
2014-01-01
The recommendations for the level of injection and ideal placement of the needle tip required for successful ultrasound-guided sciatic popliteal block vary among authors. We hypothesized that, when the local anesthetic is injected at the division of the sciatic nerve within the common connective tissue sheath, the block has a higher success rate than an injection outside the sheath. Thirty-four patients scheduled for hallux valgus repair surgery were randomized to receive either a sub-sheath block (n=16) or a peri-sheath block (n=18) at the level of the division of the sciatic nerve at the popliteal fossa. For the sub-sheath block, the needle was advanced out of plane until the tip was positioned between the tibial and peroneal nerves, and local anesthetic was then injected without moving the needle. For the peri-sheath block, the needle was advanced out of plane to both sides of the sciatic nerve, so that the injection surrounded the sheath. A 30 mL mixture of mepivacaine 1.5% and levobupivacaine 0.5% was used in both groups. The progression of motor and sensory block was assessed at 5 min intervals, and the duration of the block was recorded. Adequate surgical block was achieved at 30 min in all patients in the sub-sheath group (100%) compared with 12 patients (67%) in the peri-sheath group. Sensory block was achieved faster in the sub-sheath group than in the peri-sheath group (9.1±7.4 min vs. 19.0±4.0 min; p<.001). Our study suggests that for a successful sciatic popliteal block in less than 30 min, local anesthetic should be injected within the sheath. Copyright © 2013 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España. All rights reserved.
Concept Innateness, Concept Continuity, and Bootstrapping
Carey, Susan
2011-01-01
The commentators raised issues relevant to all three important theses of The Origin of Concepts (TOOC). Some questioned the very existence of innate representational primitives, and others questioned my claims about their richness and whether they should be thought of as concepts. Some questioned the existence of conceptual discontinuity in the course of knowledge acquisition and others argued that discontinuity is much more common than portrayed in TOOC. Some raised issues with my characterization of Quinian bootstrapping, and others questioned the dual factor theory of concepts motivated by my picture of conceptual development. PMID:23264705
Direct measurement of fast transients by using boot-strapped waveform averaging
NASA Astrophysics Data System (ADS)
Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung
2018-03-01
An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and its signal-to-noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime of Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with known values.
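The folding principle behind equivalent-time averaging can be illustrated with a short sketch: if a repetitive signal is digitized with a sampling clock incommensurate with the repetition period, the samples can be folded onto a single period and averaged per phase bin, raising the effective sampling rate and the signal-to-noise ratio at the same time. This is an illustration of the folding idea only, not of the paper's digital-cavity method; the signal, rates, and function name are invented for the example.

```python
import numpy as np

def equivalent_time_average(samples, t, period, n_bins):
    """Fold samples of a repetitive signal onto one period and average per phase bin."""
    phase = np.mod(t, period) / period            # position within one repetition, in [0, 1)
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    counts = np.bincount(bins, minlength=n_bins)
    sums = np.bincount(bins, weights=samples, minlength=n_bins)
    avg = np.zeros(n_bins)
    hit = counts > 0
    avg[hit] = sums[hit] / counts[hit]
    return avg

# Repetitive exponential decay (4.0 ns lifetime, as in the paper's Rhodamine 6G
# measurement) buried in noise, with the sample spacing deliberately
# incommensurate with the 50 ns repetition period.
rng = np.random.default_rng(0)
period = 50.0                                     # ns, repetition period
dt = 0.7137                                       # ns, raw sample spacing
t = np.arange(200_000) * dt
clean = np.exp(-np.mod(t, period) / 4.0)          # true waveform
noisy = clean + rng.normal(0.0, 0.5, t.size)      # heavily degraded signal
wave = equivalent_time_average(noisy, t, period, n_bins=500)
# The folded average now samples the waveform every 0.1 ns, far finer than
# the 0.7137 ns raw spacing, and per-bin averaging suppresses the noise.
```

With 200 000 raw samples folded into 500 bins, each bin averages roughly 400 noisy values, so the noise drops by a factor of about 20 while the time resolution improves sevenfold.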
Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.
Wade, M R; Murakami, M; Politzer, P A
2004-06-11
Analysis of the parallel electric field E(parallel) evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E(parallel) evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E(parallel) evolution and that expected from neoclassical theory predictions of the bootstrap current.
NASA Astrophysics Data System (ADS)
Fujii, Toshitsugu; Nakada, Setsuya
1999-04-01
Large-scale collapse of a dacite dome in the late afternoon of 15 September 1991 generated a series of pyroclastic-flow events at Unzen Volcano. Pyroclastic flows with a volume of 1×10⁶ m³ (as DRE) descended the northeastern slope of the volcano, changing their courses to the southeast due to topographic control. After they exited a narrow gorge, an ash-cloud surge rushed straight ahead, detaching from the main body of the flow that turned and followed the topographic lows to the east. The surge swept the Kita-Kamikoba area, which had been devastated by the previous pyroclastic-flow events, and transported a car as far as 120 m. Following detachment, the surge lost its force after it moved several hundred meters, but maintained a high temperature. The deposits consist of a bottom layer of better-sorted ash (unit 1), a thick layer of block and ash (unit 2), and a thin top layer of fall-out ash (unit 3). Unit 2 overlies unit 1 with an erosional contact. The upper part of unit 2 grades into better-sorted ash. At distal block-and-ash flow deposits, the bottom part of unit 2 also consists of better-sorted ash, and the contact with the unit 1 deposits becomes ambiguous. Video footage of cascading pyroclastic flows during the 1991-1995 eruption, traveling over surfaces without any topographic barriers, revealed that lobes of ash cloud protruded intermittently from the moving head and sides, and that these lobes surged ahead on the ground surface. This fact, together with inspection by helicopter shortly after the events, suggests that the protruded lobes consisted of better-sorted ash and resulted in the deposits of unit 1. The highest ash-cloud plume at the Oshigadani valley exit, and the thickest deposition of fall-out ash over Kita-Kamikoba and Ohnokoba, indicate that abundant ash was also produced when the flow passed through a narrow gorge. 
In the model presented here, the ash clouds from the pyroclastic flows were composed of a basal turbulent current of high concentration (main body), an overriding and intermediate fluidization zone, and an overlying dilute cloud. Release of pressurized gas in lava block pores, due to collisions among blocks and the resulting upward current, caused a zone of fluidization just above the main body. The mixture of gas and ash sorted in the fluidization zone moved ahead and to the side of the main body as a gravitational current, where the ash was deposited as surge deposits. The main body, which had high internal friction and shear near its base, then overran the surge deposits, partially eroding them. When the upward current of gas (fluidization) waned, better-sorted ash suspended in the fluidization zone was deposited on block-and-ash deposits. In the distal places of block-and-ash deposits, unit 2 probably was deposited in non-turbulent fashion without any erosion of the underlying layer (unit 1).
An introduction to scriptwriting for video and multimedia.
Guth, J
1995-06-01
The elements of audiovisual productions are explained and illustrated, including words, moving images, still images, graphics, narration, music, landscape sounds, pacing, titling and font styles. Three different production styles are analysed, and examples of those styles are discussed. Rules for writing spoken words, composing blocks of information, and explaining technical information to a lay audience are also provided. Storyboard and scripting forms and examples are included.
USAF (United States Air Force) Avionics Master Plan.
1982-12-01
is updated annually by MAPG activities to reflect changes in emphasis resulting from new direction, threat developments, and other armament and...many different kinds of functional electronic subsystems, a building block approach to the development of new subsystems can be taken. This approach...technologies targeted for precision all-weather weapon delivery. A new program will develop the capability to detect and locate ground moving targets not
Development of Recuperator Manufacturing Techniques. Phase 2
1983-06-01
AGT 1500 Turbine Exhaust Heat Recuperator, Laser Welding, Inconel 625, Computer Control...0.2 millimeter (0.008 inch) thick nickel-based alloy (Inconel 625) used. Two computer/moving mirror systems were evaluated and programs for each...APPENDIX B. SPECIFICATION FOR NICKEL BASE ALLOY, SHEET, CORROSION, AND HEAT RESISTANT (INCONEL 625) ... APPENDIX C. SPECIFICATION FOR
Maturation of large scale mass-wasting along the Hawaiian Ridge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torresan, M.E.; Clague, D.A.; Moore, J.G.
1990-06-01
Extensive GLORIA side-scan sonar mapping of the Hawaiian Ridge from Hawaii to St. Rogatien Bank shows that massive slumps and blocky debris avalanches are the major degradational processes that affect the island and ridge areas. About 30 failures have been imaged in the region surveyed; they range in area from 250 to > 6,000 km² and in volume from 500 to > 5,000 km³. Four are rotational slumps, and the rest are blocky debris avalanches. Such deposits cover 125,000 km² of the Hawaiian Ridge and adjacent seafloor. The slumps are wide (up to 110 km), short (30-35 km), thick (about 10 km), and slow-moving. They are broken into comparatively few major rotational blocks that have not moved far and are characterized by steep toes and transverse ridges. Back rotation of the blocks has elevated their seaward edges, producing transverse ridges and perched basins filled with 5 to > 35 m of sediment. Compared to the slumps, the debris avalanches are lobate, long (up to 230 km), thin (0.5-2 km), and fast-moving. These deposits cross the Hawaiian Trough and run upslope onto the Hawaiian Arch (up to 550 m in elevation over a distance of 140 km). These failures commonly have amphitheaters and subaerial canyons at their heads. Their distal ends are hummocky, and blocky debris litters the seafloor adjacent to the ridge. As one proceeds west from Hawaii to St. Rogatien Bank, the GLORIA sonographs and seismic reflection profiles show a progression from youthful to mature failures and from active to about 12 Ma volcanoes. The Alika and Hilina slide complexes are examples of youthful failures on active volcanoes. Slumping in the Hilina slide is ongoing (7.2 magnitude earthquake in 1975). Little to no sediment covers the blocks and hummocky terrane of the Alika (about 100 ka), whereas the older deposits along the western part of the ridge are covered by up to 30 m of transparent sediment.
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
NASA Astrophysics Data System (ADS)
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, Vφ). A large bootstrap fraction (fBS ~ 80%) has been obtained at high βp. ITBs have been shown to be compatible with steady state operation. Because of the unusually large ITB radius, normalized pressure is not limited to low βN values by internal ITB-driven modes. βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows that first-principles NEO is in good agreement with experiments for the bootstrap current calculation, and that ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport. 
Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC02-09CH11466, and the NMCFP of China under 2015GB110000 and 2015GB102000.
Brunelli, Alessandro; Tentzeris, Vasileios; Sandri, Alberto; McKenna, Alexandra; Liew, Shan Liung; Milton, Richard; Chaudhuri, Nilanjan; Kefaloyannis, Emmanuel; Papagiannopoulos, Kostas
2016-05-01
To develop a clinically risk-adjusted financial model to estimate the cost associated with a video-assisted thoracoscopic surgery (VATS) lobectomy programme. Prospectively collected data of 236 VATS lobectomy patients (August 2012-December 2013) were analysed retrospectively. Fixed and variable intraoperative and postoperative costs were retrieved from the Hospital Accounting Department. Baseline and surgical variables were tested for a possible association with total cost using multivariable linear regression and bootstrap analyses. Costs were calculated in GBP and expressed in euros (EUR:GBP exchange rate 1.4). The average total cost of a VATS lobectomy was €11 368 (range €6992-€62 535). Average intraoperative costs (including surgical and anaesthetic time, overheads and disposable materials) and postoperative costs (including ward stay, high dependency unit (HDU) or intensive care unit (ICU) stay, and the variable costs associated with management of complications) were €8226 (range €5656-€13 296) and €3029 (range €529-€51 970), respectively. The following variables remained reliably associated with total cost after linear regression analysis and bootstrap: carbon monoxide lung diffusion capacity (DLCO) <60% of predicted value (P = 0.02, bootstrap 63%) and chronic obstructive pulmonary disease (COPD; P = 0.035, bootstrap 57%). The following model was developed to estimate the total cost: 10 523 + 1894 × COPD + 2376 × (DLCO < 60%), with each predictor coded 1 when present and 0 otherwise. The comparison between predicted and observed costs was repeated in 1000 bootstrapped samples to verify the stability of the model; the two values did not differ (P > 0.05) in 86% of the samples. A hypothetical patient with COPD and a DLCO below 60% would cost €4270 more than a patient without COPD and with a higher DLCO (€14 793 vs €10 523). Risk-adjusting financial data can help estimate the total cost associated with VATS lobectomy based on clinical factors. 
This model can be used to audit the internal financial performance of a VATS lobectomy programme for budgeting, planning and for appropriate bundled payment reimbursements. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
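The bootstrap stability check described above, refitting the regression on resampled data and counting how often a predictor remains reliable, can be sketched in a few lines. The data below are simulated to mimic the published model (cost = 10 523 + 1894 × COPD + 2376 × (DLCO < 60%)); the function name, the noise level, and the stability criterion (sign agreement rather than a formal significance test) are assumptions for illustration only.

```python
import numpy as np

def bootstrap_coefficient_stability(X, y, n_boot=1000, seed=0):
    """Refit an OLS model on bootstrap resamples and report, for each
    coefficient, the fraction of resamples in which it keeps the sign of
    the full-data fit: a simple stability score in the spirit of the
    paper's bootstrap validation."""
    rng = np.random.default_rng(seed)
    Xd = np.column_stack([np.ones(len(y)), X])      # prepend an intercept column
    full = np.linalg.lstsq(Xd, y, rcond=None)[0]
    agree = np.zeros(Xd.shape[1])
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))       # resample rows with replacement
        b = np.linalg.lstsq(Xd[idx], y[idx], rcond=None)[0]
        agree += np.sign(b) == np.sign(full)
    return full, agree / n_boot

# Hypothetical data mimicking the paper's two binary predictors:
# COPD (yes/no) and DLCO < 60% predicted (yes/no), with cost in euros.
rng = np.random.default_rng(1)
n = 236
copd = rng.integers(0, 2, n)
low_dlco = rng.integers(0, 2, n)
cost = 10_523 + 1_894 * copd + 2_376 * low_dlco + rng.normal(0, 3_000, n)
coefs, stability = bootstrap_coefficient_stability(
    np.column_stack([copd, low_dlco]), cost)
# coefs recovers roughly [10 523, 1894, 2376]; stability near 1.0 marks
# predictors that survive resampling.
```

A predictor whose stability score stays high across resamples is a reasonable candidate to keep in the final cost model; one that flips sign in many resamples is not.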
Hager, Robert; Chang, C. S.
2016-04-08
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator—that conserves mass, momentum, and energy—is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. In conclusion, a new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied to the usual least squares estimators of central tendency and variability, and the Welch test applied to robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
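The robust estimators referred to above, trimmed means and Winsorized variances, are straightforward to compute. A minimal sketch, assuming the conventional 20% trimming proportion:

```python
import numpy as np

def trimmed_mean(x, prop=0.2):
    """Mean after dropping the lowest and highest `prop` fraction of observations."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(prop * len(x))                  # number trimmed from each tail
    return x[g:len(x) - g].mean()

def winsorized_variance(x, prop=0.2):
    """Variance after pulling each `prop` tail in to the nearest retained value."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(prop * len(x))
    w = x.copy()
    w[:g] = x[g]                            # lowest g values -> smallest retained value
    w[len(x) - g:] = x[len(x) - g - 1]      # highest g values -> largest retained value
    return w.var(ddof=1)

# A heavy-tailed sample: the ordinary mean is dragged upward by the single
# outlier, while the 20% trimmed mean ignores it.
x = [2.1, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3.0, 3.1, 40.0]
print(np.mean(x))               # distorted by the outlier
print(trimmed_mean(x))          # robust location estimate
print(winsorized_variance(x))   # robust scale estimate
```

In a Welch-type test with robust estimators, these two quantities simply replace the ordinary group means and variances, which is what gives the trimmed-mean procedures their resistance to non-normality.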
Moving overlapping grids with adaptive mesh refinement for high-speed reactive and non-reactive flow
NASA Astrophysics Data System (ADS)
Henshaw, William D.; Schwendeman, Donald W.
2006-08-01
We consider the solution of the reactive and non-reactive Euler equations on two-dimensional domains that evolve in time. The domains are discretized using moving overlapping grids. In a typical grid construction, boundary-fitted grids are used to represent moving boundaries, and these grids overlap with stationary background Cartesian grids. Block-structured adaptive mesh refinement (AMR) is used to resolve fine-scale features in the flow such as shocks and detonations. Refinement grids are added to base-level grids according to an estimate of the error, and these refinement grids move with their corresponding base-level grids. The numerical approximation of the governing equations takes place in the parameter space of each component grid which is defined by a mapping from (fixed) parameter space to (moving) physical space. The mapped equations are solved numerically using a second-order extension of Godunov's method. The stiff source term in the reactive case is handled using a Runge-Kutta error-control scheme. We consider cases when the boundaries move according to a prescribed function of time and when the boundaries of embedded bodies move according to the surface stress exerted by the fluid. In the latter case, the Newton-Euler equations describe the motion of the center of mass of each body and the rotation about it, and these equations are integrated numerically using a second-order predictor-corrector scheme. Numerical boundary conditions at slip walls are described, and numerical results are presented for both reactive and non-reactive flows that demonstrate the use and accuracy of the numerical approach.
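The core of the mapped-grid approach, solving in a fixed parameter space while the physical grid moves, can be illustrated in one dimension. The sketch below applies a first-order Godunov (upwind) step to linear advection in the frame of a uniformly translating grid, where the wave speed simply becomes the relative speed between the wave and the grid. It is a toy model of the idea, not the authors' second-order overlapping-grid scheme, and all names and parameter values are invented for the example.

```python
import numpy as np

def upwind_step_moving_grid(u, a, v_grid, dx, dt):
    """One first-order Godunov (upwind) step for u_t + a u_x = 0, written in
    the frame of a grid translating at speed v_grid: in grid coordinates the
    wave moves at the relative speed (a - v_grid)."""
    c = (a - v_grid) * dt / dx                  # CFL number in the grid frame
    assert abs(c) <= 1.0, "CFL condition violated"
    if c >= 0:
        return u - c * (u - np.roll(u, 1))      # upwind from the left neighbor
    return u - c * (np.roll(u, -1) - u)         # upwind from the right neighbor

# Advect a square pulse at speed a = 1 on a periodic grid moving at
# v_grid = 0.4: in the grid frame the pulse drifts at relative speed 0.6.
nx = 200
dx = 1.0 / nx
dt = 0.5 * dx                                   # CFL number 0.3 in the grid frame
x = (np.arange(nx) + 0.5) * dx                  # cell centers in grid coordinates
u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)
for _ in range(100):
    u = upwind_step_moving_grid(u, a=1.0, v_grid=0.4, dx=dx, dt=dt)
# The pulse center drifts from 0.2 to 0.35 in grid coordinates, total mass is
# conserved exactly, and the monotone scheme keeps u within [0, 1].
```

The same change of frame is what the mapping to parameter space accomplishes in the paper, with the grid velocity entering the mapped flux.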
Deerinck, T J; Shone, T M; Bushong, E A; Ramachandra, R; Peltier, S T; Ellisman, M H
2018-05-01
A longstanding limitation of imaging with serial block-face scanning electron microscopy is specimen surface charging. This charging is largely due to the difficulties in making biological specimens and the resins in which they are embedded sufficiently conductive. Local accumulation of charge on the specimen surface can result in poor image quality and distortions. Even minor charging can lead to misalignments between sequential images of the block-face due to image jitter. Typically, variable-pressure SEM is used to reduce specimen charging, but this results in a significant reduction to spatial resolution, signal-to-noise ratio and overall image quality. Here we show the development and application of a simple system that effectively mitigates specimen charging by using focal gas injection of nitrogen over the sample block-face during imaging. A standard gas injection valve is paired with a precisely positioned but retractable application nozzle, which is mechanically coupled to the reciprocating action of the serial block-face ultramicrotome. This system enables the application of nitrogen gas precisely over the block-face during imaging while allowing the specimen chamber to be maintained under high vacuum to maximise achievable SEM image resolution. The action of the ultramicrotome drives the nozzle retraction, automatically moving it away from the specimen area during the cutting cycle of the knife. The device described was added to a Gatan 3View system with minimal modifications, allowing high-resolution block-face imaging of even the most charge prone of epoxy-embedded biological samples. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
GENDER AND THE MBA: Differences in Career Trajectories, Institutional Support, and Outcomes
Damaske, Sarah; Sheroff, Christen
2018-01-01
This study asks how men’s and women’s careers diverge following MBA graduation from an elite university, using qualitative interview data from 74 respondents. We discover men and women follow three career pathways post-graduation: lockstep (stable employment), transitory (3 or more employers), and exit (left workforce). While similar proportions of men and women followed the lockstep pathways and launched accelerated careers, sizable gender differences emerged on the transitory pathway; men’s careers soared as women’s faltered on this path—the modal category for both. On the transitory path, men fared much better than women when moving to new organizations, suggesting that gender may become more salient when people have a shorter work history with a company. Our findings suggest that clear building blocks to promotions reduce gender bias and ambiguity in the promotion process, but multiple external moves hamper women, putting them at a clear disadvantage to men whose forward progress is less likely to be stalled by such moves. PMID:29706689
Talhelm, Thomas; Zhang, Xuemin; Oishi, Shigehiro
2018-04-01
Traditional paddy rice farmers had to share labor and coordinate irrigation in a way that most wheat farmers did not. We observed people in everyday life to test whether these agricultural legacies gave rice-farming southern China a more interdependent culture and wheat-farming northern China a more independent culture. In Study 1, we counted 8964 people sitting in cafes in six cities and found that people in northern China were more likely to be sitting alone. In Study 2, we moved chairs together in Starbucks across the country so that they were partially blocking the aisle (n = 678). People in northern China were more likely to move the chair out of the way, which is consistent with findings that people in individualistic cultures are more likely to try to control the environment. People in southern China were more likely to adjust the self to the environment by squeezing through the chairs. Even in China's most modern cities, rice-wheat differences live on in everyday life.
Nonuniformity correction for an infrared focal plane array based on diamond search block matching.
Sheng-Hui, Rong; Hui-Xin, Zhou; Han-Lin, Qin; Rui, Lai; Kun, Qian
2016-05-01
In scene-based nonuniformity correction algorithms, artificial ghosting and image blurring severely degrade correction quality. In this paper, an improved algorithm based on the diamond search block matching algorithm and an adaptive learning rate is proposed. First, accurate transform pairs between two adjacent frames are estimated by the diamond search block matching algorithm. Then, based on the error between the corresponding transform pairs, the gradient descent algorithm is applied to update the correction parameters. During gradient descent, the local standard deviation and a threshold are utilized to control the learning rate and avoid the accumulation of matching error. Finally, nonuniformity correction is realized by a linear model with the updated correction parameters. The performance of the proposed algorithm is thoroughly studied with four real infrared image sequences. Experimental results indicate that the proposed algorithm can reduce nonuniformity with fewer ghosting artifacts in moving areas and can also overcome the problem of image blurring in static areas.
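The diamond search step described above moves a large diamond search pattern (LDSP) over candidate displacements until its centre is the best point, then refines with a small diamond (SDSP). A minimal sketch of that block-matching step, in Python rather than the authors' implementation; the block size, sum-of-absolute-differences (SAD) cost, and search patterns are illustrative assumptions:

```python
import numpy as np

# Large and small diamond search patterns; index 0 is the centre.
LDSP = [(0, 0), (0, -2), (0, 2), (-2, 0), (2, 0), (-1, -1), (-1, 1), (1, -1), (1, 1)]
SDSP = [(0, 0), (0, -1), (0, 1), (-1, 0), (1, 0)]

def sad(ref, cur, y, x, dy, dx, b):
    """SAD between the b*b block at (y, x) in `cur` and the block displaced
    by (dy, dx) in `ref`; out-of-bounds candidates cost infinity."""
    h, w = ref.shape
    if not (0 <= y + dy and y + dy + b <= h and 0 <= x + dx and x + dx + b <= w):
        return np.inf
    return np.abs(cur[y:y+b, x:x+b].astype(float) -
                  ref[y+dy:y+dy+b, x+dx:x+dx+b].astype(float)).sum()

def diamond_search(ref, cur, y, x, b=8):
    """Return the motion vector (dy, dx) for the b*b block at (y, x) in `cur`."""
    dy = dx = 0
    while True:  # large diamond: step until the centre is the cheapest point
        costs = [sad(ref, cur, y, x, dy + py, dx + px, b) for py, px in LDSP]
        best = int(np.argmin(costs))
        if best == 0:
            break
        dy += LDSP[best][0]
        dx += LDSP[best][1]
    # small diamond: one final refinement around the converged centre
    costs = [sad(ref, cur, y, x, dy + py, dx + px, b) for py, px in SDSP]
    best = int(np.argmin(costs))
    return dy + SDSP[best][0], dx + SDSP[best][1]
```

In the algorithm described above, the matched transform pairs from such blocks would then drive the gradient-descent update of the linear correction parameters.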
Real-time three-dimensional ultrasound-assisted axillary plexus block defines soft tissue planes.
Clendenen, Steven R; Riutort, Kevin; Ladlie, Beth L; Robards, Christopher; Franco, Carlo D; Greengrass, Roy A
2009-04-01
Two-dimensional (2D) ultrasound is commonly used for regional block of the axillary brachial plexus. In this technical case report, we describe a real-time three-dimensional (3D) ultrasound-guided axillary block. The difference between 2D and 3D ultrasound is similar to the difference between plain radiography and computed tomography. Unlike 2D ultrasound, which captures a planar image, 3D ultrasound technology acquires a 3D volume of information that enables multiple planes of view by manipulating the image without movement of the ultrasound probe. Observation of the brachial plexus in cross-section demonstrated distinct linear hyperechoic tissue structures (loose connective tissue) that initially inhibited the flow of the local anesthetic. After completion of the injection, we were able to visualize the influence of arterial pulsation on the spread of the local anesthetic. Possible advantages of this novel technology over current 2D methods are a wider image volume and the capability to manipulate the planes of the image without moving the probe.
Frey, H Christopher; Zhao, Yuchao
2004-11-15
Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using maximum likelihood estimators for censored data. The goodness-of-fit for censored data was evaluated by comparing cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges are as small as -25% to +42% for chromium and as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
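The parametric bootstrap step described above (fit a variability distribution by MLE, then repeatedly redraw samples from the fitted model to characterize uncertainty in the mean) can be sketched as follows. The lognormal model, sample size, and replicate count are illustrative assumptions, not the study's actual data or code:

```python
import numpy as np

rng = np.random.default_rng(0)

def parametric_bootstrap_mean_ci(sample, n_boot=2000, alpha=0.05):
    """Fit a lognormal by MLE, then bootstrap the sampling distribution
    of the mean emission factor and return a (1 - alpha) interval."""
    logs = np.log(sample)
    mu, sigma = logs.mean(), logs.std(ddof=0)  # lognormal MLE estimates
    n = len(sample)
    boot_means = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.lognormal(mu, sigma, size=n)  # draw from the fitted model
        boot_means[b] = resample.mean()
    lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical emission-factor data (arbitrary units):
data = rng.lognormal(mean=0.0, sigma=0.8, size=30)
lo, hi = parametric_bootstrap_mean_ci(data)
```

The asymmetric intervals this produces (wider above than below the mean for skewed distributions) are consistent with the asymmetric uncertainty ranges reported above.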
Bootstrapping non-commutative gauge theories from L∞ algebras
NASA Astrophysics Data System (ADS)
Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter
2018-05-01
Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, which governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on their appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix-valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is also discussed.
A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon
Drummond, J. M.; Papathanasiou, G.; Spradlin, M.
2015-03-16
Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4, 7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes, we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7 ∥ 6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n > 6.
NASA Technical Reports Server (NTRS)
Yoshikawa, H. H.; Madison, I. B.
1971-01-01
This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.
Creating Reconfigurable Materials Using ``Colonies'' of Oscillating Polymer Gels
NASA Astrophysics Data System (ADS)
Deb, Debabrata; Dayal, Pratyush; Kuksenok, Olga; Balazs, Anna
2013-03-01
Species ranging from single-cell organisms to social insects can undergo auto-chemotaxis, where the entities move towards a chemo-attractant that they themselves emit. This mode of signaling allows the organisms to form large-scale structures. Using computational modeling, we show that millimeter-sized polymer gels can display similar auto-chemotaxis. In particular, we demonstrate that gels undergoing the self-oscillating Belousov-Zhabotinsky (BZ) reaction not only respond to a chemical signal from the surrounding solution, but also emit this signal and thus, multiple gel pieces can spontaneously self-aggregate. We focus on the collective behavior of ``colonies'' of BZ gels and show that communication between the individual pieces critically depends on all the neighboring gels. We isolate the conditions at which the BZ gels can undergo a type of self-recombining: if a larger gel is cut into distinct pieces that are moved relatively far apart, then their auto-chemotactic behavior drives them to move and autonomously recombine into a structure resembling the original, uncut sample. These findings reveal that the BZ gels can be used as autonomously moving building blocks to construct multiple structures and thus, provide a new route for creating dynamically reconfigurable materials.
Brain oscillatory signatures of motor tasks
Birbaumer, Niels
2015-01-01
Noninvasive brain-computer-interfaces (BCI) coupled with prosthetic devices were recently introduced in the rehabilitation of chronic stroke and other disorders of the motor system. These BCI systems and motor rehabilitation in general involve several motor tasks for training. This study investigates the neurophysiological bases of an EEG-oscillation-driven BCI combined with a neuroprosthetic device to define the specific oscillatory signature of the BCI task. Controlling movements of a hand robotic orthosis with motor imagery of the same movement generates sensorimotor rhythm oscillation changes and involves three elements of tasks also used in stroke motor rehabilitation: passive and active movement, motor imagery, and motor intention. We recorded EEG while nine healthy participants performed five different motor tasks consisting of closing and opening of the hand as follows: 1) motor imagery without any external feedback and without overt hand movement, 2) motor imagery that moves the orthosis proportional to the produced brain oscillation change with online proprioceptive and visual feedback of the hand moving through a neuroprosthetic device (BCI condition), 3) passive and 4) active movement of the hand with feedback (seeing and feeling the hand moving), and 5) rest. During the BCI condition, participants received contingent online feedback of the decrease of power of the sensorimotor rhythm, which induced orthosis movement and therefore proprioceptive and visual information from the moving hand. We analyzed brain activity during the five conditions using time-frequency domain bootstrap-based statistical comparisons and Morlet transforms. Activity during rest was used as a reference. 
Significant contralateral and ipsilateral event-related desynchronization of sensorimotor rhythm was present during all motor tasks, largest in contralateral-postcentral, medio-central, and ipsilateral-precentral areas, identifying the ipsilateral precentral cortex as an integral part of motor regulation. Changes in task-specific frequency power compared with rest were similar between motor tasks, and only significant differences in the time course and some narrow specific frequency bands were observed between motor tasks. We identified EEG features representing active and passive proprioception (with and without muscle contraction) and active intention and passive involvement (with and without voluntary effort) differentiating brain oscillations during motor tasks that could substantially support the design of novel motor BCI-based rehabilitation therapies. The BCI task induced significantly different brain activity compared with the other motor tasks, indicating neural processes unique to the use of body actuator control in a BCI context. PMID:25810484
2017-01-12
The linear depressions in this VIS image are graben. Graben form through tectonic activity, with large blocks of material moving downward between paired faults. The crater in the bottom half of the image is oval rather than round, which may be due to the impact having occurred in this region of tectonic deformation. Orbit Number: 66271 Latitude: -29.9918 Longitude: 211.199 Instrument: VIS Captured: 2016-11-21 15:19 http://photojournal.jpl.nasa.gov/catalog/PIA21287
A Participatory Evaluation of the Use of Social Networking Tools in a High School Math Class
ERIC Educational Resources Information Center
Wormald, Randy J.
2012-01-01
As we move into the 21st century, the needs of our students are more variable than ever. There has been a proliferation of social networking usage in society yet there has been little use of those emerging tools in schools as a means to enhance student learning. It is a common practice in school districts to block social networking sites and…
Validation of a Three-Dimensional Ablation and Thermal Response Simulation Code
NASA Technical Reports Server (NTRS)
Chen, Yih-Kanq; Milos, Frank S.; Gokcen, Tahir
2010-01-01
The 3dFIAT code simulates pyrolysis, ablation, and shape change of thermal protection materials and systems in three dimensions. The governing equations, which include energy conservation, a three-component decomposition model, and a surface energy balance, are solved with a moving grid system to simulate the shape change due to surface recession. This work is the first part of a code validation study for new capabilities that were added to 3dFIAT. These expanded capabilities include a multi-block moving grid system and an orthotropic thermal conductivity model. This paper focuses on conditions with minimal shape change, in which fluid/solid coupling is not necessary. Two groups of test cases of 3dFIAT analyses of Phenolic Impregnated Carbon Ablator in an arc-jet are presented. In the first group, axisymmetric iso-q shaped models are studied to check the accuracy of the three-dimensional multi-block grid system. In the second group, similar models with various through-the-thickness conductivity directions are examined. In this group, the material thermal response is three-dimensional because of the carbon fiber orientation. Predictions from 3dFIAT are presented and compared with arc-jet test data. The 3dFIAT predictions agree very well with thermocouple data for both groups of test cases.
2003-07-18
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the first stage of a Delta II rocket is raised off the transporter before lifting and moving it into the mobile service tower. The rocket is being erected to launch the Space InfraRed Telescope Facility (SIRTF). Consisting of an 0.85-meter telescope and three cryogenically cooled science instruments, SIRTF is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
2003-07-18
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the first stage of a Delta II rocket is raised off the transporter before being lifted and moved into the mobile service tower. The rocket is being erected to launch the Space InfraRed Telescope Facility (SIRTF). Consisting of an 0.85-meter telescope and three cryogenically cooled science instruments, SIRTF is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
2003-07-18
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the first stage of a Delta II rocket waits to be lifted up and moved into the mobile service tower. The rocket is being erected to launch the Space InfraRed Telescope Facility (SIRTF). Consisting of an 0.85-meter telescope and three cryogenically cooled science instruments, SIRTF is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
2003-08-07
KENNEDY SPACE CENTER, FLA. - Workers at Hangar A&E, Cape Canaveral Air Force Station, lift the upper canister to move it to the Space Infrared Telescope Facility (SIRTF) at right. After encapsulation, the spacecraft will be transported to Launch Complex 17-B for mating with its launch vehicle, the Delta II rocket. SIRTF consists of three cryogenically cooled science instruments and an 0.85-meter telescope, and is one of NASA's largest infrared telescopes to be launched. SIRTF will obtain images and spectra by detecting the infrared energy, or heat, radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground.
Neighborhood greenspace and health in a large urban center
NASA Astrophysics Data System (ADS)
Kardan, Omid; Gozdyra, Peter; Misic, Bratislav; Moola, Faisal; Palmer, Lyle J.; Paus, Tomáš; Berman, Marc G.
2015-07-01
Studies have shown that natural environments can enhance health and here we build upon that work by examining the associations between comprehensive greenspace metrics and health. We focused on a large urban population center (Toronto, Canada) and related the two domains by combining high-resolution satellite imagery and individual tree data from Toronto with questionnaire-based self-reports of general health perception, cardio-metabolic conditions and mental illnesses from the Ontario Health Study. Results from multiple regressions and multivariate canonical correlation analyses suggest that people who live in neighborhoods with a higher density of trees on their streets report significantly higher health perception and significantly less cardio-metabolic conditions (controlling for socio-economic and demographic factors). We find that having 10 more trees in a city block, on average, improves health perception in ways comparable to an increase in annual personal income of $10,000 and moving to a neighborhood with $10,000 higher median income or being 7 years younger. We also find that having 11 more trees in a city block, on average, decreases cardio-metabolic conditions in ways comparable to an increase in annual personal income of $20,000 and moving to a neighborhood with $20,000 higher median income or being 1.4 years younger.
A condition for small bootstrap current in three-dimensional toroidal configurations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru; Nührenberg, J.; Zille, R.
2016-11-15
It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller compared to that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna−Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.
NASA Astrophysics Data System (ADS)
Hasegawa, Chika; Nakayama, Yu
2018-03-01
In this paper, we solve for the two-point function of the lowest-dimensional scalar operator in the critical ϕ⁴ theory on 4 − ε dimensional real projective space using three different methods. The first uses conventional perturbation theory, the second imposes the cross-cap bootstrap equation, and the third solves the Schwinger-Dyson equation under the assumption of conformal invariance. We find that the three methods lead to mutually consistent results, but each has its own advantage.
On critical exponents without Feynman diagrams
NASA Astrophysics Data System (ADS)
Sen, Kallol; Sinha, Aninda
2016-11-01
In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend Polyakov's pioneering 1974 work, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4 − ε dimensions up to O(ε²). AS dedicates this work to the loving memory of his mother.
New Methods for Estimating Seasonal Potential Climate Predictability
NASA Astrophysics Data System (ADS)
Feng, Xia
This study develops two new statistical approaches to assess the seasonal potential predictability of observed climate variables. One is the univariate analysis of covariance (ANOCOVA) model, a combination of an autoregressive (AR) model and analysis of variance (ANOVA). It has the advantage of taking into account the uncertainty of the estimated parameters due to sampling errors in the statistical test, which is often neglected in AR-based methods, and of accounting for daily autocorrelation, which is not considered in traditional ANOVA. In the ANOCOVA model, the seasonal signals arising from external forcing are tested for being identical across years, in order to assess whether any interannual variability that may exist is potentially predictable. The bootstrap is an attractive alternative method that requires no hypothesized model and is available no matter how mathematically complicated the parameter estimator is. This method builds up the empirical distribution of the interannual variance from resamplings drawn with replacement from the given sample, in which the only predictability in seasonal means arises from weather noise. These two methods are applied to temperature and to water-cycle components, including precipitation and evaporation, to measure the extent to which the interannual variance of seasonal means exceeds the unpredictable weather noise, and the results are compared with previous methods, including Leith-Shukla-Gutzler (LSG), Madden, and Katz. The potential predictability of temperature from the ANOCOVA model, the bootstrap, LSG, and Madden exhibits a pronounced tropical-extratropical contrast, with much larger predictability in the tropics, dominated by El Nino/Southern Oscillation (ENSO), than in higher latitudes, where strong internal variability lowers predictability. The bootstrap tends to display the highest predictability of the four methods, ANOCOVA lies in the middle, while LSG and Madden generate lower predictability.
Seasonal precipitation predictability from ANOCOVA, the bootstrap, and Katz, resembling that for temperature, is larger over the tropical regions and smaller in the extratropics. The bootstrap and ANOCOVA are in good agreement with each other, both generating larger predictability than Katz. The seasonal predictability of evaporation over land bears considerable similarity to that of temperature using ANOCOVA, the bootstrap, LSG, and Madden. The remote SST forcing and soil moisture reveal substantial seasonality in their relations with the potentially predictable seasonal signals. For selected regions, SST, soil moisture, or both show significant relationships with the predictable signals, hence providing indirect insight into the slowly varying boundary processes involved and enabling useful seasonal climate prediction. A multivariate analysis of covariance (MANOCOVA) model is established to identify distinctive predictable patterns that are uncorrelated with each other. Generally speaking, the seasonal predictability from the multivariate model is consistent with that from ANOCOVA. Besides unveiling the spatial variability of predictability, the MANOCOVA model also reveals the temporal variability of each predictable pattern, which could be linked to periodic oscillations.
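The bootstrap approach described above, which builds the empirical distribution of the interannual variance of seasonal means from resamples in which only weather noise remains, can be roughly sketched as follows. The synthetic data, season length, and replicate count are illustrative, and this simplified sketch ignores the daily autocorrelation that the full method must handle:

```python
import numpy as np

rng = np.random.default_rng(1)

def potential_predictability_pvalue(daily, n_boot=1000):
    """daily: (n_years, n_days) array of a seasonal daily variable.
    Returns the p-value of the observed interannual variance of seasonal
    means against a resampling null in which only weather noise acts."""
    n_years, n_days = daily.shape
    observed = daily.mean(axis=1).var(ddof=1)
    pooled = daily.ravel()
    null = np.empty(n_boot)
    for b in range(n_boot):
        # Resample days with replacement: destroys any slow boundary-forced
        # signal, leaving pure weather noise in the synthetic seasonal means.
        fake = rng.choice(pooled, size=(n_years, n_days), replace=True)
        null[b] = fake.mean(axis=1).var(ddof=1)
    return (null >= observed).mean()

# Synthetic example: 20 years of a 90-day season, with and without a
# genuine year-to-year (potentially predictable) signal.
signal = rng.normal(0.0, 1.0, size=(20, 1))   # slow, boundary-forced part
noise = rng.normal(0.0, 1.0, size=(20, 90))   # daily weather noise
p_signal = potential_predictability_pvalue(signal + noise)
p_noise = potential_predictability_pvalue(noise)
```

A small p-value indicates that the interannual variance of seasonal means exceeds what unpredictable weather noise alone could produce.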
Self-cleaning threaded rod spinneret for high-efficiency needleless electrospinning
NASA Astrophysics Data System (ADS)
Zheng, Gaofeng; Jiang, Jiaxin; Wang, Xiang; Li, Wenwang; Zhong, Weizheng; Guo, Shumin
2018-07-01
High-efficiency production of nanofibers is the key to the application of electrospinning technology. This work focuses on multi-jet electrospinning, in which a threaded rod electrode is utilized as a needleless spinneret to achieve high-efficiency production of nanofibers. A slipper block, which fits onto and moves along the threaded rod, is designed to transfer polymer solution evenly to the surface of the rod spinneret. The relative motion between the slipper block and the threaded rod electrode promotes unstable fluctuation of the solution surface; thus the rotation of the threaded rod electrode decreases the critical voltage for initial multi-jet ejection and the diameter of the nanofibers. The residual solution on the surface of the threaded rod is cleaned up by the moving slipper block, showing a great self-cleaning ability, which ensures stable multi-jet ejection and increases the productivity of nanofibers. Each thread of the threaded rod electrode serves as an independent spinneret, which enhances the electric field strength and constrains the position of the Taylor cone, resulting in high productivity of uniform nanofibers. The diameter of the nanofibers decreases with increasing rotation speed of the threaded rod, and productivity increases with the solution flow rate. The rotation of the electrode provides an extra force for the ejection of charged jets, which also contributes to the high-efficiency production of nanofibers. The maximum productivity of nanofibers from the threaded rod spinneret is 5-6 g/h, about 250-300 times that of the single-needle spinneret. The self-cleaning threaded rod spinneret is an effective way to realize continuous multi-jet electrospinning, promoting industrial applications of uniform nanofibrous membranes.
Evaluation of laser ablation microtomy for correlative microscopy of hard tissues.
Boyde, A
2018-02-27
Laser ablation machining or microtomy (LAM) is a relatively new approach to producing slide-mounted sections of translucent materials. We evaluated the method with a variety of problems from the bone, joint, and dental tissue fields, where we require thin, undecalcified, and undistorted sections for correlative light microscopy (LM) and backscattered electron scanning electron microscopy (BSE SEM). All samples were embedded in poly(methyl methacrylate) (PMMA), and flat block surfaces had been previously studied by BSE SEM and confocal scanning light microscopy (CSLM). Most were also studied by X-ray microtomography (XMT). The block surface is stuck to a glass slide with cyanoacrylate adhesive. Setting the section thickness and levelling uses inbuilt optical coherence tomographic imaging. Tight focusing of near-infrared laser radiation in the sectioning plane gives extreme intensities, causing photodisruption of material at the focal point. The laser beam is moved by a fast scanner to write a cutting line, which is simultaneously moved by an XY positioning unit to create a sectioning plane. The block is thereby released from the slide, leaving the section stuck to the slide. Light, wet polishing on the finest grade (4000 grit) silicon carbide polishing paper is used to remove a 1-2 μm thick damaged layer at the surface of the section. Sections produced by laser cutting are fine in quality, superior to those produced by mechanical cutting, and can be thinner than the 'voxel' in most laboratory X-ray microtomography systems. The present extensive pilot studies have shown that the method produces samples that we can study by both light and electron microscopy. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.
Postoperative dysesthesia in lumbar three-column resection osteotomies.
Zhang, Zhengfeng; Wang, Honggang; Zheng, Wenjie
2016-08-01
Three-column lumbar spinal resection osteotomies, including pedicle subtraction osteotomy (PSO), vertebral column resection (VCR), and total en bloc spondylectomy (TES), can potentially lead to dorsal root ganglion (DRG) injury, which may cause postoperative dysesthesia (POD). The purpose of this retrospective study was to describe the uncommon complication of POD in lumbar spinal resection osteotomies. Between January 2009 and December 2013, 64 patients were treated with lumbar three-column spinal resection osteotomies (PSO, n = 31; VCR, n = 29; TES, n = 4) by the investigators' group. POD was defined as dysesthetic pain or burning dysesthesia in the region innervated by the affected DRG, whether spontaneous or evoked. Non-steroidal anti-inflammatory drugs, central non-opioid analgesic agents, neuropathic pain drugs, and/or intervertebral foramen blocks were selectively used to treat POD. There were 5 cases of POD (5/64, 7.8%), consisting of 1 patient after PSO (1/31, 3.2%), 3 patients after VCR (3/29, 10.3%), and 1 patient after TES (1/4, 25%). After treatment by drug administration plus DRG block, all patients experienced pain relief, with durations from 8 to 38 days. A gradual movement of the pain toward the distal end of the affected DRG-innervated region was found to mark the beginning of resolution. Although POD is a unique and rare complication and may be misdiagnosed as nerve root injury in lumbar spinal resection osteotomies, combination drug therapy and DRG block effectively relieve pain. The appearance of a gradual movement of the pain toward the distal end of the affected DRG-innervated region during recovery may be used as a sign of good prognosis.
Kayen, R.E.; Schwab, W.C.; Lee, H.J.; Torresan, M.E.; Hein, J.R.; Quinterno, P.J.; Levin, L.A.
1989-01-01
Mass movement and erosion have been identified on the pelagic sediment cap of Horizon Guyot, a seamount in the Mid-Pacific Mountains. Trends in the size, shape, and preservation of bedforms, together with sediment textural trends on the pelagic cap, indicate that the bottom-current-generated sediment transport direction, and the net bedload transport direction, is upslope. Slumping of the sediment cap occurred on the northwest side of the guyot on a 1.6° to 2.0° slope in the zone of enhanced bottom-current activity. Submersible investigations of these slump blocks show them to be discrete and to have a relief of 6-15 m, with nodular chert beds cropping out along the headwall of individual rotated blocks. An evaluation of the stability of the sediment cap suggests that the combination of current-induced beveling of the sea floor and infrequent earthquake loading accompanied by cyclic strength reduction is responsible for the initiation of slumps. The sediment in the area of slumping moved short distances in relatively coherent masses, whereas sediment that moved beyond the summit cap perimeter fully mobilized into sediment gravity flows and traveled large distances. A steady-state geotechnical analysis of Horizon Guyot sediment indicates the predisposition of deeply buried sediment toward disintegrative flow failure on appropriately steep slopes. Thus, slope failure in this deeper zone would include large amounts of internal deformation. However, gravitational stress in the near-surface sediment of the summit cap (sub-bottom depth < 14 m) is insufficient to maintain downslope movement after initial failure occurs. The predicted morphology of coherent slump blocks displaced and rafted upon a weakened zone at depth corresponds well with seismic-reflection data and submersible observations. © 1990.
Motor training reduces surround inhibition in the motor cortex.
Akkad, Haya; Di Stasio, Flavio; Tibold, Robert; Kassavetis, Panagiotis; Rothwell, John C; Edwards, Mark J
2016-06-01
Surround inhibition (SI) is thought to facilitate focal contraction of a hand muscle by keeping nearby muscles silent. Unexpectedly, SI is reduced in skilled pianists. We tested whether repeated practice of focal contraction in non-pianists could reduce SI. Motor-evoked potentials were elicited by transcranial magnetic stimulation in the relaxed abductor digiti minimi randomly at the onset and 5s after offset of a 2s focal contraction (10% maximum) of the first dorsal interosseous (FDI). Over 5 blocks of 40 trials participants obtained points for increasing contraction speed and stability in FDI. In a final block, the interval between contractions was varied randomly to increase attention to the task. Over the first 5 blocks, SI declined as performance (points scored) improved. In the final "attention" block SI increased towards baseline without affecting performance. Although SI may be useful during the early stages of learning, skilled focal finger movement does not require SI to prevent activity in non-involved muscles. This could be due to better targeting of the excitatory command to move. Results from the final block suggest that increased attention can re-engage SI when task parameters change. SI is not necessary for successful focal contraction, but may contribute during learning and during attention to task. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Patient satisfaction after pulmonary resection for lung cancer: a multicenter comparative analysis.
Pompili, Cecilia; Brunelli, Alessandro; Rocco, Gaetano; Salvi, Rosario; Xiumé, Francesco; La Rocca, Antonello; Sabbatini, Armando; Martucci, Nicola
2013-01-01
Patient satisfaction reflects the perception of the customer about the level of quality of care received during the episode of hospitalization. To compare the levels of satisfaction of patients submitted to lung resection in two different thoracic surgical units. Prospective analysis of 280 consecutive patients submitted to pulmonary resection for neoplastic disease in two centers (center A: 139 patients; center B: 141 patients; 2009-2010). Patients' satisfaction was assessed at discharge through the EORTC-InPatSat32 module, a 32-item, multi-scale, self-administered anonymous questionnaire. Each scale (ranging from 0 to 100 in score) was compared between the two units. Multivariable regression and bootstrap were used to verify factors associated with the patients' general satisfaction (dependent variable). Patients from unit B reported a higher general satisfaction (91.5 vs. 88.3, p = 0.04), mainly due to a significantly higher satisfaction in the doctor-related scales (doctors' technical skill: p = 0.001; doctors' interpersonal skill: p = 0.008; doctors' availability: p = 0.005; and doctors' information provision: p = 0.0006). Multivariable regression analysis and bootstrap confirmed that level of care in unit B (p = 0.006, bootstrap frequency 60%) along with a lower level of education of the patient population (p = 0.02, bootstrap frequency 62%) were independent factors associated with a higher general patient satisfaction. We were able to show a different level of patient satisfaction in patients operated on in two different thoracic surgery units. A reduced level of patient satisfaction may trigger changes in the management policy of individual units in order to meet patients' expectations and improve organizational efficiency. Copyright © 2012 S. Karger AG, Basel.
Wilcox, Thomas P; Zwickl, Derrick J; Heath, Tracy A; Hillis, David M
2002-11-01
Four New World genera of dwarf boas (Exiliboa, Trachyboa, Tropidophis, and Ungaliophis) have been placed by many systematists in a single group (traditionally called Tropidophiidae). However, the monophyly of this group has been questioned in several studies. Moreover, the overall relationships among basal snake lineages, including the placement of the dwarf boas, are poorly understood. We obtained mtDNA sequence data for 12S, 16S, and intervening tRNA-val genes from 23 species of snakes representing most major snake lineages, including all four genera of New World dwarf boas. We then examined the phylogenetic position of these species by estimating the phylogeny of the basal snakes. Our phylogenetic analysis suggests that New World dwarf boas are not monophyletic. Instead, we find Exiliboa and Ungaliophis to be most closely related to sand boas (Erycinae), boas (Boinae), and advanced snakes (Caenophidea), whereas Tropidophis and Trachyboa form an independent clade that separated relatively early in snake radiation. Our estimate of snake phylogeny differs significantly in other ways from some previous estimates of snake phylogeny. For instance, pythons do not cluster with boas and sand boas, but instead show a strong relationship with Loxocemus and Xenopeltis. Additionally, uropeltids cluster strongly with Cylindrophis, and together are embedded in what has previously been considered the macrostomatan radiation. These relationships are supported by both bootstrapping (parametric and nonparametric approaches) and Bayesian analysis, although Bayesian support values are consistently higher than those obtained from nonparametric bootstrapping. Simulations show that Bayesian support values represent much better estimates of phylogenetic accuracy than do nonparametric bootstrap support values, at least under the conditions of our study. Copyright 2002 Elsevier Science (USA)
Impact of Sampling Density on the Extent of HIV Clustering
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor
2014-01-01
Identifying and monitoring HIV clusters could be useful in tracking the leading edge of HIV transmission in epidemics. Currently, greater specificity in the definition of HIV clusters is needed to reduce confusion in the interpretation of HIV clustering results. We address sampling density as one of the key aspects of HIV cluster analysis. The proportion of viral sequences in clusters was estimated at sampling densities from 1.0% to 70%. A set of 1,248 HIV-1C env gp120 V1C5 sequences from a single community in Botswana was utilized in simulation studies. Matching numbers of HIV-1C V1C5 sequences from the LANL HIV Database were used as comparators. HIV clusters were identified by phylogenetic inference under bootstrapped maximum likelihood and pairwise distance cut-offs. Sampling density below 10% was associated with stochastic HIV clustering with broad confidence intervals. HIV clustering increased linearly at sampling density >10%, and was accompanied by narrowing confidence intervals. Patterns of HIV clustering were similar at bootstrap thresholds 0.7 to 1.0, but the extent of HIV clustering decreased with higher bootstrap thresholds. The origin of sampling (local concentrated vs. scattered global) had a substantial impact on HIV clustering at sampling densities ≥10%. Pairwise distances at 10% were estimated as a threshold for cluster analysis of HIV-1 V1C5 sequences. The node bootstrap support distribution provided additional evidence for 10% sampling density as the threshold for HIV cluster analysis. The detectability of HIV clusters is substantially affected by sampling density. A minimal genotyping density of 10% and sampling density of 50–70% are suggested for HIV-1 V1C5 cluster analysis. PMID:25275430
Visceral sensitivity, anxiety, and smoking among treatment-seeking smokers.
Zvolensky, Michael J; Bakhshaie, Jafar; Norton, Peter J; Smits, Jasper A J; Buckner, Julia D; Garey, Lorra; Manning, Kara
2017-12-01
It is widely recognized that smoking is related to abdominal pain and discomfort, as well as gastrointestinal disorders. Research has shown that visceral sensitivity, experiencing anxiety around gastrointestinal sensations, is associated with poorer gastrointestinal health and related health outcomes. Visceral sensitivity also increases anxiety symptoms and mediates the relation with other risk factors, including gastrointestinal distress. No work to date, however, has evaluated visceral sensitivity in the context of smoking despite the strong association between smoking and poor physical and mental health. The current study sought to examine visceral sensitivity as a unique predictor of cigarette dependence, threat-related smoking abstinence expectancies (somatic symptoms and harmful consequences), and perceived barriers for cessation via anxiety symptoms. Eighty-four treatment seeking adult daily smokers (M age =45.1years [SD=10.4]; 71.6% male) participated in this study. There was a statistically significant indirect effect of visceral sensitivity via general anxiety symptoms on cigarette dependence (b=0.02, SE=0.01, Bootstrapped 95% CI [0.006, 0.05]), smoking abstinence somatic expectancies (b=0.10, SE=0.03, Bootstrapped 95% CI [0.03, 0.19]), smoking abstinence harmful experiences (b=0.13, SE=0.05, Bootstrapped 95% CI [0.03, 0.25]), and barriers to cessation (b=0.05, SE=0.06, Bootstrapped 95% CI [0.01, 0.13]). Overall, the present study serves as an initial investigation into the nature of the associations between visceral sensitivity, anxiety symptoms, and clinically significant smoking processes among treatment-seeking smokers. Future work is needed to explore the extent to which anxiety accounts for relations between visceral sensitivity and other smoking processes (e.g., withdrawal, cessation outcome). Copyright © 2017 Elsevier Ltd. All rights reserved.
Explanation of Two Anomalous Results in Statistical Mediation Analysis.
Fritz, Matthew S; Taylor, Aaron B; Mackinnon, David P
2012-01-01
Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special concern as the bias-corrected bootstrap is often recommended and used due to its higher statistical power compared with other tests. The second result is statistical power reaching an asymptote far below 1.0 and in some conditions even declining slightly as the size of the relationship between X and M, a, increased. Two computer simulations were conducted to examine these findings in greater detail. Results from the first simulation found that the increased Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap are a function of an interaction between the size of the individual paths making up the mediated effect and the sample size, such that elevated Type I error rates occur when the sample size is small and the effect size of the nonzero path is medium or larger. Results from the second simulation found that stagnation and decreases in statistical power as a function of the effect size of the a path occurred primarily when the path between M and Y, b, was small. Two empirical mediation examples are provided using data from a steroid prevention and health promotion program aimed at high school football players (Athletes Training and Learning to Avoid Steroids; Goldberg et al., 1996), one to illustrate a possible Type I error for the bias-corrected bootstrap test and a second to illustrate a loss in power related to the size of a. Implications of these findings are discussed.
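The bias-corrected tests discussed above are refinements of the basic percentile bootstrap for the indirect effect a*b. As a minimal sketch of that underlying procedure (not the authors' simulation design), the following example fits the two mediation paths by ordinary least squares on synthetic data and forms a percentile bootstrap confidence interval for a*b; the variable names and data-generating values are assumptions for illustration.

```python
import random

def slope(y, x):
    """OLS slope of y on a single predictor x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def ols2(y, x1, x2):
    """Coefficient on x1 in the regression y ~ x1 + x2 (normal equations)."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [v - m1 for v in x1]
    c2 = [v - m2 for v in x2]
    cy = [v - my for v in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return (s1y * s22 - s2y * s12) / det

def indirect_effect(rows):
    x, m, y = (list(t) for t in zip(*rows))
    a = slope(m, x)       # path a: X -> M
    b = ols2(y, m, x)     # path b: M -> Y, controlling for X
    return a * b

# synthetic mediation data with true a = b = 0.5 (illustrative values)
rng = random.Random(1)
n = 200
x = [rng.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + rng.gauss(0, 1) for xi in x]
y = [0.5 * mi + 0.2 * xi + rng.gauss(0, 1) for mi, xi in zip(m, x)]
rows = list(zip(x, m, y))

est = indirect_effect(rows)
# percentile bootstrap: resample cases with replacement and re-estimate a*b
boots = sorted(
    indirect_effect([rng.choice(rows) for _ in range(n)]) for _ in range(999)
)
lo, hi = boots[24], boots[974]  # ~2.5% and ~97.5% percentiles
print(f"a*b = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

A bias-corrected interval would additionally shift the percentile cut-offs according to the fraction of bootstrap estimates falling below the point estimate, which is where the elevated Type I error rates described above arise.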
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample sizes required to achieve 80% power, obtained by simulation under various sizes of the mediation effect, within-subject correlations, and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method, and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation). A larger value of the ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been published for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrapping method are superior to Sobel's method; the distribution of the product method is recommended in practice because of its lower computational load compared with bootstrapping. An R package has been developed for sample size determination by the product method in longitudinal mediation study design.
Testing the weak-form efficiency of the WTI crude oil futures market
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Xie, Wen-Jie; Zhou, Wei-Xing
2014-07-01
The weak-form efficiency of energy futures markets has long been studied and empirical evidence suggests controversial conclusions. In this work, nonparametric methods are adopted to estimate the Hurst indexes of the WTI crude oil futures prices (1983-2012) and a strict statistical test in the spirit of bootstrapping is put forward to verify the weak-form market efficiency hypothesis. The results show that the crude oil futures market is efficient when the whole period is considered. When the whole series is divided into three sub-series separated by the outbreaks of the Gulf War and the Iraq War, it is found that the Gulf War reduced the efficiency of the market. If the sample is split into two sub-series based on the signing date of the North American Free Trade Agreement, the market is found to be inefficient in the sub-periods during which the Gulf War broke out. The same analysis on short-time series in moving windows shows that the market is inefficient only when some turbulent events occur, such as the oil price crash in 1985, the Gulf war, and the oil price crash in 2008.
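The paper's test statistic is the Hurst index; as a much simplified sketch of the same resampling logic (compare an observed dependence statistic with its distribution over series resampled under an i.i.d. null), the toy example below tests the lag-1 autocorrelation of a synthetic persistent "returns" series by shuffling, which preserves the marginal distribution while destroying temporal dependence. The AR(1) data, the statistic, and all names are assumptions for illustration, not the authors' procedure.

```python
import random

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation of a series."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in xs)
    return num / den

rng = random.Random(7)
# synthetic "returns" with mild AR(1) persistence: a weak-form-inefficient toy market
r, phi = [0.0], 0.4
for _ in range(499):
    r.append(phi * r[-1] + rng.gauss(0, 1))

observed = lag1_autocorr(r)
# resample under the null of no temporal dependence: shuffle the series
null = []
for _ in range(500):
    s = r[:]
    rng.shuffle(s)
    null.append(lag1_autocorr(s))
p = sum(1 for v in null if abs(v) >= abs(observed)) / len(null)
print(f"rho1 = {observed:.3f}, p = {p:.3f}")
```

A small p rejects the i.i.d. (weak-form efficient) null for this toy series; applying such a test in moving windows, as the abstract describes, localizes the inefficient sub-periods.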
NASA Astrophysics Data System (ADS)
LeBlanc, B.; Batha, S.; Bell, R.; Bernabei, S.; Blush, L.; de la Luna, E.; Doerner, R.; Dunlap, J.; England, A.; Garcia, I.; Ignat, D.; Isler, R.; Jones, S.; Kaita, R.; Kaye, S.; Kugel, H.; Levinton, F.; Luckhardt, S.; Mutoh, T.; Okabayashi, M.; Ono, M.; Paoletti, F.; Paul, S.; Petravich, G.; Post-Zwicker, A.; Sauthoff, N.; Schmitz, L.; Sesnic, S.; Takahashi, H.; Talvard, M.; Tighe, W.; Tynan, G.; von Goeler, S.; Woskov, P.; Zolfaghari, A.
1995-03-01
Application of Ion Bernstein Wave Heating (IBWH) into the Princeton Beta Experiment-Modification (PBX-M) [Phys. Fluids B 2, 1271 (1990)] tokamak stabilizes sawtooth oscillations and generates peaked density profiles. A transport barrier, spatially correlated with the IBWH power deposition profile, is observed in the core of IBWH-assisted neutral beam injection (NBI) discharges. A precursor to the fully developed barrier is seen in the soft x-ray data during edge localized mode (ELM) activity. Sustained IBWH operation is conducive to a regime where the barrier supports large ∇ne, ∇Te, ∇νφ, and ∇Ti, delimiting the confinement zone. This regime is reminiscent of the H(high) mode, but with a confinement zone moved inward. The core region has better than H-mode confinement while the peripheral region is L(low)-mode-like. The peaked profile enhances NBI core deposition and increases nuclear reactivity. An increase in central Ti results from χi reduction (compared to the H mode) and better beam penetration. Bootstrap current fractions of up to 0.32-0.35 locally and 0.28 overall were obtained when an additional NBI burst is applied to this plasma.
Random matrix theory filters and currency portfolio optimisation
NASA Astrophysics Data System (ADS)
Daly, J.; Crane, M.; Ruskin, H. J.
2010-04-01
Random matrix theory (RMT) filters have recently been shown to improve the optimisation of financial portfolios. This paper studies the effect of three RMT filters on realised portfolio risk, using bootstrap analysis and out-of-sample testing. We considered the case of a foreign exchange and commodity portfolio, weighted towards foreign exchange and consisting of 39 assets. This was intended to test the limits of RMT filtering, which is more obviously applicable to portfolios with larger numbers of assets. We considered both equally and exponentially weighted covariance matrices, and observed that, despite the small number of assets involved, RMT filters reduced risk in a way that was consistent with a much larger S&P 500 portfolio. The indicated exponential weightings showed good consistency with the value suggested by RiskMetrics, in contrast to previous results involving stocks. This decay factor, along with the low number of past moves preferred in the filtered, equally weighted case, displayed a trend towards models that are reactive to recent market changes. On testing portfolios with fewer assets, RMT filtering provided less or no overall risk reduction. In particular, no long-term out-of-sample risk reduction was observed for a portfolio consisting of 15 major currencies and commodities.
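A common form of RMT filtering, shown here as a sketch under the assumption of a Marchenko-Pastur eigenvalue clip (the abstract does not specify which three filters were used, and the function name and random data below are illustrative), replaces the "noise" eigenvalues of the sample correlation matrix before reconstructing it:

```python
import numpy as np

def rmt_filter(returns):
    """Clip correlation eigenvalues below the Marchenko-Pastur edge (a sketch)."""
    T, N = returns.shape
    corr = np.corrcoef(returns, rowvar=False)
    lam_max = (1 + np.sqrt(N / T)) ** 2          # MP upper edge for i.i.d. data
    vals, vecs = np.linalg.eigh(corr)
    noise = vals < lam_max
    # replace "noise" eigenvalues by their mean, preserving the trace
    vals[noise] = vals[noise].mean()
    filtered = vecs @ np.diag(vals) @ vecs.T
    np.fill_diagonal(filtered, 1.0)              # restore unit diagonal
    return corr, filtered

rng = np.random.default_rng(0)
R = rng.normal(size=(400, 39))                   # 39 assets, as in the paper's portfolio
corr, filtered = rmt_filter(R)
```

The filtered matrix keeps only eigenvectors whose eigenvalues exceed the bound expected for pure noise; it is this cleaned matrix that would be fed into the portfolio optimiser.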
Verification of continuum drift kinetic equation solvers in NIMROD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Held, E. D.; Ji, J.-Y.; Kruger, S. E.
Verification of continuum solutions to the electron and ion drift kinetic equations (DKEs) in NIMROD [C. R. Sovinec et al., J. Comp. Phys. 195, 355 (2004)] is demonstrated through comparison with several neoclassical transport codes, most notably NEO [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)]. The DKE solutions use NIMROD's spatial representation, 2D finite elements in the poloidal plane and a 1D Fourier expansion in toroidal angle. For 2D velocity space, a novel 1D expansion in finite elements is applied for the pitch angle dependence and a collocation grid is used for the normalized speed coordinate. The full, linearized Coulomb collision operator is kept and shown to be important for obtaining quantitative results. Bootstrap currents, parallel ion flows, and radial particle and heat fluxes show quantitative agreement between NIMROD and NEO for a variety of tokamak equilibria. In addition, velocity space distribution function contours for ions and electrons show nearly identical detailed structure and agree quantitatively. A Θ-centered, implicit time discretization and a block-preconditioned, iterative linear algebra solver provide efficient electron and ion DKE solutions that ultimately will be used to obtain closures for NIMROD's evolving fluid model.
Movement of moisture in refrigerated cheese samples transferred to room temperature.
Emmons, D B; Bradley, R L; Campbell, C; Sauvé, J P
2001-01-01
When cheese samples refrigerated at 4 degrees C in 120 mL plastic tubs were transferred to room temperature at 23 degrees C, moisture began to move from the warmer surface to the cooler interior; the difference after 1 h was 0.2-0.4%. Others had observed that moisture moved from the interior of warmer blocks of cheese to the cooler surface during cooling at the end of cheese manufacture. In loosely packed cheese prepared for analysis, part of the moisture movement may have been due to evaporation from the warmer surface and condensation on the cooler cheese. It is recommended that cheese be prepared for analysis immediately before weighing. Cheese samples that have been refrigerated, as in interlaboratory trials, should also be remixed or prepared again.
DOE Office of Scientific and Technical Information (OSTI.GOV)
B.C. Lyons, S.C. Jardin, and J.J. Ramos
2012-06-28
A new code, the Neoclassical Ion-Electron Solver (NIES), has been written to solve for stationary, axisymmetric distribution functions (f) in the conventional banana regime for both ions and electrons using a set of drift-kinetic equations (DKEs) with linearized Fokker-Planck-Landau collision operators. Solvability conditions on the DKEs determine the relevant non-adiabatic pieces of f (called h). We work in a 4D phase space in which Ψ defines a flux surface, θ is the poloidal angle, v is the total velocity referenced to the mean flow velocity, and λ is the dimensionless magnetic moment parameter. We expand h in finite elements in both v and λ. The Rosenbluth potentials, φ and ψ, which define the integral part of the collision operator, are expanded in Legendre series in cos χ, where χ is the pitch angle, Fourier series in cos θ, and finite elements in v. At each Ψ, we solve a block tridiagonal system for hi (independent of fe), then solve another block tridiagonal system for he (dependent on fi). We demonstrate that such a formulation can be accurately and efficiently solved. NIES is coupled to the MHD equilibrium code JSOLVER [J. DeLucia, et al., J. Comput. Phys. 37, pp 183-204 (1980)], allowing us to work with realistic magnetic geometries. The bootstrap current is calculated as a simple moment of the distribution function. Results are benchmarked against the Sauter analytic formulas and can be used as a kinetic closure for an MHD code (e.g., M3D-C1 [S.C. Jardin, et al., Computational Science & Discovery, 4 (2012)]).
Kinesthetic information facilitates saccades towards proprioceptive-tactile targets.
Voudouris, Dimitris; Goettker, Alexander; Mueller, Stefanie; Fiehler, Katja
2016-05-01
Saccades to somatosensory targets have longer latencies and are less accurate and precise than saccades to visual targets. Here we examined how different somatosensory information influences the planning and control of saccadic eye movements. Participants fixated a central cross and initiated a saccade as fast as possible in response to a tactile stimulus that was presented to either the index or the middle fingertip of their unseen left hand. In a static condition, the hand remained at a target location for the entire block of trials and the stimulus was presented at a fixed time after an auditory tone. Therefore, the target location was derived only from proprioceptive and tactile information. In a moving condition, the hand was first actively moved to the same target location and the stimulus was then presented immediately. Thus, in the moving condition additional kinesthetic information about the target location was available. We found shorter saccade latencies in the moving compared to the static condition, but no differences in accuracy or precision of saccadic endpoints. In a second experiment, we introduced variable delays after the auditory tone (static condition) or after the end of the hand movement (moving condition) in order to reduce the predictability of the moment of the stimulation and to allow more time to process the kinesthetic information. Again, we found shorter latencies in the moving compared to the static condition but no improvement in saccade accuracy or precision. In a third experiment, we showed that the shorter saccade latencies in the moving condition cannot be explained by the temporal proximity between the relevant event (auditory tone or end of hand movement) and the moment of the stimulation. Our findings suggest that kinesthetic information facilitates planning, but not control, of saccadic eye movements to proprioceptive-tactile targets. Copyright © 2016 Elsevier Ltd. All rights reserved.
Lathe Attachment Finishes Inner Surface of Tubes
NASA Technical Reports Server (NTRS)
Lancki, A. J.
1982-01-01
Extremely smooth finishes are machined on inside surfaces of tubes by new attachment for a lathe. The relatively inexpensive accessory, called a "microhone," holds a honing stone against workpiece by rigid tangs instead of springs as in conventional honing tools. Inner rod permits adjustment of microhoning stone, while outer tube supports assembly. Outer tube is held between split blocks on lathe toolpost. Microhoning can be done with either microhone or workpiece moving and other member stationary.
Why do workaholics experience depression? A study with Chinese University teachers.
Nie, Yingzhi; Sun, Haitao
2016-10-01
This study focuses on the relationships of workaholism to job burnout and depression of university teachers. The direct and indirect (via job burnout) effects of workaholism on depression were investigated in 412 Chinese university teachers. Structural equation modeling and bootstrap method were used. Results revealed that workaholism, job burnout, and depression significantly correlated with each other. Structural equation modeling and bootstrap test indicated the partial mediation role of job burnout on the relationship between workaholism and depression. The findings shed some light on how workaholism influenced depression and provided valuable evidence for prevention of depression in work. © The Author(s) 2015.
Blank, Jos L T; van Hulst, Bart Laurents
2011-10-01
This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on accounting for cost-inefficiency measures on the part of hospital corporate governance. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that a higher remuneration of the board, as well as a higher remuneration of the supervisory board, does not imply better performance.
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
NASA Astrophysics Data System (ADS)
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-months) and mid-term (six-months) drought observations. The effects of North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
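The interval construction above is specific to the authors' PDSI model; as a generic sketch of the residual-based bootstrap idea (an AR(1) stand-in fitted to synthetic data, with all names and parameter values assumed for illustration, not the paper's model), a one-step-ahead prediction interval can be formed by adding resampled residuals to the point forecast:

```python
import random

def fit_ar1(y):
    """OLS fit of y[t] = c + phi * y[t-1] + e[t]; returns (c, phi, residuals)."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    phi = sum((a - mx) * (b - mz) for a, b in zip(x, z)) / \
          sum((a - mx) ** 2 for a in x)
    c = mz - phi * mx
    resid = [b - (c + phi * a) for a, b in zip(x, z)]
    return c, phi, resid

rng = random.Random(42)
# synthetic monthly drought-index-like series with persistence
y = [0.0]
for _ in range(239):
    y.append(0.7 * y[-1] + rng.gauss(0, 1))

c, phi, resid = fit_ar1(y)
point = c + phi * y[-1]                 # one-step-ahead point forecast
# residual-based bootstrap: add a resampled residual to the point forecast
draws = sorted(point + rng.choice(resid) for _ in range(1999))
lo, hi = draws[49], draws[1949]         # ~95% prediction interval
print(f"forecast = {point:.2f}, PI = ({lo:.2f}, {hi:.2f})")
```

Because the interval is built from the empirical residual distribution rather than a normality assumption, it remains valid when forecast errors are skewed, which is the main appeal of the residual-based bootstrap here.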
Spotorno O, Angel E; Córdova, Luis; Solari I, Aldo
2008-12-01
To identify and characterize Chilean samples of Trypanosoma cruzi and their association with hosts, the first 516 bp of the mitochondrial cytochrome b gene were sequenced from eight biological samples and phylogenetically compared with 20 other known American sequences. The molecular characterization of these 28 sequences in a maximum likelihood phylogram (-lnL = 1255.12, tree length = 180, consistency index = 0.79) allowed the robust identification (bootstrap % > 99) of three previously known discrete typing units (DTUs): DTU IIb, IIa, and I. An apparently undescribed new sequence found in four new Chilean samples was detected and designated as DTU Ib; these sequences were separated by 24.7 differences, but robustly related (bootstrap % = 97 in 500 replicates) to those of DTU I by sharing 12 substitutions, among which four were nonsynonymous. The new DTU Ib was also robust (bootstrap % = 100), and characterized by 10 unambiguous substitutions, with a single nonsynonymous G to T change at site 409. The fact that two of these new sequences were found in parasites from a Chilean endemic caviomorph rodent, Octodon degus, and that they were closely related to the ancient DTU I, suggests an old origin and a long association with caviomorph hosts.
Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico
2018-02-01
To show how to generate a consensus sequence from the information of massive parallel sequencing data obtained from routine HIV anti-retroviral resistance studies, and that may be suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GS Junior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, using 10%, 15%, and 20% thresholds. Molecular Evolutionary Genetics Analysis (MEGA) was used for phylogenetic studies. At a 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap value of 88% (IQR 83.5-95.5). Association increased to 36/62 sequences [median bootstrap 94% (IQR 85.5-98)] using a 15% threshold. Maximum association was at the 20% threshold, with 61/62 sequences associated and a median bootstrap value of 99% (IQR 98-100). A safe method is presented to generate consensus sequences from HIV NGS data at a 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Highton, R
1993-12-01
An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets. These are Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. As the number of loci increases, the statistical support for the topology at each node in UPGMA trees was determined by both the bootstrap and jackknife methods. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes that have groupings that reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
Comparison of mode estimation methods and application in molecular clock analysis
NASA Technical Reports Server (NTRS)
Hedges, S. Blair; Shah, Prachi
2003-01-01
BACKGROUND: Distributions of time estimates in molecular clock studies are sometimes skewed or contain outliers. In those cases, the mode is a better estimator of the overall time of divergence than the mean or median. However, different methods are available for estimating the mode. We compared these methods in simulations to determine their strengths and weaknesses and further assessed their performance when applied to real data sets from a molecular clock study. RESULTS: We found that the half-range mode and robust parametric mode methods have a lower bias than other mode methods under a diversity of conditions. However, the half-range mode suffers from a relatively high variance and the robust parametric mode is more susceptible to bias by outliers. We determined that bootstrapping reduces the variance of both mode estimators. Application of the different methods to real data sets yielded results that were concordant with the simulations. CONCLUSION: Because the half-range mode is a simple and fast method, and produced less bias overall in our simulations, we recommend the bootstrapped version of it as a general-purpose mode estimator and suggest a bootstrap method for obtaining the standard error and 95% confidence interval of the mode.
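A minimal sketch of the half-range mode and its bootstrapped standard error, assuming the usual formulation of the estimator (repeatedly restrict attention to the densest window of half the current range); the function name, the skewed synthetic "divergence time" data, and the bootstrap size are all illustrative:

```python
import random
import statistics

def half_range_mode(data):
    """Half-range mode: iteratively keep the densest half-range window."""
    xs = sorted(data)
    while len(xs) > 2 and xs[-1] > xs[0]:
        w = (xs[-1] - xs[0]) / 2
        best_lo, best_hi, j = 0, 1, 0
        # two-pointer scan for the window of width w containing the most points
        for i in range(len(xs)):
            while j < len(xs) and xs[j] - xs[i] <= w:
                j += 1
            if j - i > best_hi - best_lo:
                best_lo, best_hi = i, j
        xs = xs[best_lo:best_hi]
    return sum(xs) / len(xs)

rng = random.Random(3)
# right-skewed sample: a dense cluster near 1 plus a long upper tail
data = [1 + abs(rng.gauss(0, 0.1)) for _ in range(80)] + \
       [rng.uniform(2, 12) for _ in range(20)]

mode = half_range_mode(data)
# bootstrap the mode estimator to obtain its standard error
boot = [half_range_mode([rng.choice(data) for _ in data]) for _ in range(200)]
se = statistics.stdev(boot)
print(f"mode = {mode:.2f} +/- {se:.2f}")
```

On skewed data like this, the mode stays near the dense cluster while the mean is pulled into the tail, which is the motivation the abstract gives for mode-based divergence-time estimates; the bootstrap loop supplies the standard error the authors recommend.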
Bootstrap percolation on spatial networks
NASA Astrophysics Data System (ADS)
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of a networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by recent findings on the spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links' lengths being a power law with a tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, a mixture of a second-order phase transition and a hybrid phase transition with two varying critical points; otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value of -1 is equal to or very close to the values measured in many real online social networks, including LiveJournal, the HP Labs email network, the Belgian mobile phone network, etc. This work helps us better understand the self-organization of the spatial structure of online social networks in terms of its effective function for information spreading.
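The activation rule of bootstrap percolation can be sketched as a generic k-threshold cascade on a toy graph (this is the standard dynamics, not the paper's spatial-network construction):

```python
def bootstrap_percolation(adj, seeds, k):
    """Activate any inactive node with at least k active neighbours;
    repeat until no further activations occur, then return the active set."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbours in adj.items():
            if node not in active and sum(n in active for n in neighbours) >= k:
                active.add(node)
                changed = True
    return active

# Toy graph: a 4-cycle 0-1-2-3 with one chord 0-2.
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
cascade = bootstrap_percolation(adj, seeds={0, 1}, k=2)  # spreads to all
stalled = bootstrap_percolation(adj, seeds={0}, k=2)     # cannot spread
```

The size of the final active set, relative to the network, is the order parameter the abstract refers to; whether a small seed set cascades or stalls is what changes character at the critical exponent.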
NASA Astrophysics Data System (ADS)
Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore
2018-01-01
In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and the accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both good redundancy and low contamination to be effective. In this paper a resampling strategy based on the bootstrap is proposed as an alternative to RAIM, in order to estimate position accurately in cases of low redundancy and multiple blunders: starting with the pseudorange measurement model, at each epoch the available measurements are bootstrapped—that is, randomly sampled with replacement—and the generated a posteriori empirical distribution is exploited to derive the final position. Compared to the standard bootstrap, in this paper the sampling probabilities are not uniform but vary according to an indicator of the measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
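The quality-weighted resampling idea can be sketched on a one-dimensional stand-in for the pseudorange problem (invented measurements and weights; the actual method operates on the full measurement model, not on a scalar):

```python
import random
import statistics

def weighted_bootstrap_estimate(measurements, weights, n_reps=1000, seed=7):
    """Non-uniform bootstrap: higher-quality measurements are resampled
    more often; the median of the replicate means summarises the
    resulting empirical a posteriori distribution."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_reps):
        sample = rng.choices(measurements, weights=weights, k=len(measurements))
        reps.append(statistics.fmean(sample))
    return statistics.median(reps)

# Three consistent measurements and one blunder, down-weighted by quality.
measurements = [10.0, 10.2, 9.9, 25.0]
weights = [1.0, 1.0, 1.0, 0.05]
estimate = weighted_bootstrap_estimate(measurements, weights)
```

The plain mean of these measurements is pulled to about 13.8 by the blunder; because the low-quality measurement is rarely resampled, the weighted-bootstrap estimate stays near 10, which is the robustness the abstract claims over uniform resampling.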
2009-01-01
Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and Bias-corrected accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to Confidence Intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence Intervals for ERR are comparable using Bootstrap and Likelihood Ratio Test methods and BCa 95% Confidence Intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and Likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. 
Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0 - 20 mSv and 5 - 500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
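A plain percentile bootstrap confidence interval, the simpler cousin of the BCa interval used in the study, can be sketched as follows (with made-up ERR replicates, not the Life Span Study data):

```python
import random
import statistics

def percentile_ci(data, stat, alpha=0.05, n_reps=2000, seed=3):
    """Percentile bootstrap CI for an arbitrary statistic. The BCa
    interval used in the study additionally adjusts the percentiles
    for bias and acceleration."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_reps))
    lo = reps[int((alpha / 2) * n_reps)]
    hi = reps[int((1 - alpha / 2) * n_reps) - 1]
    return lo, hi

# Invented excess-relative-risk replicates, for illustration only.
err_samples = [0.4, 0.9, 1.3, 0.7, 1.1, 0.8, 1.5, 0.6, 1.0, 1.2]
lo, hi = percentile_ci(err_samples, statistics.fmean)
```

A "strictly positive" interval in the abstract's sense corresponds to the lower bound `lo` staying above zero across the latency range.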
Flynn-Evans, Erin E.; Lockley, Steven W.
2016-01-01
Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the contribution of 40 variables in their ability to predict N24HSWD among 127 blind women, classified using urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey, but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%, the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% with some degree of light perception, would be referred for further screening for N24HSWD.
Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421
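The selection-frequency idea behind the bootstrapped variable screen can be sketched with a much-simplified stand-in: instead of a logistic forward-selection step, each bootstrap replicate "selects" any predictor whose absolute correlation with the outcome exceeds a threshold (invented data; the study's actual procedure fits logistic regression models with p-value-based inclusion):

```python
import random

def corr(xs, ys):
    """Pearson correlation; returns 0.0 for a degenerate sample."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def selection_frequencies(rows, outcome, threshold=0.3, n_reps=200, seed=5):
    """How often each predictor is 'selected' across bootstrap replicates,
    using a correlation screen as a stand-in for forward selection."""
    rng = random.Random(seed)
    n_pred = len(rows[0])
    counts = [0] * n_pred
    idx = list(range(len(rows)))
    for _ in range(n_reps):
        boot = [rng.choice(idx) for _ in idx]
        ys = [outcome[i] for i in boot]
        for j in range(n_pred):
            xs = [rows[i][j] for i in boot]
            if abs(corr(xs, ys)) > threshold:
                counts[j] += 1
    return [c / n_reps for c in counts]

# Predictor 0 tracks the outcome strongly; predictor 1 is noise.
rows = [(1, 0.2), (2, 0.9), (3, 0.1), (4, 0.8),
        (5, 0.5), (6, 0.3), (7, 0.7), (8, 0.4)]
outcome = [0, 0, 0, 0, 1, 1, 1, 1]
freqs = selection_frequencies(rows, outcome)
```

Retaining predictors whose selection frequency exceeds 10%, as in the second round described above, prunes variables whose apparent association does not survive resampling.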
Thura, David; Cos, Ignasi; Trung, Jessica; Cisek, Paul
2014-12-03
Speed-accuracy tradeoffs (SATs) exist in both decision-making and movement control, and are generally studied separately. However, in natural behavior animals are free to adjust the time invested in deciding and moving so as to maximize their reward rate. Here, we investigate whether shared mechanisms exist for SAT adjustment in both decisions and actions. Two monkeys performed a reach decision task in which they watched 15 tokens jump, one every 200 ms, from a central circle to one of two peripheral targets, and had to guess which target would ultimately receive the majority of tokens. The monkeys could decide at any time, and once a target was reached, the remaining token movements accelerated to either 50 ms ("fast" block) or 150 ms ("slow" block). Decisions were generally earlier and less accurate in fast than slow blocks, and in both blocks, the criterion of accuracy decreased over time within each trial. This could be explained by a simple model in which sensory information is combined with a linearly growing urgency signal. Remarkably, the duration of the reaching movements produced after the decision decreased over time in a similar block-dependent manner as the criterion of accuracy estimated by the model. This suggests that SATs for deciding and acting are influenced by a shared urgency/vigor signal. Consistent with this, we observed that the vigor of saccades performed during the decision process was higher in fast than in slow blocks, suggesting the influence of a context-dependent global arousal. Copyright © 2014 the authors.
NASA Astrophysics Data System (ADS)
Prebble, Warwick M.; Williams, Ann L.
2016-06-01
Block slides have developed on extremely weak, thin clay seams of tectonic origin, parallel to bedding in gently dipping sandstones and mudstones of Tertiary age. Two areas of noted instability are investigated, at Auckland and the Rangitikei valley. Dimensions range from 100 m across × 100 m long for short-displacement block slides up to 4 km across × 3 km long for large landslide complexes in which block slides are a major component. Displacements of blocks range from incipient (cm) through short (30 m) to 2 or 3 km for large slides. Many of the Auckland slides are dormant but likely to move in a 2000-year return-period earthquake or a 100-year high-intensity rain storm. At Rangitikei there are many active, younger slides. Sliding rates for active failures vary from a few cm/year to 50 m in 30 min. Host rocks are weak to very weak clayey sandstones and sandy mudstones. The seams are rich in smectite. They have polished and crushed walls, may have slickensides and some contain rounded rock fragments. Laboratory shear strength of the seams is 13 kPa cohesion and 13° friction, with a lower bound of 8° at zero cohesion. Strength is increased at the field scale by waviness, steps and splays. Continuity can be demonstrated over distances of hundreds of metres. Key investigation methods were mapping, shafts and trenches. Tectonic uplift, folding and faulting of the weak Tertiary strata and river down-cutting are perpetuating block slide development.
Improving Block-level Efficiency with scsi-mq
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caldwell, Blake A
2015-01-01
Current generation solid-state storage devices are exposing new bottlenecks in the SCSI and block layers of the Linux kernel, where IO throughput is limited by lock contention, inefficient interrupt handling, and poor memory locality. To address these limitations, the Linux kernel block layer underwent a major rewrite with the blk-mq project to move from a single request queue to a multi-queue model. The rework of the Linux SCSI subsystem to make use of this new model, known as scsi-mq, has been merged into the Linux kernel, and work is underway for dm-multipath support in the upcoming Linux 4.0 kernel. These pieces were necessary to make use of the multi-queue block layer in a Lustre parallel filesystem with high availability requirements. We undertook adding support of the 3.18 kernel to Lustre with scsi-mq and dm-multipath patches to evaluate the potential of these efficiency improvements. In this paper we evaluate the block-level performance of scsi-mq with backing storage hardware representative of an HPC-targeted Lustre filesystem. Our findings show that SCSI write request latency is reduced by as much as 13.6%. Additionally, when profiling the CPU usage of our prototype Lustre filesystem, we found that CPU idle time increased by a factor of 7 with Linux 3.18 and blk-mq as compared to a standard 2.6.32 Linux kernel. Our findings demonstrate increased efficiency of the multi-queue block layer even with disk-based caching storage arrays used in existing parallel filesystems.
Processive motions of MreB micro-filaments coordinate cell wall growth
NASA Astrophysics Data System (ADS)
Garner, Ethan
2012-02-01
Rod-shaped bacteria elongate by the action of cell-wall synthesis complexes linked to underlying dynamic MreB filaments, but how these proteins function to allow continued elongation as a rod remains unknown. To understand how the movement of these filaments relates to cell wall synthesis, we characterized the dynamics of MreB and the cell wall elongation machinery using high-resolution particle tracking in Bacillus subtilis. We found that both MreB and the elongation machinery move in linear paths across the cell, moving at similar rates (~20 nm/second) and angles to the cell body, suggesting they function as single complexes. These proteins move circumferentially around the cell, principally perpendicular to its length. We find that the motions of these complexes are independent, as they can pause and reverse, and as nearby complexes move independently in both directions across one surface of the cell. Inhibition of cell wall synthesis with antibiotics or depletion of the cell wall synthesis machinery blocked MreB movement, suggesting that the cell wall synthetic machinery is the motor in this system. We propose that bacteria elongate by the uncoordinated, circumferential movements of synthetic complexes that span the plasma membrane and insert radial hoops of new peptidoglycan during their transit.
Plate tectonics and crustal deformation around the Japanese Islands
NASA Technical Reports Server (NTRS)
Hashimoto, Manabu; Jackson, David D.
1993-01-01
We analyze over a century of geodetic data to study crustal deformation and plate motion around the Japanese Islands, using the block-fault model for crustal deformation developed by Matsu'ura et al. (1986). We model the area including the Japanese Islands with 19 crustal blocks and 104 faults based on the distribution of active faults and seismicity. Geodetic data are used to obtain block motions and average slip rates of faults. This geodetic model predicts that the Pacific plate moves N69 deg +/- 2 deg W at about 80 +/- 3 mm/yr relative to the Eurasian plate, which is much lower than that predicted in geologic models. Substantial aseismic slip occurs on the subduction boundaries. The block containing the Izu Peninsula may be separated from the rigid part of the Philippine Sea plate. The faults on the coast of the Japan Sea and the western part of the Median Tectonic Line have slip rates exceeding 4 mm/yr, while the Fossa Magna does not play an important role in the tectonics of central Japan. The geodetic model requires the division of northeastern Japan, contrary to the hypothesis that northeastern Japan is a part of the North American plate. Owing to rapid convergence, the seismic risk in the Nankai trough may be larger than that of the Tokai gap.
A study of the extended-range forecasting problem: blocking
NASA Technical Reports Server (NTRS)
Chen, T. C.; Marshall, H. G.; Shukla, J.
1981-01-01
Wavenumber frequency spectral analysis of a 90 day winter (Jan. 15 - April 14) wind field simulated by a climate experiment of the GLAS atmospheric circulation model is made using the space time Fourier analysis which is modified with Tukey's numerical spectral analysis. Computations are also made to examine how the model wave disturbances in the wavenumber frequency domain are maintained by nonlinear interactions. Results are compared with observation. It is found that equatorial easterlies do not show up in this climate experiment at 200 mb. The zonal kinetic energy and momentum transport of stationary waves are too small in the model's Northern Hemisphere. The wavenumber and frequency spectra of the model are generally in good agreement with observation. However, some distinct features of the model's spectra are revealed. The wavenumber spectra of kinetic energy show that the eastward moving waves of low wavenumbers have stronger zonal motion while the eastward moving waves of intermediate wavenumbers have larger meridional motion compared with observation. Furthermore, the eastward moving waves show a band of large spectral value in the medium frequency regime.
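The core of a wavenumber-frequency (space-time) spectral analysis, separating eastward- from westward-moving waves, can be sketched with a direct two-dimensional Fourier projection (synthetic wave field; the sign convention mapping positive frequency to eastward motion is an assumption of this sketch, and the actual study additionally applies Tukey-style spectral smoothing):

```python
import cmath
import math

def space_time_power(u, m, n):
    """Spectral power of zonal wavenumber m at frequency index n for a
    field u[t][x]. With this sign convention, positive n corresponds to
    eastward-moving waves and negative n to westward-moving waves."""
    nt, nx = len(u), len(u[0])
    c = sum(u[t][x] * cmath.exp(-2j * math.pi * (m * x / nx - n * t / nt))
            for t in range(nt) for x in range(nx)) / (nt * nx)
    return abs(c) ** 2

# Synthetic eastward-propagating wave: zonal wavenumber 3, frequency 5.
NT, NX = 32, 64
u = [[math.cos(2 * math.pi * (3 * x / NX - 5 * t / NT)) for x in range(NX)]
     for t in range(NT)]
east = space_time_power(u, 3, 5)
west = space_time_power(u, 3, -5)
```

For this purely eastward wave all the variance lands in the eastward quadrant of the (wavenumber, frequency) plane and essentially none in the westward one, which is how the analysis above distinguishes eastward-moving waves of low versus intermediate wavenumber.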
Precise positioning of an ion in an integrated Paul trap-cavity system using radiofrequency signals
NASA Astrophysics Data System (ADS)
Kassa, Ezra; Takahashi, Hiroki; Christoforou, Costas; Keller, Matthias
2018-03-01
We report a novel miniature Paul ion trap design with an integrated optical fibre cavity which can serve as a building block for a fibre-linked quantum network. In such cavity quantum electrodynamic set-ups, the optimal coupling of the ions to the cavity mode is of vital importance and this is achieved by moving the ion relative to the cavity mode. The trap presented herein features an endcap-style design complemented with extra electrodes on which additional radiofrequency voltages are applied to fully control the pseudopotential minimum in three dimensions. This method lifts the need to use three-dimensional translation stages for moving the fibre cavity with respect to the ion and achieves high integrability, mechanical rigidity and scalability. Since it is not based on modifying the capacitive load of the trap, this method provides precise control of the pseudopotential minimum, allowing the ion to be moved with a precision limited only by the ion's position spread. We demonstrate this by coupling the ion to the fibre cavity and probing the cavity mode profile.